Title: Communication-Efficient Distributed and Parallel Optimization: Convergence, Trade-offs, and Fundamental Limits
Abstract: Large-scale distributed and parallel optimization algorithms have been instrumental in recent advancements in many areas, such as machine learning and the operation of large-scale cyber-physical systems. In these algorithms, multiple nodes or processors coordinate local information to solve large optimization problems. To do this, important algorithmic information must be quantized to bits so that it can be communicated.
This talk introduces some recent work on the convergence and convergence rates of limited-communication distributed optimization algorithms. We explore the minimal compression and the minimal number of communicated bits needed to solve distributed optimization problems. We also illustrate how the convergence rates of general optimization algorithms can be maintained under finite-bit compression. All of our convergence rate results establish direct trade-offs between the convergence rate and the number of bits communicated per iteration.
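To make the quantize-then-communicate pattern concrete, the following is a minimal sketch (not from the talk) of distributed gradient descent on a least-squares problem, where each node quantizes its local gradient to b bits per coordinate before sending it. All names (quantize, local_grad, the scale parameter, and the problem data) are illustrative assumptions; the sketch only demonstrates how fewer bits per iteration leave a larger quantization-error floor.

```python
import numpy as np

rng = np.random.default_rng(0)
NODES, DIM, STEPS = 4, 10, 200                               # hypothetical sizes
A = [rng.standard_normal((20, DIM)) for _ in range(NODES)]   # local data matrices
y = [a @ rng.standard_normal(DIM) for a in A]                # local targets

def local_grad(i, x):
    """Gradient of node i's local least-squares objective (1/2m)||A_i x - y_i||^2."""
    return A[i].T @ (A[i] @ x - y[i]) / len(y[i])

def quantize(v, bits, scale):
    """Uniform b-bit quantizer on [-scale, scale]: 2**bits levels per coordinate."""
    levels = 2 ** bits - 1
    clipped = np.clip(v, -scale, scale)
    codes = np.round((clipped + scale) / (2 * scale) * levels)  # integers in [0, levels]
    return codes / levels * (2 * scale) - scale                 # dequantized value

for bits in (2, 4, 8):
    x = np.zeros(DIM)
    for _ in range(STEPS):
        # Each node communicates only a quantized gradient; the server averages them.
        g = np.mean([quantize(local_grad(i, x), bits, scale=5.0)
                     for i in range(NODES)], axis=0)
        x -= 0.1 * g
    # Residual gradient norm reflects the error floor induced by coarse quantization.
    err = np.linalg.norm(np.mean([local_grad(i, x) for i in range(NODES)], axis=0))
    print(f"{bits} bits/coordinate -> final gradient norm {err:.2e}")
```

Running this prints a gradient norm that shrinks as the bit budget grows, a toy version of the rate-versus-bits trade-off the talk discusses; the actual algorithms and compression schemes from the talk are not reproduced here.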