
Publication

Distributed Proximal Algorithms for Large-Scale Structured Optimization

Book - Dissertation

Efficient first-order algorithms for large-scale distributed optimization are the main subject of investigation in this thesis. The algorithms considered cover a wide array of applications in machine learning, signal processing and control. In recent years, a large number of algorithms have been introduced that rely on (possibly a reformulation of) one of the classical splitting algorithms, namely the forward-backward, Douglas-Rachford and forward-backward-forward splittings.

In this thesis a new three-term splitting technique is developed that recovers the forward-backward and Douglas-Rachford splittings as special cases. In the context of structured optimization, this splitting is leveraged to develop a framework for a large class of primal-dual algorithms, providing a unified convergence analysis for many seemingly unrelated algorithms. Moreover, linear convergence is established for all such algorithms under mild regularity conditions on the cost functions.

As another notable contribution, we propose a randomized block-coordinate primal-dual algorithm that leads to a fully distributed asynchronous algorithm in a multi-agent model. Moreover, when specializing to multi-agent structured optimization over graphs, novel algorithms are proposed. In addition, it is shown that in a multi-agent model primal-dual algorithms tolerate bounded communication delays, provided that certain strong convexity assumptions hold.

In the final chapter we depart from convex analysis and consider a fully nonconvex block-coordinate proximal gradient algorithm, showing that it leads to nonconvex incremental aggregated algorithms for regularized finite sum and sharing problems with very general sampling strategies.
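For readers unfamiliar with the classical splittings mentioned above, their standard textbook iterations for minimizing $f(x) + g(x)$ are as follows; these are the well-known baseline schemes, not the thesis's new three-term splitting, and $\gamma > 0$ denotes a step size.

Forward-backward (proximal gradient), for smooth $f$:

    $x^{k+1} = \operatorname{prox}_{\gamma g}\bigl(x^k - \gamma \nabla f(x^k)\bigr)$

Douglas-Rachford, for $f$ and $g$ with computable proximal mappings:

    $x^{k} = \operatorname{prox}_{\gamma f}(z^k), \qquad z^{k+1} = z^k + \operatorname{prox}_{\gamma g}\bigl(2x^k - z^k\bigr) - x^k$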
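As an illustration of the block-coordinate proximal gradient idea discussed above, the following is a minimal sketch for a lasso-type problem $\min_x \tfrac{1}{2}\|Ax-b\|^2 + \lambda\|x\|_1$, where at each iteration one randomly sampled block of coordinates takes a forward (gradient) step followed by a backward (prox) step. This is a generic convex example, not the thesis's primal-dual or nonconvex schemes; the data, block partition and step size are illustrative assumptions.

    # Minimal sketch of a randomized block-coordinate proximal gradient
    # iteration, assuming f(x) = 1/2 ||Ax - b||^2 and g(x) = lam * ||x||_1
    # (whose prox is elementwise soft-thresholding). Illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 20))
    b = rng.standard_normal(40)
    lam = 0.1
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2   # step size 1/L with L = ||A||_2^2

    def soft_threshold(v, t):
        """Prox of t * ||.||_1 (elementwise soft-thresholding)."""
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    x = np.zeros(20)
    blocks = np.array_split(np.arange(20), 4)  # partition into 4 coordinate blocks

    for k in range(2000):
        i = rng.integers(len(blocks))          # sample one block uniformly
        idx = blocks[i]
        grad_i = A[:, idx].T @ (A @ x - b)     # partial gradient w.r.t. block i
        # forward (gradient) step, then backward (prox) step, on block i only
        x[idx] = soft_threshold(x[idx] - gamma * grad_i, gamma * lam)

    print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())

In a multi-agent setting one can imagine each agent owning one block and exchanging only the information needed for its partial gradient; the thesis develops asynchronous primal-dual algorithms in that spirit.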
Publication year: 2020