Farina, F. (2019). Distributed Algorithms for Set Membership Estimation and Constrained Nonconvex Optimization.
Distributed Algorithms for Set Membership Estimation and Constrained Nonconvex Optimization
Francesco Farina
2019-01-01
Abstract
In this thesis, distributed algorithms for estimation and optimization are studied. Centralized optimization algorithms are reviewed, including both iterative descent algorithms and approximation algorithms, ranging from classical gradient descent methods to the alternating direction method of multipliers (ADMM). Then, distributed algorithms that parallel the centralized ones are reviewed. Two specific problems are tackled by means of distributed algorithms: a convex feasibility problem with an infinite number of constraint sets, which typically arises in set membership estimation, and the extension of the Method of Multipliers to a distributed setup, aimed at devising a distributed algorithm for constrained nonconvex optimization problems. Three distributed algorithms are proposed for set membership estimation problems, and convergence results are provided for all of them. The first two algorithms require synchronous communication, while the third one can handle asynchronous communication protocols. Regarding constrained nonconvex optimization problems, a fully distributed algorithm with asynchronous communication (called ASYMM) is presented, and its convergence is shown under suitable assumptions. The proposed algorithms are applied to several problems: linear regression, source localization, nonlinear classification, and digit recognition and document classification in a Learning from Constraints framework.
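For reference, the convex feasibility problem arising in set membership estimation can be written in a standard form (a generic bounded-noise formulation, not necessarily the exact one adopted in the thesis): given measurements $y_t = f(u_t, \theta) + e_t$ with noise bounded as $|e_t| \le \epsilon_t$, the feasible parameter set is the intersection

\[ \Theta = \bigcap_{t \ge 1} \{\theta : |y_t - f(u_t, \theta)| \le \epsilon_t\}, \]

and estimation amounts to finding a point in (or approximating) this possibly infinite intersection of constraint sets. Likewise, the Method of Multipliers referred to above is the classical augmented Lagrangian scheme for a constrained problem $\min_x f(x)$ subject to $g(x) = 0$,

\[ x^{k+1} = \arg\min_x L_\rho(x, \lambda^k), \qquad \lambda^{k+1} = \lambda^k + \rho\, g(x^{k+1}), \qquad L_\rho(x, \lambda) = f(x) + \lambda^\top g(x) + \tfrac{\rho}{2} \|g(x)\|^2, \]

of which ASYMM, as stated in the abstract, provides a distributed, asynchronous counterpart for constrained nonconvex problems.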
https://hdl.handle.net/11365/1071508