
Project

Taming Nonconvexity in Structured Low-Rank Optimization

Recent advances have made possible the acquisition, storage, and processing of very large amounts of data, strongly impacting many branches of science and engineering. One way to interpret these data and explore their features is to represent them as low-dimensional objects that admit an intuitive interpretation by domain-specific experts. Such techniques typically factorize the data into two or more structured objects (e.g., orthogonal matrices, sparse tensors) of lower rank than the original data. These factorizations can usually be formulated as solutions of large-scale nonconvex optimization problems, so it is of interest to develop fast algorithms to solve them and, in particular, algorithms that provably converge to useful solutions. Many existing methods cannot enforce this guarantee; however, we argue that recent developments by the applicants on Bregman proximal envelopes and on block and multi-block relative smooth functions are excellent tools for designing such algorithms. In short, this project aims (i) to introduce and study a very general formulation for nonsmooth, structured low-rank optimization, (ii) to establish conditions under which this formulation is tractable (even though nonconvex), (iii) to design provably convergent algorithms for it, and (iv) to apply and test the new model and algorithms on problems from two domains: image processing and genomic analysis.
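As a minimal illustration of the kind of problem described above, the sketch below factorizes a data matrix X as a product of two low-rank factors U and V by alternating least squares, one of the simplest instances of a nonconvex low-rank formulation. This is illustrative only and not the project's proposed method; all names (X, U, V, rank, the iteration count) are hypothetical, and the project targets far more general nonsmooth, structured variants.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, rank = 60, 40, 3

# Synthetic low-rank data plus small noise (hypothetical example data).
X = rng.standard_normal((m, rank)) @ rng.standard_normal((rank, n))
X += 0.01 * rng.standard_normal((m, n))

# Random initialization of the factors; the joint problem in (U, V)
# is nonconvex even though each subproblem below is convex.
U = rng.standard_normal((m, rank))
V = rng.standard_normal((n, rank))

for _ in range(50):
    # Update U with V fixed: solve V @ U.T = X.T in the least-squares sense.
    U = np.linalg.lstsq(V, X.T, rcond=None)[0].T
    # Update V with U fixed: solve U @ V.T = X in the least-squares sense.
    V = np.linalg.lstsq(U, X, rcond=None)[0].T

rel_err = np.linalg.norm(X - U @ V.T) / np.linalg.norm(X)
print(f"relative reconstruction error: {rel_err:.3e}")
```

Plain alternating least squares carries no general convergence guarantee toward a useful solution; obtaining such guarantees for structured, nonsmooth variants of this problem is precisely the gap the project addresses.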
Date: 1 Jan 2022 → Today
Keywords: NONCONVEX OPTIMIZATION, NONSMOOTH OPTIMIZATION
Disciplines: Operations research and mathematical programming, Control systems, robotics and automation not elsewhere classified, Signal processing, Numerical computation, Data visualisation and imaging
Project type: Collaboration project