Fast algorithms for approximately solving large convex problems and applications
In this research we propose to extend and study the theoretical and practical aspects of fast optimization schemes recently proposed for large-scale linear optimization problems. In direct continuation of the work of Dr. Fabian Chudak and Vania Dos Santos Eleuterio, we plan to investigate extensions of these schemes to semidefinite optimization problems and to convex optimization problems with a stochastic component.
In order to apply these optimization algorithms, namely smoothing techniques, to semidefinite and symmetric optimization problems, several questions need to be explored. From a theoretical point of view, different smoothing strategies can be envisaged, for which we want to obtain convergence and complexity guarantees. Of particular interest is the robustness of these schemes with respect to approximate solutions of subproblems and to inexact oracle information. Several numerical questions also need to be investigated: the advantage of dealing with almost low-rank matrices can be exploited, and the exploitation of possible sparsity in the data must be addressed. Our research is essentially application oriented. For instance, we focus on semidefinite relaxations of hard combinatorial problems and on largest-eigenvalue minimization arising in network applications.
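To give a flavor of the smoothing techniques mentioned above: a standard way to smooth the (nonsmooth) largest-eigenvalue function is the log-sum-exp (matrix-exponential) approximation, which replaces the hard maximum over eigenvalues by a soft maximum controlled by a smoothing parameter. The sketch below is purely illustrative, not our proposed method; the function name and the choice of parameter mu are our own.

```python
import numpy as np

def smoothed_max_eigenvalue(A, mu=0.1):
    """Smooth approximation of the largest eigenvalue of a symmetric
    matrix A via log-sum-exp smoothing:
        f_mu(A) = mu * log(trace(exp(A / mu))).
    It satisfies lambda_max(A) <= f_mu(A) <= lambda_max(A) + mu * log(n),
    so smaller mu gives a tighter (but less smooth) approximation."""
    eigs = np.linalg.eigvalsh(A)   # eigenvalues of the symmetric matrix
    m = eigs.max()
    # Shift by the maximum eigenvalue for numerical stability of exp.
    return m + mu * np.log(np.sum(np.exp((eigs - m) / mu)))

# Toy usage on a small symmetric matrix.
A = np.array([[2.0, 1.0],
              [1.0, 0.0]])
lam_max = np.linalg.eigvalsh(A).max()
approx = smoothed_max_eigenvalue(A, mu=0.01)
# approx lies within mu * log(2) of lam_max
```

The smoothed function is differentiable, so fast gradient schemes with complexity O(1/epsilon) iterations (rather than O(1/epsilon^2) for subgradient methods) become applicable.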
We study extensions of our methods to structured stochastic optimization problems in which the convex objective function includes the expected value of a random variable. Besides convergence and complexity aspects, we are highly interested in the probabilistic and statistical properties of the approximate solutions delivered by our algorithms. For practical applications, we will focus on portfolio optimization problems involving the Conditional Value-at-Risk as risk measure.
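For concreteness, the Conditional Value-at-Risk of a loss L at level alpha admits the well-known Rockafellar-Uryasev representation CVaR_alpha(L) = min_t { t + E[(L - t)_+] / (1 - alpha) }, whose minimizer is the alpha-quantile (Value-at-Risk). The sketch below evaluates this representation on an empirical sample of losses; the function name and the toy data are our own, and this is an illustration of the risk measure rather than of our proposed algorithm.

```python
import numpy as np

def empirical_cvar(losses, alpha=0.95):
    """Empirical Conditional Value-at-Risk at level alpha, via the
    Rockafellar-Uryasev formula
        CVaR_alpha(L) = min_t  t + E[(L - t)_+] / (1 - alpha),
    evaluated at t* = alpha-quantile of the losses (the empirical VaR)."""
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)           # Value-at-Risk, t*
    excess = np.maximum(losses - var, 0.0)     # (L - t*)_+
    return var + excess.mean() / (1.0 - alpha)

# Usage on a toy scenario set: losses drawn from a standard normal.
rng = np.random.default_rng(0)
losses = rng.normal(loc=0.0, scale=1.0, size=100_000)
cvar95 = empirical_cvar(losses, alpha=0.95)
# CVaR always dominates VaR; for a standard normal, CVaR_0.95 is about 2.06.
```

Because the (L - t)_+ term is a maximum of two affine functions, CVaR-constrained portfolio problems are convex and fit naturally into the smoothing framework discussed above.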
We also intend to contribute to other research efforts at ETH related to the solution of large-scale structured convex optimization problems.