Abstract
Stochastic optimization algorithms have become indispensable in modern machine learning. The development of modern optimization theory and algorithms also requires tools from different mathematical branches, such as algebraic and differential geometry. In this dissertation, we answer several problems in stochastic optimization using a wide range of such tools. We disprove the noncommutative arithmetic-geometric mean inequality using results from noncommutative polynomial optimization. We propose new, simpler, and more efficient models and algorithms for optimization over Grassmannian and flag manifolds. We study statistical inference in gradient-free optimization and contextual bandit optimization, and prove central limit theorems that yield confidence intervals. We present several versions of the Grothendieck inequality over the skew field of quaternions.
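For context, the disproved inequality is commonly formulated as follows; this is a sketch of one standard formulation (due to Recht and Ré), and the precise normalization treated in the dissertation may differ. For positive semidefinite matrices A_1, ..., A_n and 1 <= m <= n:

  % One standard formulation of the noncommutative AM-GM conjecture
  % (Recht--Re); a sketch, not necessarily the exact version in the thesis.
  \[
    \Biggl\| \frac{(n-m)!}{n!}
      \sum_{\substack{j_1,\dots,j_m = 1 \\ \text{all distinct}}}^{n}
      A_{j_1} A_{j_2} \cdots A_{j_m} \Biggr\|
    \;\le\;
    \Biggl\| \frac{1}{n^m}
      \sum_{j_1,\dots,j_m = 1}^{n}
      A_{j_1} A_{j_2} \cdots A_{j_m} \Biggr\|
  \]
  % i.e., averaging products of matrices sampled without replacement should
  % be no larger in norm than averaging with replacement -- the statement
  % whose refutation is announced above.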