
Upcoming Seminars

Thursday, Jul 07, 2022, 13:30
Geometric Functional Analysis and Probability Seminar, Room 155
Speaker: Oren Louidor
Title: A limit law for the most favorite point of a simple random walk on a regular tree
Abstract:
We consider a continuous-time random walk on a regular tree of finite depth and study its favorite points among the leaf vertices. We prove that, for the walk started from a leaf vertex and stopped upon hitting the root, as the depth of the tree tends to infinity the maximal time spent at any leaf converges, under suitable scaling and centering, to a randomly shifted Gumbel law. The random shift is characterized using a derivative-martingale-like object associated with the square-root local-time process on the tree. Joint work with Marek Biskup (UCLA).
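Since the result is asymptotic, the quantity in question can be illustrated numerically. Below is a minimal simulation sketch under common conventions (rate-1 exponential holding times and jumps to a uniformly chosen neighbor); the vertex encoding and the function name are hypothetical, and the exact centering and scaling constants from the theorem are not reproduced here.

```python
import random

def simulate_max_leaf_time(depth, branching=2):
    """Continuous-time simple random walk on a regular tree of the given
    depth, started from a uniformly chosen leaf and stopped on first
    hitting the root; returns the maximal total time spent at any leaf.
    Vertices are encoded as tuples of child indices, the root being ().
    (Hypothetical encoding; rate-1 holding times are one convention.)"""
    pos = tuple(random.randrange(branching) for _ in range(depth))
    leaf_time = {}
    while pos != ():                        # stop on hitting the root
        if len(pos) == depth:               # at a leaf
            leaf_time[pos] = leaf_time.get(pos, 0.0) + random.expovariate(1.0)
            pos = pos[:-1]                  # a leaf's only neighbor is its parent
        else:
            random.expovariate(1.0)         # holding time at an internal vertex (not tracked)
            k = random.randrange(branching + 1)   # parent or one of the children, uniformly
            pos = pos[:-1] if k == branching else pos + (k,)
    return max(leaf_time.values())

# Feasible only for small depths: the hitting time of the root from a
# leaf grows exponentially in the depth of the tree.
samples = [simulate_max_leaf_time(depth=8) for _ in range(100)]
```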
Monday, Jul 18, 2022, 11:15
Foundations of Computer Science Seminar, Room 155
Speaker: Omri Weinstein
Title: How Many Linear Equations are Required to Solve a Linear Program?
Abstract:

Linear programming is the backbone of algorithm design and combinatorial optimization. The fastest known algorithms for solving linear programs, both in theory and in practice, are based on Interior-Point Methods (IPMs). The core idea behind IPMs is to iteratively use Newton approximation to reduce a linear program to a sequence of ~\sqrt{n} linear systems. Whether this number of iterations can be improved is one of the major open problems in convex optimization. A long line of work has shown that \sqrt{n} is indeed optimal for the special class of *self-concordant* barrier IPMs [Nesterov-Nemirovskii '94], but for general IPMs very little is known.
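To make the "one linear system per iteration" structure concrete, here is a minimal sketch of a single Newton step for the textbook log-barrier formulation of an LP in equality form (min c^T x subject to Ax = b, x > 0). This is a generic barrier step, not the construction from the talk, and the centering/step-size schedule that yields the ~\sqrt{n} iteration count is omitted.

```python
import numpy as np

def barrier_newton_step(A, c, x, t):
    """One Newton step for min t*c^T x - sum(log x) s.t. A x = b,
    starting from a strictly feasible x (A x = b, x > 0).
    Eliminating dx from the KKT system leaves a single weighted
    linear system in the dual variable y:
        (A X^2 A^T) y = -A X^2 g,   X = diag(x),  g = t*c - 1/x,
    after which dx = -X^2 (g + A^T y)."""
    g = t * c - 1.0 / x
    X2 = x ** 2                      # diagonal of X^2
    M = (A * X2) @ A.T               # A X^2 A^T (broadcasting scales each column)
    y = np.linalg.solve(M, -(A @ (X2 * g)))
    dx = -X2 * (g + A.T @ y)
    return dx
```

Each iteration thus costs exactly one linear solve; the open question discussed in the abstract is whether roughly \sqrt{n} such solves are inherently necessary.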

We propose a purely information-theoretic query model for studying the rate of convergence of IPMs via linear-system queries: each iteration of the algorithm can adaptively specify an arbitrary diagonal matrix H (an ellipsoid) and a vector v, and the oracle returns the weighted least-squares minimizer \arg\min_x ||Ax - v||_H. We show that this model captures all known (deterministic) IPMs. Our main result is an \Omega(\sqrt{n}) lower bound on the number of iterations of any deterministic algorithm in this model, albeit for an (exponentially) ill-conditioned LP. In this talk I will describe this progress, assuming no prior knowledge of IPMs.

Based on joint work with Shunhua Jiang, Rasmus Kyng and Ming Ding.
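A single query in this model is an ordinary weighted least-squares solve. The sketch below implements one oracle call via the normal equations, assuming ||y||_H^2 = y^T H y with positive weights and A^T H A invertible; the re-weighting rule in the usage loop is purely hypothetical, chosen only to show that queries may be adaptive.

```python
import numpy as np

def linear_system_query(A, h, v):
    """One oracle call in the linear-system query model: given the fixed
    constraint matrix A, a positive diagonal weight vector h (the
    diagonal of H), and a target v, return arg min_x ||Ax - v||_H,
    where ||y||_H^2 = y^T H y.  Via the normal equations this is
    x = (A^T H A)^{-1} A^T H v (A^T H A assumed invertible)."""
    AtH = A.T * h                    # A^T H, broadcasting h across columns
    return np.linalg.solve(AtH @ A, AtH @ v)

# Toy usage: a few adaptively re-weighted queries against one matrix A.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
v = rng.standard_normal(20)
h = np.ones(20)                      # start from unweighted least squares
for _ in range(3):
    x = linear_system_query(A, h, v)
    h = 1.0 / (1.0 + (A @ x - v) ** 2)   # hypothetical re-weighting, for illustration only
```

Roughly speaking, a log-barrier IPM fits this template because its Newton step is exactly such a solve, with the diagonal weights determined by the current iterate; the lower bound in the talk applies to any deterministic algorithm making queries of this form.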