Last Fridays Talks: Learning Theory and Optimization

Last Fridays Talks 

On the last Friday of each month, we host the Last Fridays Talks, where one of our seven Collaboratories presents insights from its current work. Join us for a discussion of results and recent papers, followed by socializing for everyone who wishes to attend.

Sign up to join in person or online.


Title

The Sample Complexity of ERMs in Stochastic Convex Optimization


Abstract 

Stochastic convex optimization is one of the most well-studied models of learning in modern machine learning. Nevertheless, a fundamental question in this setting remained unresolved: “How many data points must be observed so that any Empirical Risk Minimizer (ERM) performs well on the true population?” This question was posed by Feldman (2016), who proved that Ω(d/ϵ + 1/ϵ²) data points are necessary (where d is the dimension and ϵ > 0 is the accuracy parameter). Proving an ω(d/ϵ + 1/ϵ²) lower bound was left as an open problem. In this work we show that, in fact, Õ(d/ϵ + 1/ϵ²) data points are also sufficient. This settles the question and yields a new separation between ERMs and uniform convergence. The proof builds a mechanism for controlling the behavior of stochastic convex optimization problems.

Based on work with Carmon and Livni.
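To make the object of study concrete, here is a toy sketch (not from the talk; the problem instance and all names are illustrative) of empirical risk minimization in a stochastic convex problem: mean estimation in dimension d, where the ERM is the sample mean and its excess population risk shrinks with the number of samples n.

```python
import numpy as np

# Toy stochastic convex objective: F(w) = E_z[ ||w - z||^2 ] with z ~ D in
# dimension d. The unique ERM over n i.i.d. samples is the sample mean, and
# its excess population risk is F(w_erm) - min_w F(w) = ||w_erm - mu||^2,
# which is d/n in expectation.
rng = np.random.default_rng(0)
d, n = 5, 10_000
mu = rng.normal(size=d)              # population mean (unknown to the learner)

z = rng.normal(loc=mu, size=(n, d))  # n i.i.d. samples from D
w_erm = z.mean(axis=0)               # the empirical risk minimizer

excess_risk = float(np.sum((w_erm - mu) ** 2))
print(excess_risk)  # small: on the order of d/n
```

In this easy instance the ERM generalizes at the uniform-convergence rate; the talk concerns general stochastic convex optimization, where the gap between the sample complexity of ERMs and of uniform convergence is precisely what is at stake.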


Speaker

Amir Yehudayoff


Read more about the Collaboratory Learning Theory and Optimization here.