[CSDM] Fwd: [Theory-Read] Meet Gregory Valiant this Friday!
Avi Wigderson
avi at ias.edu
Tue Oct 15 16:12:23 EDT 2019
Hi all,
Perhaps the last mailing list I'll bother you with is
theory-read <theory-read at lists.cs.princeton.edu>,
which I am sure most of you subscribe to already (in particular it
announces the PU theory seminar below, as well as other local PU TCS
events). And again, Greg Valiant is an excellent person to hear (and
they serve lunch...)
Best,
Avi
-------- Forwarded Message --------
Subject: [Theory-Read] Meet Gregory Valiant this Friday!
Date: Tue, 15 Oct 2019 15:24:24 -0400 (EDT)
From: Yufei Zheng <yufei at cs.princeton.edu>
To: theory-read <theory-read at lists.cs.princeton.edu>
Hello everyone,
From this week on I'll take over part of the theory lunch scheduling
from Raghuvansh.
We have Gregory Valiant coming this week for theory lunch! He is around
and would love to meet people. Slots are available from 9am-6pm on
Friday. Please let me know what times work for you. We might have a
student session if there's interest.
Additionally, Gregory would be around for dinner on Friday too! Let me
know if you're interested.
Best,
Yufei Zheng
----- Forwarded Message -----
From: "Raghuvansh R. Saxena" <rrsaxena at cs.princeton.edu>
To: "theory-read" <theory-read at lists.cs.princeton.edu>
Cc: "valiant" <valiant at stanford.edu>
Sent: Tuesday, October 15, 2019 3:00:00 PM
Subject: [Theory-Read] Gregory Valiant @ Theory Lunch this Friday!
Hello everyone,
We are delighted to have Gregory Valiant @ Theory Lunch this Friday!
The food will be served at 11:45 am, and the talk will start at 12 pm.
Please come early for lunch so that the talk starts on time.
Location of the talk: 194 Nassau, Suite 21 lobby (outside room 251).
Location of the lunch: 194 Nassau, Suite 22 Kitchen (outside room 213).
The details of the talk are below.
See you there, Raghuvansh
Title: New Problems and Perspectives on Learning, Sampling, and Memory,
in the Small Data Regime
Abstract: I will discuss several new problems related to the general
challenge of understanding what conclusions can be made, given a dataset
that is relatively small in comparison to the complexity or
dimensionality of the underlying distribution from which it is drawn. In
the first setting we consider the problem of learning a population of
Bernoulli (or multinomial) parameters. This is motivated by the
"federated learning" setting where we have data from a large number of
heterogeneous individuals, who each supply a very modest amount of data,
and ask the extent to which the number of data sources can compensate
for the lack of data from each source. Second, I will introduce the
problem of data "amplification". Given n independent draws from a
distribution, D, to what extent is it possible to output a set of m > n
datapoints that are indistinguishable from m i.i.d. draws from D?
Curiously, we show that nontrivial amplification is often possible in
the regime where n is too small to learn D to any nontrivial accuracy.
We also discuss connections between this setting and the challenge of
interpreting the behavior of GANs and other ML/AI systems. Finally (if
there is time), I will also discuss memory/data tradeoffs for
regression, with the punchline that any algorithm that uses a
subquadratic amount of memory will require asymptotically more data than
second-order methods to achieve comparable accuracy. This talk is based
on four joint papers with various subsets of Weihao Kong, Brian Axelrod,
Shivam Garg, Vatsal Sharan, Aaron Sidford, Sham Kakade, and Ramya Vinayak.
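As a toy illustration of the first setting described in the abstract (a hypothetical sketch, not code from the talk): each source contributes only t = 2 coin flips, far too few to estimate any individual bias p_i, yet a simple method-of-moments estimator recovers the mean and variance of the *population* of biases because the number of sources compensates for the scarcity of data per source. The population distribution used here (Uniform(0.2, 0.8)) is an arbitrary choice for the simulation.

```python
import random

def simulate_and_estimate(num_sources=100_000, t=2, seed=0):
    """Each source i has an unknown bias p_i drawn from some population
    distribution; we observe only t coin flips per source. Method of
    moments estimates the population mean and variance of the p_i even
    though t is far too small to learn any single p_i."""
    rng = random.Random(seed)
    m1 = m2 = 0.0
    for _ in range(num_sources):
        p = rng.uniform(0.2, 0.8)  # hypothetical population of biases
        k = sum(rng.random() < p for _ in range(t))  # t flips, k heads
        m1 += k / t                            # unbiased for p
        m2 += k * (k - 1) / (t * (t - 1))      # unbiased for p**2
    m1 /= num_sources
    m2 /= num_sources
    return m1, m2 - m1 ** 2  # estimated E[p] and Var(p)

mean_p, var_p = simulate_and_estimate()
```

With Uniform(0.2, 0.8) the true values are E[p] = 0.5 and Var(p) = 0.36/12 = 0.03, and the estimates concentrate around them as the number of sources grows, even with only two flips per source.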
_______________________________________________
Theory-Read mailing list
Theory-Read at lists.cs.princeton.edu
https://lists.cs.princeton.edu/mailman/listinfo/theory-read