<html>
<head>
<meta http-equiv="content-type" content="text/html; charset=utf-8">
</head>
<body text="#000000" bgcolor="#FFFFFF">
<pre>INSTITUTE FOR ADVANCED STUDY
School of Mathematics
Princeton, NJ 08540
Theoretical Computer Science/Discrete Math Seminars
Week of October 2, 2017
--------------
To view mathematics in titles and abstracts, please click on the talk's link.
--------------
Monday, October 2
Computer Science/Discrete Mathematics Seminar I
Topic:                 Crossing the logarithmic barrier for dynamic boolean data structure lower bounds
Speaker:         Omri Weinstein, Columbia University
Time/Room:         11:00am - 12:15pm/S-101
Abstract Link:        <a href="http://www.math.ias.edu/seminars/abstract?event=128772">http://www.math.ias.edu/seminars/abstract?event=128772</a>
Monday, October 2
Seminar on Theoretical Machine Learning
Topic:                 Hyperparameter optimization: a spectral approach
Speaker:         Elad Hazan, Princeton University
Time/Room:         12:30pm - 1:45pm/White-Levy Room
Abstract Link:        <a href="http://www.math.ias.edu/seminars/abstract?event=133015">http://www.math.ias.edu/seminars/abstract?event=133015</a>
Tuesday, October 3
Computer Science/Discrete Mathematics Seminar II
Topic:                 Elementary open problems in Algebra (with consequences in computational complexity)
Speaker:         Avi Wigderson, Herbert H. Maass Professor, School of Mathematics
Time/Room:         10:30am - 12:30pm/S-101
Abstract Link:        <a href="http://www.math.ias.edu/seminars/abstract?event=129001">http://www.math.ias.edu/seminars/abstract?event=129001</a>
</pre>
1 Crossing the logarithmic barrier for dynamic boolean data
structure lower bounds
<br>
Omri Weinstein
<br>
<br>
<p>This paper proves the first super-logarithmic lower bounds on the
cell-probe complexity of dynamic boolean (a.k.a. decision) data
structure problems, a long-standing milestone in data structure
lower bounds.<br>
<br>
We introduce a new method for proving dynamic cell probe lower
bounds and use it to prove a $\tilde{\Omega}(\log^{1.5} n)$ lower
bound on the operational time of a wide range of boolean data
structure problems, most notably, on the query time of dynamic
range counting over $\mathbb{F}_2$ ([Pat07]). Proving an
$\omega(\lg n)$ lower bound for this problem was explicitly posed
as one of five important open problems in the late Mihai
Patrascu's obituary. This result also implies the first
$\omega(\lg n)$ lower bound for the classical 2D range counting
problem, one of the most fundamental data structure problems in
computational geometry and spatial databases. We derive similar
lower bounds for boolean versions of dynamic polynomial evaluation
and 2D rectangle stabbing, and for the (non-boolean) problems of
range selection and range median.</p>
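<p>To make the problem concrete (an illustration of ours, not of the
talk's technique): in one dimension, range counting over
$\mathbb{F}_2$ amounts to maintaining a bit array under flips while
answering range-parity queries, which a Fenwick (binary indexed) tree
handles in $O(\log n)$ cell probes per operation. The lower bounds
above concern the harder dynamic variants, such as 2D range counting,
where $\omega(\lg n)$ query time is now known to be unavoidable.</p>

```python
# Toy 1D dynamic prefix-parity structure (a Fenwick tree over F_2).
# This illustrates the *problem* being lower-bounded, not the proof.
class FenwickXor:
    def __init__(self, n):
        self.n = n
        self.tree = [0] * (n + 1)  # 1-based internal array

    def flip(self, i):
        """Flip the bit at position i (1-based); O(log n) cell probes."""
        while i <= self.n:
            self.tree[i] ^= 1
            i += i & (-i)

    def prefix_parity(self, i):
        """Parity of bits 1..i; O(log n) cell probes."""
        p = 0
        while i > 0:
            p ^= self.tree[i]
            i -= i & (-i)
        return p

    def range_parity(self, a, b):
        """Parity of bits a..b."""
        return self.prefix_parity(b) ^ self.prefix_parity(a - 1)

f = FenwickXor(16)
f.flip(3); f.flip(7); f.flip(3)   # flipping position 3 twice cancels
print(f.range_parity(1, 8))       # → 1 (only bit 7 is set)
```

<p>Every operation touches $O(\log n)$ memory cells, so the 1D problem
sits below the barrier; the talk's contribution is showing that no such
logarithmic-time structure exists for the harder problems listed above.</p>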
<p>Our technical centerpiece is a new way of "weakly" simulating
dynamic data structures using efficient one-way communication
protocols with small advantage over random guessing. This
simulation involves a surprising excursion to low-degree
(Chebyshev) polynomials, which may be of independent interest, and
offers an entirely new algorithmic angle on the "cell sampling"
method.</p>
<p>Joint work with Kasper Green Larsen and Huacheng Yu.</p>
<a href="http://www.math.ias.edu/seminars/abstract?event=128772">http://www.math.ias.edu/seminars/abstract?event=128772</a><br>
<br>
2 Hyperparameter optimization: a spectral approach
<br>
Elad Hazan
<br>
<br>
<p>Modern machine learning algorithms often involve complicated
models with tunable parameters, called hyperparameters, that must be
set before training. Hyperparameter tuning/optimization is
considered an art. (See e.g. <a href="http://www.argmin.net/2016/06/20/hypertuning/">http://www.argmin.net/2016/06/20/hypertuning/</a>.)</p>
<p>We give a simple, fast algorithm for hyperparameter optimization
inspired by techniques from the analysis of Boolean functions. We
focus on the high-dimensional regime where the canonical example
is training a neural network with a large number of
hyperparameters. The algorithm - an iterative application of
compressed sensing techniques for orthogonal polynomials -
requires only uniform sampling of the hyperparameters and is thus
easily parallelizable. Experiments for training deep nets on
CIFAR-10 show that, compared to state-of-the-art tools (e.g.,
Hyperband and Spearmint), our algorithm finds significantly
improved solutions. Additionally, our method comes with provable
guarantees and yields a quasi-polynomial time algorithm for
learning decision trees under the uniform distribution with the
best known sample complexity.</p>
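<p>To give a flavor of the orthogonal-polynomial viewpoint (a
simplified sketch of ours, not the paper's algorithm): under the
uniform distribution on $\{-1,1\}^n$, the parity functions
$\chi_S(x) = \prod_{i \in S} x_i$ are orthonormal, so each Fourier
coefficient $\hat f(S) = \mathbb{E}[f(x)\chi_S(x)]$ can be estimated
from uniform samples alone; if the objective is close to a sparse
low-degree polynomial, thresholding these estimates recovers it.</p>

```python
# Simplified sketch: recover a sparse, low-degree Fourier expansion of a
# hypothetical objective from uniform samples by direct coefficient
# estimation. (The paper uses compressed sensing; plain empirical
# estimation suffices for this toy case.)
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 8  # number of binary hyperparameters, encoded as +/-1

def objective(X):
    # Hypothetical objective with a sparse degree-2 spectrum:
    # 0.2 + 0.7*x_0 - 0.5*x_1*x_2
    return 0.2 + 0.7 * X[:, 0] - 0.5 * X[:, 1] * X[:, 2]

m = 4000
X = rng.choice([-1.0, 1.0], size=(m, n))  # uniform samples
y = objective(X)

# Estimate every Fourier coefficient of degree <= 2 by empirical
# correlation with the corresponding parity function.
estimates = {}
for d in range(3):
    for S in itertools.combinations(range(n), d):
        chi = np.prod(X[:, list(S)], axis=1)  # chi of the empty set is 1
        estimates[S] = float(np.mean(y * chi))

# Keep only the large coefficients: the recovered sparse expansion.
recovered = {S: round(c, 2) for S, c in estimates.items() if abs(c) > 0.1}
print(recovered)
```

<p>Because the estimates need only uniform samples of the
hyperparameters, the objective evaluations can be collected fully in
parallel, which is the property the abstract highlights.</p>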
<p>The talk covers this paper: <a href="https://arxiv.org/abs/1706.00764">https://arxiv.org/abs/1706.00764</a>,
as well as work in progress.</p>
<a href="http://www.math.ias.edu/seminars/abstract?event=133015">http://www.math.ias.edu/seminars/abstract?event=133015</a><br>
<br>
3 Elementary open problems in Algebra (with consequences in
computational complexity)
<br>
Avi Wigderson
<br>
<br>
<p>I will survey some elementary (to state!) problems on groups,
matrices, and tensors, and discuss their motivations arising from
several major problems in computational complexity theory. On each
problem there has been exciting recent progress, which raises hope
that it can be resolved. No special background will be assumed.
</p>
<a href="http://www.math.ias.edu/seminars/abstract?event=129001">http://www.math.ias.edu/seminars/abstract?event=129001</a><br>
<br>
Computer Science/Discrete Math Seminars can be found on our web
page:<br>
<br>
<a href="http://www.math.ias.edu/csdm">http://www.math.ias.edu/csdm</a><br>
<a href="http://www.math.ias.edu">http://www.math.ias.edu</a>
</body>
</html>