<div dir="ltr">A talk of possible interest<br>
<br>
<div class="gmail_quote">
<div dir="ltr" class="gmail_attr">---------- Forwarded message
---------<br>
From: <strong class="gmail_sendername" dir="auto">Gina M.
Holland</strong> <span dir="ltr"><<a
href="mailto:gholland@princeton.edu"
moz-do-not-send="true">gholland@princeton.edu</a>></span><br>
Date: Wed, 27 Feb 2019 at 11:28<br>
Subject: NEXT: PACM COLLOQUIUM - Monday, March 4 - Shay
Moran (Princeton University)<br>
To: <a href="mailto:pacmdistribution@math.princeton.edu"
moz-do-not-send="true">pacmdistribution@math.princeton.edu</a>
<<a href="mailto:pacmdistribution@math.princeton.edu"
moz-do-not-send="true">pacmdistribution@math.princeton.edu</a>><br>
</div>
<br>
<br>
<div link="blue" vlink="purple" lang="EN-US">
<div class="m_4967748544967758751WordSection1">
<p class="MsoNormal"><span
style="font-size:12.0pt;font-family:"Times New
Roman",serif">The next PACM Colloquium will be
given by
<b>Shay Moran (Princeton University)</b></span><b><span
style="font-size:9.0pt;font-family:"Times New
Roman",serif"></span></b></p>
<p class="MsoNormal"><span
style="font-size:12.0pt;font-family:"Times New
Roman",serif"><br>
=================<br>
<b>DATE/LOCATION: Monday, March 4 2019, 4:00 – 5:00 PM
/ 214 Fine Hall</b><br>
<br>
SPEAKER: <b>Shay Moran (Princeton University)</b></span><b><span
style="font-size:9.0pt;font-family:"Times New
Roman",serif"></span></b></p>
<p class="MsoNormal"><span
style="font-size:12.0pt;font-family:"Times New
Roman",serif"><br>
TITLE: </span><b><span
style="font-size:12.0pt;font-family:"Times New
Roman",serif">The Optimal Approximation Factor
in Density Estimation</span></b><span
style="font-size:12.0pt;font-family:"Times New
Roman",serif"></span></p>
<p class="MsoNormal"><span
style="font-size:12.0pt;font-family:"Times New
Roman",serif"> </span></p>
<p class="MsoNormal"><span
style="font-size:12.0pt;font-family:"Times New
Roman",serif">ABSTRACT:</span> </p>
<p class="MsoNormal"> </p>
<div style="border:none;border-bottom:double windowtext
2.25pt;padding:0in 0in 1.0pt 0in">
<p class="MsoNormal"><span
style="font-size:12.0pt;font-family:"Times New
Roman",serif">Consider the following problem:
given arbitrary densities q1,q2 and a sample-access
to an unknown target density p, find which of the
qi's is closer to p in the total variation distance.
</span></p>
<p class="MsoNormal"><span
style="font-size:8.0pt;font-family:"Times New
Roman",serif"> </span></p>
<p class="MsoNormal"><span
style="font-size:12.0pt;font-family:"Times New
Roman",serif">A beautiful (and simple) result
due to Yatracos (1985) shows that this problem is
tractable in the following sense: there exists an
algorithm that uses O(epsilon^{-2}) samples from p
and outputs qi such that with high
probability, TV(qi,p) <= 3*OPT + epsilon, where
OPT= min{TV(q1,p),TV(q2,p)}. Moreover, this result
extends to any finite class of densities: there
exists an algorithm that outputs the best density in
Q up to a multiplicative approximation factor of 3.</span></p>
<p class="MsoNormal"><span
style="font-size:8.0pt;font-family:"Times New
Roman",serif"> </span></p>
<p class="MsoNormal"><span
style="font-size:12.0pt;font-family:"Times New
Roman",serif">We complement and extend this
result by showing that: (i) the factor 3 cannot be
improved if one restricts the algorithm to output a
density from Q, and (ii) if one allows the algorithm
to output arbitrary densities (e.g. a mixture of
densities from Q), then the approximation factor can
be reduced to 2, which is optimal. In particular
this demonstrates an advantage of improper learning
over proper in this setup.</span></p>
<p class="MsoNormal"><span
style="font-size:8.0pt;font-family:"Times New
Roman",serif"> </span></p>
<p class="MsoNormal"><span
style="font-size:12.0pt;font-family:"Times New
Roman",serif">Our algorithms rely on
estimating carefully chosen surrogates metrics to
the total variation, and our sample complexity
bounds exploit techniques from Adaptive Data
Analysis.</span></p>
<p class="MsoNormal"><span
style="font-size:12.0pt;font-family:"Times New
Roman",serif">Joint work with Olivier Bousquet
(Google brain) and Daniel Kane (UCSD).</span></p>
<p class="MsoNormal"><i><span
style="font-size:12.0pt;font-family:"Times
New Roman",serif;color:#1f497d"> </span></i></p>
<p class="MsoNormal"><i><span
style="font-size:12.0pt;font-family:"Times
New Roman",serif;color:#7f7f7f">Shay Moran is
a Postdoctoral fellow at the Computer Science
Department in Princeton University. He graduated
from the Technion in September ’16. During 2017 he
was a postdoctoral fellow at UCSD and at the
Simons Institute in Berkeley. During 2018 he was a
member at the Institute for Advanced Study. In
October ’19 he will join the Math Department at
the Technion as an assistant Professor. Shay’s
research interests revolves around mathematical
problems that arise in computer science, with a
focus on combinatorial problems related to machine
learning.</span></i></p>
<p class="MsoNormal"><i> </i></p>
</div>
<p class="MsoNormal"><i><span style="color:#1f497d"> </span></i></p>
<p class="MsoNormal"><span style="color:#1f497d"><img
style="width:1.0in;height:1.0in"
id="m_4967748544967758751Picture_x0020_1"
src="cid:part4.EF3808C6.5D1CE910@math.ias.edu"
alt="Macintosh HD:Users:theresa:Desktop:PACM
logo:FINAL LOGO:PACM-logo.png" class="" width="96"
height="96"></span></p>
<p class="MsoNormal"><span style="color:#1f497d"><img
style="width:1.3333in;height:.5in"
id="m_4967748544967758751Picture_x0020_2"
src="cid:part5.499A4B18.51F59690@math.ias.edu"
alt="Macintosh
HD:Users:theresa:Downloads:PUsig2-pkg:Links:Black-and-white:Black:PUsig2-bw-bs.pdf"
class="" width="128" height="48"></span></p>
<p class="MsoNormal"><span style="color:#1f497d"> </span></p>
<p class="MsoNormal"> </p>
<p class="MsoNormal"><span style="color:#1f497d"> </span></p>
</div>
</div>
</div>
</div>
</div>
</body>
</html>