[iasmath-seminars] Additional Seminar on TML at Princeton University--Wednesday, March 6
Kristina Phillips
kphillips at ias.edu
Tue Mar 5 10:20:22 EST 2019
Wednesday, March 6
Seminar on Theoretical Machine Learning
Topic: Exponentiated Gradient Meets Gradient Descent
Speaker: Will Grathwohl, University of Toronto
Time/Room: 1:30pm - 2:30pm/Princeton University, CS 302
Abstract Link: <http://www.math.ias.edu/seminars/abstract?event=143646>
(Stochastic) gradient descent and the multiplicative update method are
probably the two most popular algorithms in machine learning. We introduce
and study a new regularization that unifies the additive and multiplicative
updates. This regularization is derived from a hyperbolic analogue of the
entropy function, which we call hypentropy. It is motivated by a natural
extension of the multiplicative update to negative numbers. The hypentropy
has a natural spectral counterpart, which we use to derive a family of
matrix-based updates that bridge gradient methods and the multiplicative
method for matrices. While the latter is applicable only to positive
semi-definite matrices, the spectral hypentropy method can naturally be used
with general rectangular matrices. We analyze the new family of updates by
deriving tight regret bounds. We empirically study the applicability of the
new update in settings such as multiclass learning, where the parameters
constitute a general rectangular matrix.
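To make the interpolation between additive and multiplicative updates concrete, here is a minimal sketch of a mirror-descent step under one commonly used form of the hypentropy potential, phi_beta(x) = sum_i [ x_i*asinh(x_i/beta) - sqrt(x_i^2 + beta^2) ]. The exact potential and the demo problem below are assumptions for illustration, not taken verbatim from the talk:

```python
import math

def hypentropy_update(x, grad, lr, beta):
    """One mirror-descent step under an assumed hypentropy potential.

    The potential's gradient is asinh(x_i / beta), so the step has the
    closed form  x_i <- beta * sinh(asinh(x_i / beta) - lr * g_i).
    Intuitively, a large beta makes the map nearly linear (additive,
    gradient-descent-like updates), while a small beta makes it behave
    like a signed multiplicative/exponentiated-gradient update.
    """
    return [beta * math.sinh(math.asinh(xi / beta) - lr * gi)
            for xi, gi in zip(x, grad)]

# Toy demo: minimize f(x) = 0.5 * ||x - target||^2 (a hypothetical
# objective chosen only to exercise the update; note target has a
# negative entry, which the plain multiplicative update cannot reach).
target = [1.0, -2.0, 0.5]
x = [0.0, 0.0, 0.0]
for _ in range(200):
    grad = [xi - ti for xi, ti in zip(x, target)]
    x = hypentropy_update(x, grad, lr=0.1, beta=1.0)
print(x)
```

The negative coordinate in `target` illustrates the abstract's point: the update extends multiplicative-style dynamics to negative numbers, since `sinh` is odd and maps all of the real line.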