Department of Mathematics

Seminar Calendar
for events on Thursday, April 8, 2021.

More information on this calendar program is available.
Questions regarding events or the calendar should be directed to Tori Corkery.
      March 2021             April 2021              May 2021      
 Su Mo Tu We Th Fr Sa   Su Mo Tu We Th Fr Sa   Su Mo Tu We Th Fr Sa
     1  2  3  4  5  6                1  2  3                      1
  7  8  9 10 11 12 13    4  5  6  7  8  9 10    2  3  4  5  6  7  8
 14 15 16 17 18 19 20   11 12 13 14 15 16 17    9 10 11 12 13 14 15
 21 22 23 24 25 26 27   18 19 20 21 22 23 24   16 17 18 19 20 21 22
 28 29 30 31            25 26 27 28 29 30      23 24 25 26 27 28 29
                                               30 31               

Thursday, April 8, 2021

11:00 am in Zoom, Thursday, April 8, 2021

Deformations of symplectic foliations

Marco Zambon (KU Leuven)

Abstract: Symplectic foliations and regular Poisson structures are the same thing. Taking the latter point of view, we exhibit an algebraic structure that governs the deformations of symplectic foliations, i.e., one that allows us to describe the space of symplectic foliations near a given one. Using this, we will address the question of when it is possible to prolong a first-order deformation to a smooth path of symplectic foliations. We will be especially interested in the relation to the underlying foliation. This is joint work in progress with Stephane Geudens and Alfonso Tortorella.
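
For readers outside the area, the correspondence the abstract opens with can be sketched as follows; this is a minimal sketch using standard definitions, not material from the talk:

    % A Poisson structure: a bivector field with vanishing Schouten bracket.
    \pi \in \Gamma(\wedge^2 TM), \qquad [\pi, \pi] = 0
    % Regularity: the induced anchor map has constant rank.
    \operatorname{rank}\bigl( \pi^\sharp \colon T^*M \to TM \bigr) = \text{const.}
    % The image of the anchor is an involutive distribution, hence defines a
    % foliation, and each leaf carries a symplectic form determined
    % (up to sign convention) by
    \omega\bigl( \pi^\sharp \alpha, \, \pi^\sharp \beta \bigr) = \pi(\alpha, \beta).

Conversely, a foliation equipped with leafwise symplectic forms assembles into a regular Poisson bivector, which is the identification the talk takes as its starting point.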

3:00 pm in Zoom, Thursday, April 8, 2021

Recent advances in analysis of implicit bias of gradient descent on deep networks

Matus Telgarsky (UIUC)

Abstract: The purpose of this talk is to highlight three recent directions in the study of implicit bias, a promising approach to developing a tight generalization theory for deep networks interwoven with optimization. The first direction is a warm-up with purely linear predictors: here, the implicit bias perspective gives the fastest known hard-margin SVM solver! The second direction is on the early training phase with shallow networks: here, implicit bias leads to good training and testing error, with not just narrow networks but also arbitrarily large ones. The talk concludes with deep networks, providing a variety of structural lemmas that capture foundational aspects of how weights evolve for any width and sufficiently large amounts of training. This is joint work with Ziwei Ji.
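
As a toy illustration of the first direction (the known implicit-bias phenomenon for linear predictors, not the speaker's algorithm): on linearly separable data, gradient descent on an exponential-type loss drives the loss to zero while the normalized iterate w/||w|| converges to the hard-margin SVM direction. A minimal numpy sketch; the data, step size, and iteration count are invented for the demo:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy linearly separable data in 2D with labels in {-1, +1},
    # pushed apart along a ground-truth direction to guarantee a margin.
    n = 200
    X = rng.normal(size=(n, 2))
    w_star = np.array([2.0, 1.0])
    y = np.sign(X @ w_star)
    X = X + 0.5 * y[:, None] * w_star / np.linalg.norm(w_star)

    # Plain gradient descent on the empirical exponential loss
    # (1/n) * sum_i exp(-y_i <w, x_i>).
    w = np.zeros(2)
    lr = 0.1
    for _ in range(20000):
        margins = y * (X @ w)
        grad = -(np.exp(-margins)[:, None] * (y[:, None] * X)).mean(axis=0)
        w -= lr * grad

    # The norm ||w|| grows without bound, but the *direction* of w
    # approaches the maximum-margin separator.
    direction = w / np.linalg.norm(w)
    print("direction:", direction)
    print("minimum normalized margin:", (y * (X @ direction)).min())

The exponential loss is used here because the directional-convergence analysis is cleanest for it; the logistic loss exhibits the same asymptotic behavior.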

To register: https://berkeley.zoom.us/webinar/register/WN_iEXcldw1QPOuUofhS0WT4g