Department of Mathematics

Seminar Calendar for Mathematics and Machine Learning events, the year of Tuesday, January 1, 2019.


Questions regarding events or the calendar should be directed to Tori Corkery.


Tuesday, January 22, 2019

4:00 pm in 243 Altgeld Hall, Tuesday, January 22, 2019

Organizational Meeting

George Francis (University of Illinois at Urbana–Champaign)

Abstract: Kay Kirkpatrick and George Francis invite you to join this seminar on machine learning (ML). It will be more of a mathematical learning collective than a show-and-tell venue. It meets in 243 Altgeld Hall on Tuesdays at 4 pm except when departmental events (colloquia, MSS and named lectures, spring departmental meeting) are held. Faculty, students, staff, and visitors are welcome. Our goal is to read and ponder papers, and to ask each other many more questions than we expect to answer. For this organizational meeting we plan to collect topics you are interested in and start a list of papers that might contain the answers. Please bring references to papers or websites you would like to study, either actively or passively. This way we might be able to come up with a tentative schedule of events.

Tuesday, January 29, 2019

4:00 pm in 243 Altgeld Hall, Tuesday, January 29, 2019

Backprop in Neural Nets and Automatic Differentiation

George Francis (University of Illinois at Urbana–Champaign)

Abstract: In 1988 Rumelhart et al. brought backpropagation into prominence throughout the connectionist school of AI (neural nets, hidden layers, deep learning, etc.). The technique had been used earlier, but had remained obscure until then. Now, three decades later, backprop is a well-established component of ML theory and practice, but it often comes wrapped in dense mathematical obscurity. In my latter-day efforts to understand backprop I finally found some comprehensible answers in Baydin, Pearlmutter, Radul, and Siskind's survey paper "Automatic Differentiation in Machine Learning: a Survey", J. Machine Learning Research 18 (2018), pp. 1–43. I hope to pass along what I learned by working through a very illuminating example, leaving the context and (informal) definitions to the ample Q/A part of the seminar. For more information about our seminar, please see its webpage at http://new.math.uiuc.edu/MathMLseminar/
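
The flavor of reverse-mode AD can be conveyed with the survey's running example, y = ln(x1) + x1·x2 − sin(x2): each elementary operation records its local derivative, and a single backward sweep accumulates all partials. The tiny implementation below is my own sketch of the idea, not code from the paper.

```python
import math

class Var:
    """Minimal reverse-mode AD node: a value plus links to parents
    annotated with the local derivative along each link."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents    # pairs (parent Var, local gradient)
        self.grad = 0.0

    def backward(self, seed=1.0):
        # accumulate the incoming adjoint, then push it upstream
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

# elementary operations record their local derivatives
def add(a, b): return Var(a.value + b.value, [(a, 1.0), (b, 1.0)])
def sub(a, b): return Var(a.value - b.value, [(a, 1.0), (b, -1.0)])
def mul(a, b): return Var(a.value * b.value, [(a, b.value), (b, a.value)])
def ln(a):     return Var(math.log(a.value), [(a, 1.0 / a.value)])
def sin(a):    return Var(math.sin(a.value), [(a, math.cos(a.value))])

# y = ln(x1) + x1*x2 - sin(x2), evaluated at (x1, x2) = (2, 5)
x1, x2 = Var(2.0), Var(5.0)
y = sub(add(ln(x1), mul(x1, x2)), sin(x2))

y.backward()        # one reverse sweep yields both partials at once
print(y.value, x1.grad, x2.grad)
```

The backward sweep computes dy/dx1 = 1/x1 + x2 and dy/dx2 = x1 − cos(x2) without ever forming those symbolic expressions, which is exactly what makes reverse mode cheap for functions with many inputs and one output.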

Tuesday, February 5, 2019

4:05 pm in 243 Altgeld Hall, Tuesday, February 5, 2019

Backprop in NN and AD cont'd

George Francis (University of Illinois at Urbana–Champaign)

Abstract: I will finish presenting some items from last week's handout. In particular I hope to explain just how Trask's updating of the weights in his program for a machine to learn XOR might be derived from Pearlmutter and Siskind's reverse automatic differentiation recipe. This won't take the entire time, and I hope to answer questions and ask a few myself. There will be no new items introduced, and the seminar may end early. The temperature is forecast to be 53 °F, but with showers.
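
For readers wanting a preview of the connection, here is a hedged sketch in the spirit of Trask's XOR demo: the two "delta" lines are precisely a hand-written reverse sweep through the two-layer network. The layer width, random seed, and iteration count are illustrative choices of mine, not Trask's exact script.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# XOR training set
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# one hidden layer of width 4 (an illustrative choice)
W0 = rng.uniform(-1.0, 1.0, (2, 4))
W1 = rng.uniform(-1.0, 1.0, (4, 1))

for _ in range(60000):
    # forward pass
    h = sigmoid(X @ W0)
    out = sigmoid(h @ W1)
    # reverse sweep: each delta is the loss gradient at that layer,
    # obtained by the chain rule -- reverse-mode AD done by hand
    out_delta = (y - out) * out * (1.0 - out)
    h_delta = (out_delta @ W1.T) * h * (1.0 - h)
    # gradient-descent weight updates (unit learning rate)
    W1 += h.T @ out_delta
    W0 += X.T @ h_delta

print(np.round(out.ravel(), 3))   # should approach [0, 1, 1, 0]
```

The point of the seminar discussion is that these hand-derived delta rules fall out mechanically from the reverse-AD recipe applied to the squared loss.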

Tuesday, February 12, 2019

4:00 pm in Altgeld Hall, Tuesday, February 12, 2019

No Seminar Today

Abstract: To encourage faculty members of the seminar to join the 4 pm departmental discussion of the math building design in 245, we won't have a seminar today. The seminar will resume next week at the usual time and location.

Monday, February 18, 2019

1:00 pm in Altgeld Hall, Monday, February 18, 2019

To Be Announced

Tuesday, February 19, 2019

4:00 pm in 243 Altgeld Hall, Tuesday, February 19, 2019

Learnability Can Be Undecidable

Jacob Trauger (University of Illinois at Urbana–Champaign)

Abstract: This seminar will be on the paper by Shai Ben-David et al., Nature Machine Intelligence, vol. 1, Jan. 2019, pp. 44–48. The authors' abstract reads: "The mathematical foundations of machine learning play a key role in the development of the field. They improve our understanding and provide tools for designing new learning paradigms. The advantages of mathematics, however, sometimes come with a cost. Gödel and Cohen showed, in a nutshell, that not everything is provable. Here we show that machine learning shares this fate. We describe simple scenarios where learnability cannot be proved nor refuted using the standard axioms of mathematics. Our proof is based on the fact that the continuum hypothesis cannot be proved nor refuted. We show that, in some cases, a solution to the ‘estimating the maximum’ problem is equivalent to the continuum hypothesis. The main idea is to prove an equivalence between learnability and compression."
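
The "estimating the maximum" (EMX) problem mentioned in the abstract can be stated roughly as follows; this paraphrase, notation included, is a reconstruction for orientation, not the authors' exact formulation.

```latex
% Rough statement of EMX learnability (a paraphrase, not the paper's
% exact definitions): given a class $\mathcal{F}$ of functions
% $f \colon X \to \{0,1\}$ and i.i.d.\ samples from an unknown
% distribution $P$ over $X$, output some $f \in \mathcal{F}$ whose
% expectation nearly attains the supremum over the class:
\[
  \mathbb{E}_{P}[f] \;\ge\; \sup_{g \in \mathcal{F}} \mathbb{E}_{P}[g] \;-\; \varepsilon .
\]
% The paper's undecidable instance takes $X = [0,1]$ and $\mathcal{F}$
% the indicators of finite subsets of $[0,1]$; whether this class is
% EMX-learnable turns out to hinge on the continuum hypothesis.
```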

Tuesday, March 5, 2019

3:00 pm in 243 Altgeld Hall, Tuesday, March 5, 2019

Modeling Learning and Strategy Formation in Phase Transitions in Cortical Networks

Kesav Krishnan (University of Illinois at Urbana–Champaign)

Abstract: In the first of two seminars on this paper by Kozma et al. we review the experimental data and their graph-theoretic methods. In the second, we review the mathematical details and offer a critique of their results. Here is a paraphrase of the authors' abstract: Learning in mammalian brains is commonly modeled in terms of synaptic connections in a cortical network and the formation of limit cycle oscillators of a dynamical system. Learning is inferred from the re-emergence of the oscillatory regimes on repetition of the stimulus. Here the authors use random graphs and bootstrap percolation with excitatory and inhibitory vertices. The phase transition from fixed-point attractors to limit cycles (Hopf bifurcations) represents changes in cortical networks during category learning. A correspondence with analogous events in the gerbil cortex is based on experiments with electrocorticogram (ECoG) arrays. They discuss how learning leads to categorization and strategy formation, and how the theoretical modeling results can be used for designing learning and adaptation in computationally aware intelligent machines.
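
To fix ideas before the seminar, here is a toy sketch of bootstrap percolation with excitatory and inhibitory vertices on an Erdős–Rényi random graph. All parameters and the activation rule are my illustrative choices, meant only to convey the kind of model the paper studies, not the authors' exact construction.

```python
import random

def bootstrap_percolation(n=200, p=0.05, k=2, frac_inhibitory=0.2,
                          seed_size=20, steps=50, rng_seed=1):
    """Toy bootstrap percolation on G(n, p): a vertex becomes active
    once its net drive (active excitatory neighbors minus active
    inhibitory neighbors) reaches the threshold k."""
    rng = random.Random(rng_seed)
    # build adjacency lists for an Erdos-Renyi graph G(n, p)
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    inhibitory = set(rng.sample(range(n), int(frac_inhibitory * n)))
    active = set(rng.sample(range(n), seed_size))   # initially active seed
    for _ in range(steps):
        newly = set()
        for v in range(n):
            if v in active:
                continue
            drive = sum(-1 if u in inhibitory else 1
                        for u in adj[v] if u in active)
            if drive >= k:
                newly.add(v)
        if not newly:
            break          # fixed point reached: no vertex can activate
        active |= newly
    return len(active) / n

frac = bootstrap_percolation()
print(frac)    # fraction of active vertices at the fixed point
```

Varying p or the seed size around a critical value produces the abrupt all-or-nothing activation that the paper connects to phase transitions in cortical dynamics.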

Tuesday, March 12, 2019

3:00 pm in 243 Altgeld Hall, Tuesday, March 12, 2019

To Be Announced


Tuesday, March 26, 2019

3:00 pm in 243 Altgeld Hall, Tuesday, March 26, 2019

No Seminar This Week


Saturday, March 30, 2019

1:00 pm in Urbana, Saturday, March 30, 2019

Generalizing Koopman Theory to Allow for Inputs and Control

Kim, Hee Yeon (University of Illinois)

Tuesday, April 2, 2019

4:00 pm in 243 Altgeld Hall, Tuesday, April 2, 2019

Generalizing Koopman Theory to Allow for Inputs and Control

Kim, Hee Yeon (University of Illinois at Urbana–Champaign)

Abstract: The Koopman operator (Bernard Osgood Koopman, "Hamiltonian systems and transformation in Hilbert space", PNAS 17 (1931), pp. 315–318) has emerged in machine learning as a tool to reformulate nonlinear dynamics in a linear framework. I will present the paper of this title by Proctor, Brunton, and Kutz, SIAM J. Appl. Dyn. Syst., vol. 17, no. 1, pp. 909–930.

The authors introduce the Koopman Operator with inputs and control (KIC) which generalizes Koopman's spectral theory to allow for systems with nonlinear input-output characteristics. They show how this generalization is connected to dynamic mode decompositions with control (DMDc). They demonstrate KIC on several nonlinear dynamical systems, such as the standard epidemiological SIR-model for susceptible-infectious-recovered, hence resistant subjects (e.g. measles).
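
As a concrete warm-up for the DMDc connection, the core regression can be sketched in a few lines: given snapshots of a controlled linear system x_{k+1} = A x_k + B u_k, the pair [A B] is recovered by least squares on the stacked state-and-input data. The 2×2 system and all numbers below are made-up illustrations, not an example from the paper, and this omits the SVD truncation step that full DMDc uses for high-dimensional data.

```python
import numpy as np

# made-up controlled linear system x_{k+1} = A x_k + B u_k
rng = np.random.default_rng(0)
A_true = np.array([[0.9, 0.2], [0.0, 0.8]])
B_true = np.array([[0.0], [1.0]])

m = 200
X = np.zeros((2, m + 1))                 # state snapshots
U = rng.standard_normal((1, m))          # random exciting input
for k in range(m):
    X[:, k + 1] = A_true @ X[:, k] + (B_true @ U[:, k:k + 1]).ravel()

# DMDc core: solve X' = [A B] [X; U] in the least-squares sense
Omega = np.vstack([X[:, :-1], U])        # stacked states and inputs
G = X[:, 1:] @ np.linalg.pinv(Omega)     # G = [A_est  B_est]
A_est, B_est = G[:, :2], G[:, 2:]
print(np.allclose(A_est, A_true, atol=1e-6),
      np.allclose(B_est, B_true, atol=1e-6))
```

With noise-free linear data and a persistently exciting input, the regression recovers A and B exactly; the KIC paper's point is how to extend this linear-regression view to nonlinear input-output behavior via Koopman observables.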

Tuesday, April 16, 2019

4:00 pm in 243 Altgeld Hall, Tuesday, April 16, 2019

Visualizing nonlinear dynamical systems like SIR

George K Francis (University of Illinois at Urbana–Champaign)

Abstract: There is no new presentation this week. But ...

... for those of you who are interested in programming real-time interactive computer animations of nonlinear dynamical systems, like the SIR system we saw last week in Hee Yeon's seminar on Koopman's theory, I will be there to introduce you to the issues and problems involved. Recall that SIR models the epidemiological progress of three populations: Susceptible, Infected, and Recovered from the disease (think of measles or mumps).

In the first of (possibly) two such workshops I will treat the "continuous" case, which involves some (elementary) integration of 3D differential systems and their steady-states (attractors). In the (tentative) second workshop I will treat the "discrete" case, animating cellular automata, since both are relevant to the SIR model.
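
As a starting point for the "continuous" case, a fixed-step RK4 integrator for the SIR system takes only a few lines; the transmission and recovery rates below are illustrative values of mine, not numbers from the seminar.

```python
import numpy as np

# SIR dynamics:  S' = -b*S*I,  I' = b*S*I - g*I,  R' = g*I
# b (transmission rate) and g (recovery rate) are illustrative values.
def sir_rhs(state, b=0.5, g=0.1):
    S, I, R = state
    return np.array([-b * S * I, b * S * I - g * I, g * I])

def rk4_step(f, state, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([0.99, 0.01, 0.0])   # S, I, R as population fractions
trajectory = [state]
for _ in range(2000):                 # integrate out to t = 200
    state = rk4_step(sir_rhs, state, dt=0.1)
    trajectory.append(state)

print(state)   # S + I + R stays 1; the epidemic burns out (I -> 0)
```

Storing the trajectory this way is exactly what a real-time animation loop would consume, drawing one (S, I, R) point per frame; the "discrete" cellular-automaton case replaces the integrator with a local update rule on a grid.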