Florian Rossmannek

About Me

I am an Eric and Wendy Schmidt AI in Science Postdoctoral Fellow at NTU Singapore, where I do research on the mathematics of machine learning. My mentor is Juan-Pablo Ortega. I obtained my doctoral degree from ETH Zurich in March 2023 under the supervision of Patrick Cheridito and Arnulf Jentzen.

Contact

NTU Singapore, SPMS-MAS-05-32, 21 Nanyang Link, 637371 Singapore

florian [dot] rossmannek [at] ntu [dot] edu [dot] sg

Research Interests

I work on the mathematical foundations of machine learning, with a recent focus on dynamical and time-series settings. I am particularly interested in dynamical and stochastic properties of state-space systems, which provide a general modeling framework. Prior to that, I worked on approximation and optimization problems in (static) neural network theory.

Research Articles

Below is a complete list of my publications and preprints. See also my Google Scholar profile here and my ORCID records here.

  • Gradient descent provably escapes saddle points in the training of shallow ReLU networks (with P. Cheridito and A. Jentzen), J. Optim. Theory Appl., vol. 203 (2024) [journal version, arXiv]
  • Landscape analysis for shallow neural networks: complete classification of critical points for affine target functions (with P. Cheridito and A. Jentzen), J. Nonlinear Sci., vol. 32, art. 64 (2022) [journal version, arXiv]
  • A proof of convergence for gradient descent in the training of artificial neural networks for constant target functions (with P. Cheridito, A. Jentzen, and A. Riekert), J. Complexity, vol. 72 (2022) [journal version, arXiv]
  • Non-convergence of stochastic gradient descent in the training of deep neural networks (with P. Cheridito and A. Jentzen), J. Complexity, vol. 64 (2021) [journal version, arXiv]
  • Efficient approximation of high-dimensional functions with neural networks (with P. Cheridito and A. Jentzen), IEEE Trans. Neural Netw. Learn. Syst., vol. 33, no. 7 (2022) [journal version, arXiv]
  • Fading memory and the convolution theorem (with J-P. Ortega) (2024) [arXiv]
  • State-Space Systems as Dynamic Generative Models (with J-P. Ortega) (2024) [arXiv]
  • Efficient Sobolev approximation of linear parabolic PDEs in high dimensions (with P. Cheridito) (2023) [arXiv]
  • An exercise in combinatorics: Christmas Stars (2021) [link]
  • PhD thesis: The curse of dimensionality and gradient-based training of neural networks: shrinking the gap between theory and applications (2023) [link]
  • MSc thesis: Magnetic and Exotic Anosov Hamiltonian Structures (2019) [link]
  • MSc project: Currents in Geometry and Analysis (2019) [link]
  • MSc project: The Moduli Space of Hyperbolic Surfaces, Analytic Teichmüller Theory, and the Pants Graph (2018) [link]
  • BSc thesis: An Introduction to Complex Dynamics and the Mandelbrot Set (2017) [link]

Talks

  • 2024, International Conference on Scientific Computation and Differential Equations (SciCADE), Minisymposium on Geometric and Multiscale Methods for High-Dimensional Dynamics
  • 2024, ETH Zurich, Talks in Financial and Insurance Mathematics
  • 2024, NTU Singapore, Mini-Workshop on Machine Learning Theory and Methodology
  • 2024, NTU Singapore, Workshop on Geometrically Guided Analysis and Design in Optimization and Control
  • 2023, ETH Zurich, Stochastic Finance Group Seminar
  • 2021, ETH Zurich, Post/Doctoral Seminar in Mathematical Finance

Teaching

At ETH Zurich, I was involved with various courses as a teaching assistant or coordinator. A complete list of courses is given below.

  • Fall 2022: coordinator for Mathematics I (D-BIOL/CHAB/HEST)
  • Spring 2022: coordinator for Probability and Statistics (D-MATH)
  • Spring 2021: coordinator for Mathematics II (D-BIOL/CHAB/HEST)
  • Fall 2019: co-organizer of an undergraduate seminar on machine learning (D-MATH)
  • Fall 2018: teaching assistant for Analysis I (D-MATH)
  • Fall 2017: teaching assistant for Algorithms and Complexity (D-INFK)
  • Spring 2017: teaching assistant for Topology (D-MATH)
  • Fall 2016: teaching assistant for Algorithms and Complexity (D-INFK)