Florian Rossmannek
Postdoctoral Fellow | NTU Singapore
Recent Highlights
- Invited talk at the Differential Equations for Data Science online seminar
- New preprint on the arXiv
- Research visit at ETH Zurich
- New preprint on the arXiv
- Invited talk at the Quantitative Finance Conference 2025 @NUS
- Research visit at the University of Tokyo
Research Interests
I work on the mathematical foundations of machine learning, with a recent focus on dynamical and time-series settings.
I am particularly interested in dynamical and stochastic properties of state-space systems, which provide a general modeling framework.
Prior to that, I worked on approximation and optimization problems in (static) neural network theory.
Research Articles
Publications
- Fading memory and the convolution theorem (with J-P. Ortega), IEEE Trans. Autom. Control, to appear, 2025 [journal, arXiv, bib]
- State-space systems as dynamic generative models (with J-P. Ortega), Proc. R. Soc. A, vol 481(2309), 2025 [journal, arXiv, bib]
- Gradient descent provably escapes saddle points in the training of shallow ReLU networks (with P. Cheridito and A. Jentzen), J. Optim. Theory Appl., vol 203(3), 2024 [journal, arXiv, bib]
- Landscape analysis for shallow neural networks: complete classification of critical points for affine target functions (with P. Cheridito and A. Jentzen), J. Nonlinear Sci., vol 32(5), 2022 [journal, arXiv, bib]
- A proof of convergence for gradient descent in the training of artificial neural networks for constant target functions (with P. Cheridito, A. Jentzen, and A. Riekert), J. Complexity, vol 72, 2022 [journal, arXiv, bib]
- Non-convergence of stochastic gradient descent in the training of deep neural networks (with P. Cheridito and A. Jentzen), J. Complexity, vol 64, 2021 [journal, arXiv, bib]
- Efficient approximation of high-dimensional functions with neural networks (with P. Cheridito and A. Jentzen), IEEE Trans. Neural Netw. Learn. Syst., vol 33(7), 2022 [journal, arXiv, bib]
Preprints
- Echoes of the past: A unified perspective on fading memory and echo states (with J-P. Ortega), 2025 [arXiv, bib]
- Stochastic dynamics learning with state-space systems (with J-P. Ortega), 2025 [arXiv, bib]
- Efficient Sobolev approximation of linear parabolic PDEs in high dimensions (with P. Cheridito), 2023 [arXiv, bib]
Miscellaneous
- PhD thesis: The curse of dimensionality and gradient-based training of neural networks: shrinking the gap between theory and applications (2023) [link, bib]
- An exercise in combinatorics: Christmas Stars (2021) [pdf]
- MSc thesis: Magnetic and Exotic Anosov Hamiltonian Structures (2019)
Teaching
At ETH Zurich, I served as a teaching assistant or coordinator for a variety of courses, listed below.
Courses
- Fall 2022: coordinator for Mathematics I (D-BIOL/CHAB/HEST)
- Spring 2022: coordinator for Probability and Statistics (D-MATH)
- Spring 2021: coordinator for Mathematics II (D-BIOL/CHAB/HEST)
- Fall 2019: co-organizer of an undergraduate seminar on machine learning (D-MATH)
- Fall 2018: teaching assistant for Analysis I (D-MATH)
- Fall 2017: teaching assistant for Algorithms and Complexity (D-INFK)
- Spring 2017: teaching assistant for Topology (D-MATH)
- Fall 2016: teaching assistant for Algorithms and Complexity (D-INFK)
Resume
Employment
- 2023 - 2025: Postdoctoral Fellow, NTU Singapore
- 2019 - 2023: Research Assistant, ETH Zurich
Education
- 2019 - 2023: PhD Mathematics, ETH Zurich
- 2017 - 2019: MSc Mathematics, ETH Zurich
- 2014 - 2018: BSc Mathematics, ETH Zurich
Conference Talks
- Quantitative Finance Conference 2025; Track: Advances in Quantitative Finance and Econometrics; NUS, Singapore
- DEDS2025 (Differential Equations for Data Science); Kyoto, Japan
- SciCADE2024 (International Conference on Scientific Computation and Differential Equations); Minisymposium on Geometric and Multiscale Methods for High-Dimensional Dynamics; NUS, Singapore