David Holzmüller

Dr. rer. nat.

Research Associate
Institut für Stochastik und Anwendungen
Lehrstuhl für Stochastik

Research focus: the training behavior of neural networks and their potential applications in simulation contexts.

Google Scholar

My Twitter account

David Holzmüller, Viktor Zaverkin, Johannes Kästner, and Ingo Steinwart, A Framework and Benchmark for Deep Batch Active Learning for Regression, Journal of Machine Learning Research, 2023.

Moritz Haas*, David Holzmüller*, Ulrike von Luxburg, and Ingo Steinwart, Mind the spikes: Benign overfitting of kernels and neural networks in fixed dimension, 2023.

David Holzmüller and Francis Bach, Convergence Rates for Non-Log-Concave Sampling and Log-Partition Estimation, 2023.

Viktor Zaverkin, David Holzmüller, Luca Bonfirraro, and Johannes Kästner, Transfer learning for chemically accurate interatomic neural network potentials, Phys. Chem. Chem. Phys., 25, 5383-5396, 2023.

Viktor Zaverkin, David Holzmüller, Ingo Steinwart, and Johannes Kästner, Exploring chemical and conformational spaces by batch mode deep active learning, Digital Discovery, 2022.

Viktor Zaverkin, David Holzmüller, Robin Schuldt, and Johannes Kästner, Predicting properties of periodic systems from cluster data: A case study of liquid water, J. Chem. Phys. 156, 114103, 2022.

David Holzmüller and Ingo Steinwart, Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent, Journal of Machine Learning Research, 2022.

David Holzmüller and Dirk Pflüger, Fast Sparse Grid Operations Using the Unidirectional Principle: A Generalized and Unified Framework, 2021. In: Bungartz, HJ., Garcke, J., Pflüger, D. (eds) Sparse Grids and Applications - Munich 2018. Lecture Notes in Computational Science and Engineering, vol 144. Springer, Cham.

Viktor Zaverkin*, David Holzmüller*, Ingo Steinwart, and Johannes Kästner, Fast and Sample-Efficient Interatomic Neural Network Potentials for Molecules and Materials Based on Gaussian Moments, J. Chem. Theory Comput. 17, 6658–6670, 2021.

David Holzmüller, On the Universality of the Double Descent Peak in Ridgeless Regression, International Conference on Learning Representations, 2021.

Daniel F. B. Haeufle, Isabell Wochner, David Holzmüller, Danny Driess, Michael Günther, and Syn Schmitt, Muscles Reduce Neuronal Information Load: Quantification of Control Effort in Biological vs. Robotic Pointing and Walking, 2020.

David Holzmüller, Improved Approximation Schemes for the Restricted Shortest Path Problem, 2017.

David Holzmüller, Efficient Neighbor-Finding on Space-Filling Curves, 2017.

Faculty portrait of the month, February 2020
