
Physics-AI Fellow
Publications
Probing the latent hierarchical structure of data via diffusion models* – Journal of Statistical Mechanics: Theory and Experiment (2025) 2025, 084005 (doi: 10.1088/1742-5468/aded6c)
How compositional generalization and creativity improve as diffusion models are trained – Proceedings of the 42nd International Conference on Machine Learning, PMLR 267 (2025)
A phase transition in diffusion models reveals the hierarchical nature of data – Proceedings of the National Academy of Sciences (2025) 122, e2408799121 (doi: 10.1073/pnas.2408799121)
Computational complexity of deep learning: fundamental limitations and empirical phenomena – Journal of Statistical Mechanics: Theory and Experiment (2024) 2024, 104008 (doi: 10.1088/1742-5468/ad3a5b)
What can be learnt with wide convolutional neural networks? – Journal of Statistical Mechanics: Theory and Experiment (2024) 2024, 104020 (doi: 10.1088/1742-5468/ad65df)
How Deep Neural Networks Learn Compositional Data: The Random Hierarchy Model – Physical Review X (2024) 14, 031001 (doi: 10.1103/PhysRevX.14.031001)
Relative stability toward diffeomorphisms indicates performance in deep nets* – Journal of Statistical Mechanics: Theory and Experiment (2022) 2022, 114013 (doi: 10.1088/1742-5468/ac98ac)
Locality defeats the curse of dimensionality in convolutional teacher–student scenarios* – Journal of Statistical Mechanics: Theory and Experiment (2022) 2022, 114012 (doi: 10.1088/1742-5468/ac98ab)
*This article is an updated version of: Favero A, Cagnetta F and Wyart M 2021 Locality defeats the curse of dimensionality in convolutional teacher–student scenarios, Advances in Neural Information Processing Systems vol 34, ed M Ranzato, A Beygelzimer, Y Dauphin, P S Liang and J Wortman Vaughan (New York: Curran Associates) pp 9456–67.