Professor Anders C. Hansen


Anders C. Hansen leads the Applied Functional and Harmonic Analysis group in the Department of Applied Mathematics and Theoretical Physics (DAMTP), Faculty of Mathematics, University of Cambridge. He is Professor of Mathematics at the University of Cambridge, Professor of Mathematics at the University of Oslo, a Royal Society University Research Fellow, and a Fellow of Peterhouse. For further information, see Wikipedia.

Email: ach70@cam.ac.uk
Tel: +44 1223 760403
Office: F2.01

Resume

Research Interests

Functional Analysis, Artificial Intelligence, Foundations of Computational Mathematics, Solvability Complexity Index hierarchy, Generalised Hardness of Approximation, Optimisation, Inverse Problems, Medical Imaging, Operator/Spectral Theory, Numerical Analysis, Computational Harmonic Analysis, PDEs, Compressed Sensing, Mathematical Signal Processing, Sampling Theory, Geometric Integration, Operator Algebras

Selected Talks and Events

  1. Plenary speaker at 10th International Conference on Mathematical Methods for Curves and Surfaces  (June 26-28, 2024).
  2. Plenary speaker at Nordic Perspectives on Artificial Intelligence  (Oct. 12-13, 2023).
  3. Organizing the workshop Computational mathematics in computer assisted proofs  (Sept 12-16, 2022) together with Charles Fefferman and Svetlana Jitomirskaya.
  4. Plenary speaker at Thirty years of Acta Numerica  (26 June - 02 July 2022).
  5. Speaking at King's College London Mathematics Colloquium  (12 May, 2022).
  6. Organizing the workshop Interpretability, safety, and security in AI  (Dec 13-15, 2021) together with Rich Baraniuk , Miguel Rodrigues and Adrian Weller.
  7. Speaking (online) at the University of Chicago Mathematics Colloquium  (April 7, 2021).
  8. Speaking (online) at the Cambridge Science Festival  (March 29, 2021).
  9. Speaking at the University of Minnesota, Applied and Computational Math Colloquium  (Feb. 3 2020)
  10. Plenary speaker at the National Academy of Sciences, Arthur M. Sackler Colloquium: The Science of Deep Learning, Washington D.C. (March 2019).
  11. Plenary speaker at SPARS (2017).
  12. Plenary speaker at Structured Regularization for High-Dimensional Data Analysis, Institut Henri Poincaré (2017).
  13. Plenary speaker at Strobl16: Time-Frequency Analysis and Related Topics (2016).
  14. Plenary speaker at UCL-Duke Workshop on Sensing and Analysis of High-Dimensional Data (2014).

Prizes and Awards

1. PROSE Award Finalist 2022 - Computing & Information Science.

2. Whitehead Prize 2019.

3. 2018 IMA Prize in Mathematics and Applications.

4. Leverhulme Prize in Mathematics and Statistics 2017.

5. Royal Society University Research Fellow 2012.

News

1. SIAM News reports (front page) on our work in the May 2022 edition: Proving existence is not enough: Mathematical paradoxes unravel the limits of neural networks in AI.

2. IEEE Spectrum Magazine reports on our paper  "The difficulty of computing stable and accurate neural networks: On the barriers of deep learning and Smale's 18th problem." 

3. Proc. Natl. Acad. Sci. published our paper "The difficulty of computing stable and accurate neural networks: On the barriers of deep learning and Smale's 18th problem." Here is the announcement from Cambridge University News. Further press coverage here.

4. PROSE Award Finalist 2022 - Computing & Information Science for our book Compressive Imaging: Structure, Sampling, Learning (with B. Adcock) on Cambridge University Press.

5. SIAM News reports (front page) on our work from the paper "The mathematics of adversarial attacks in AI -- Why deep learning is unstable despite the existence of stable neural networks" in the October 2021 edition: Deep Learning: What Could Go Wrong?

6. SIAM News reports on our work on deep learning in scientific computing in the March 2021 edition: Deep Learning in Scientific Computing: Understanding the Instability Mystery.

7. Proc. Natl. Acad. Sci. published our paper  On instabilities of deep learning in image reconstruction and the potential costs of AI Here is some of the press coverage: Cambridge University News,   Physics World,   EurekAlert,   The Register,   Health Care Business,   Radiology Business,   Science Daily,   Psychology Today,   Government Computing,   Diagnostic Imaging,   News Medical,   Press Release Point,   Tech Xplore,   Aunt Minnie,   My Science,   Digit,   The Talking Machines,   MC.AI,   Rama on Healthcare,   News8PLus,   Genethique,   Healthcare in Europe,   AuntminnieEurope,   Newsbreak,   AI Development Hub,   FirstWord MedTech,   AI Daily.  

8. Our paper How to compute spectra with error control was featured on the cover of the June 2019 edition of Physical Review Letters.

9. The Sackler Colloquium "The Science of Deep Learning" at the US National Academy of Sciences. Watch the presentation "On instabilities in deep learning - Does AI come at a cost?"

10. SIAM News has our work on the Restricted Isometry Property in Levels in compressed sensing on the front page of the October edition: From Global to Local: Getting More from Compressed Sensing.

11. Siemens, using a modified MRI machine, validated in practice the asymptotic sparsity, asymptotic incoherence and high-resolution concepts introduced in our work (see Breaking the coherence barrier: A new theory for compressed sensing and also On asymptotic structure in compressed sensing). From their conclusion:

“[...] The image resolution has been greatly improved [...]. Current results practically demonstrated that it is possible to break the coherence barrier by increasing the spatial resolution in MR acquisitions. This likewise implies that the full potential of the compressed sensing is unleashed only if asymptotic sparsity and asymptotic incoherence is achieved.”

Their work Novel Sampling Strategies for Sparse MR Image Reconstruction was published in May 2014 in the Proceedings of the International Society for Magnetic Resonance in Medicine.

Students and Post-Docs

PhD Students: 1. Clarice Poon (graduated 2015), 2. Milana Gataric (graduated 2016), 3. Alexander Jones (graduated 2016), 4. Alexander Bastounis (graduated 2018), 5. Vegard Antun (graduated 2020), 6. Matt Colbrook (graduated 2020), 7. Laura Thesing (graduated 2022), 8. Simon Becker (graduated 2022), 9. Nina Gottschling (graduated 2023), 10. Paolo Campodonico (graduating 2024), 11. David Liu (graduating 2024), 12. Luca Gazdag (graduating 2024), 13. Johan Wind (graduating 2025), 14. Emil Haugen (graduating 2027), 15. George Coote (graduating 2027).

Post-docs: 1. Jonathan Ben-Artzi (2011-2014, PhD: Brown University), 2. Bogdan Roman (2013-2016, 2016-2019, PhD: University of Cambridge), 3. Priscilla Canizares (2015-2016, PhD: Autonomous University of Barcelona), 4. Milana Gataric (2015-2016, PhD: University of Cambridge), 5. Francesco Renna (2016-2018, PhD: University of Padova), 6. Alexander Bastounis (2019-2021, PhD: University of Cambridge), 7. Vegard Antun (2020-, PhD: University of Oslo), 8. Alexei Stepanenko (2022-2023, PhD: Cardiff University).

Teaching

NST Part IA Mathematical Methods I - Course A.

Part II Numerical Analysis.

Part III course on Compressed Sensing.

Editor

Proceedings of the Royal Society Series A (2014-2020)

Networks & Heterogeneous Media (2021- )

SIAM Journal on Imaging Sciences (2022- )

BIT Numerical Mathematics (2023- )

Books 

  1. B. Adcock, A. C. Hansen, Compressive Imaging: Structure, Sampling, Learning
    Cambridge University Press (2021)

Selected Papers 

  1. A. C. Hansen, On the Solvability Complexity Index, the n-Pseudospectrum and Approximations of Spectra of Operators,
    J. Amer. Math. Soc. 24, no. 1, 81-124
  2. J. Ben-Artzi, M. Colbrook, A. C. Hansen, O. Nevanlinna, M. Seidel, Computing spectra - On the Solvability Complexity Index hierarchy and towers of algorithms.
  3. A. Bastounis, A. C. Hansen, V. Vlacic, The extended Smale's 9th problem.
  4. V. Antun, F. Renna, C. Poon, B. Adcock, A. C. Hansen, On instabilities of deep learning in image reconstruction and the potential costs of AI,
    Proc. Natl. Acad. Sci. 2020, no. 5, 201907377
  5. M. Colbrook, V. Antun, A. C. Hansen, The difficulty of computing stable and accurate neural networks: On the barriers of deep learning and Smale's 18th problem,
    Proc. Natl. Acad. Sci. 2022 no. 119 (12) e2107151119
  6. B. Adcock, A. C. Hansen, C. Poon, B. Roman, Breaking the coherence barrier: A new theory for compressed sensing,
    Forum of Mathematics, Sigma 5(4):1-84
  7. M. Colbrook, A. C. Hansen, The foundations of spectral computations via the Solvability Complexity Index hierarchy,
    J. Eur. Math. Soc. 2022
  8. B. Adcock, A. C. Hansen, Generalized Sampling and Infinite Dimensional Compressed Sensing,
    Found. Comp. Math. 16, no. 5, 1263-1323
  9. M. Colbrook, B. Roman, A. C. Hansen, How to compute spectra with error control,
    Phys. Rev. Lett. 122, 250201 (front cover)

SIAM News  

  1. V. Antun, M. Colbrook, A. C. Hansen, Proving existence is not enough: Mathematical paradoxes unravel the limits of neural networks in AI,
    SIAM News, 55, no. 4, May 2022 (front cover)
  2. A. Bastounis, A. C. Hansen, D. Higham, I. Tyukin, V. Vlacic, Deep Learning: What Could Go Wrong?
    SIAM News, 54, no. 8, October 2021 (front cover)
  3. V. Antun, N. Gottschling, A. C. Hansen, B. Adcock, Deep Learning in Scientific Computing: Understanding the Instability Mystery,
    SIAM News, 54, no. 2, March 2021
  4. A. Bastounis, B. Adcock, A. C. Hansen, From Global to Local: Getting More from Compressed Sensing,
    SIAM News, 50, no. 8, October 2017 (front cover)

Papers in Chronological Order 

  1. A. Bastounis, F. Cucker, A. C. Hansen, When can you trust feature selection? -- I: A condition-based analysis of LASSO and generalised hardness of approximation.
  2. A. Bastounis, F. Cucker, A. C. Hansen, When can you trust feature selection? -- II: On the effects of random data on condition in statistics and optimisation.
  3. Z. Liu, A. C. Hansen, Do stable neural networks exist for classification problems? -- A new view on stability in AI.
  4. N. Gottschling, P. Campodonico, V. Antun, A. C. Hansen, On the existence of optimal multi-valued decoders and their accuracy bounds for undersampled inverse problems.
  5. J. S. Wind, V. Antun, A. C. Hansen, Implicit regularization in AI meets generalized hardness of approximation in optimization -- Sharp results for diagonal linear networks.
  6. J. Ben-Artzi, M. Colbrook, A. C. Hansen, O. Nevanlinna, M. Seidel, Computing spectra - On the Solvability Complexity Index hierarchy and towers of algorithms.
  7. A. Bastounis, A. C. Hansen, V. Vlacic, The mathematics of adversarial attacks in AI -- Why deep learning is unstable despite the existence of stable neural networks.
  8. A. Bastounis, A. C. Hansen, V. Vlacic, The extended Smale's 9th problem -- On computational barriers and paradoxes in estimation, regularisation, computer-assisted proofs and learning.
  9. S. Becker, A. C. Hansen, Computing solutions of Schrodinger equations on unbounded domains - On the brink of numerical algorithms.
  10. N. Gottschling, V. Antun, B. Adcock, A. C. Hansen, The troublesome kernel -- On hallucinations, no free lunches and the accuracy-stability trade-off in inverse problems,
    SIAM Review
  11. L. Gazdag, A. C. Hansen, Generalised hardness of approximation and the SCI hierarchy - On determining the boundaries of training algorithms in AI,
    Found. Comp. Math.
  12. L. Thesing, A. C. Hansen, Which neural networks can be computed by an algorithm? -- Generalised hardness of approximation meets Deep Learning,
    Proc. Appl. Math. Mech. 2022;22:1 e202200174
  13. V. Antun, M. Colbrook, A. C. Hansen, Proving existence is not enough: Mathematical paradoxes unravel the limits of neural networks in AI,
    SIAM News, 55, no. 4, May 2022
  14. V. Antun, M. Colbrook, A. C. Hansen, The difficulty of computing stable and accurate neural networks: On the barriers of deep learning and Smale's 18th problem,
    Proc. Natl. Acad. Sci. 2022 no. 119 (12) e2107151119
  15. T. Loss, M. Colbrook, A. C. Hansen, Stratified Sampling Based Compressed Sensing for Structured Signals,
    IEEE Trans. Signal Process., vol. 70, pp. 3530-3539 (2022)
  16. M. Colbrook, A. C. Hansen, The foundations of spectral computations via the Solvability Complexity Index hierarchy,
    J. Eur. Math. Soc. (2022)
  17. A. Bastounis, A. C. Hansen, D. Higham, I. Tyukin, V. Vlacic, Deep Learning: What Could Go Wrong?
    SIAM News, 54, no. 8, October 2021
  18. L. Thesing, V. Antun, A. C. Hansen, What do AI algorithms actually learn - On false structures in deep learning.
  19. V. Antun, N. Gottschling, A. C. Hansen, B. Adcock, Deep Learning in Scientific Computing: Understanding the Instability Mystery,
    SIAM News, 54, no. 2, March 2021
  20. L. Thesing, A. C. Hansen, Non-uniform recovery guarantees for binary measurements and infinite-dimensional compressed sensing,
    J. Fourier Anal. Appl. 27, 14 (2021)
  21. B. Adcock, V. Antun, A. C. Hansen, Uniform recovery in infinite-dimensional compressed sensing and applications to structured binary sampling,
    Appl. Comput. Harmon. Anal. 55:1-40 (2021)
  22. V. Antun, F. Renna, C. Poon, B. Adcock, A. C. Hansen, On instabilities of deep learning in image reconstruction and the potential costs of AI,
    Proc. Natl. Acad. Sci. 2020, no. 5, 201907377
  23. J. Schoormans, G. J. Strijkers, A. C. Hansen, A. J. Nederveen, B. F. Coolen, Compressed Sensing MRI with Variable Density Averaging (CS-VDA) Outperforms Full Sampling at Low SNR,
    Phys. Med. Biol. 65 045004 (2020)
  24. M. Colbrook, B. Roman, A. C. Hansen, How to compute spectra with error control,
    Phys. Rev. Lett. 122, 250201
  25. A. C. Hansen, B. Roman, Structure and Optimisation in Computational Harmonic Analysis: On Key Aspects in Sparse Regularisation,
    Springer Optimization and Its Applications vol. 168: 125-172 (2021)
  26. M. Colbrook, A. C. Hansen, On the Infinite-dimensional QR Algorithm,
    Numerische Mathematik 143:17-83 (2019)
  27. R. Calderbank, A. C. Hansen, L. Thesing, B. Roman, On reconstructions from measurements with binary functions,
    Applied and Numerical Harmonic Analysis, Birkhauser, 97-128 (2019)
  28. L. Thesing, A. C. Hansen, Linear reconstructions and the analysis of the stable sampling rate,
    Sampl. Theory Signal Image Process. 17:103-126 (2018)
  29. A. C. Hansen, L. Thesing, On the Stable Sampling rate for binary measurements and wavelet reconstruction,
    Appl. Comput. Harmon. Anal. 48(2): 630-654 (2020)
  30. A. Bastounis, B. Adcock, A. C. Hansen, From Global to Local: Getting More from Compressed Sensing,
    SIAM News, 50, no. 8, October 2017
  31. A. C. Hansen, L. Thesing, Sampling from binary measurements - On Reconstructions from Walsh coefficients,
    IEEE 2017 Int. Conf. on Samp. Theory and Appl. 256-260 (2017)
  32. A. Bastounis, A. C. Hansen, On the absence of uniform recovery in many real-world applications of compressed sensing and the RIP & nullspace property in levels,
    SIAM Jour. Imag. Scienc. 10(1):335-371
  33. B. Adcock, A. C. Hansen, C. Poon, B. Roman, Breaking the coherence barrier: A new theory for compressed sensing,
    Forum of Mathematics, Sigma 5(4):1-84
  34. A. C. Hansen, O. Nevanlinna, Complexity Issues in Computing Spectra, Pseudospectra and Resolvents,
    Banach Centre Pub. 112:171-194
  35. B. Adcock, M. Gataric, A. C. Hansen, Density theorems for nonuniform sampling of bandlimited functions using derivatives or bunched measurements,
    J. Fourier Anal. Appl. 23(6):1311-1347
  36. B. Adcock, A. C. Hansen, B. Roman, A note on compressed sensing of structured sparse wavelet coefficients from subsampled Fourier measurements,
    IEEE Signal Process. Lett. 23(5):732-736
  37. A. Jones, A. Tamtogl, I. Calvo-Almazan, A. C. Hansen, Continuous compressed sensing of inelastic and quasielastic Helium Atom Scattering spectra,
    Nature, Sci. Rep. 6, Art. num.: 27776
  38. A. Jones, A. Tamtogl, I. Calvo-Almazan, A. C. Hansen, Continuous compressed sensing of inelastic and quasielastic Helium Atom Scattering spectra (supplementary material),
    Nature, Sci. Rep. 6, Art. num.: 27776
  39. J. Ben-Artzi, A. C. Hansen, O. Nevanlinna, M. Seidel, New barriers in complexity theory: On the Solvability Complexity Index and towers of algorithms,
    C. R. Acad. Sci. Paris Sér. I Math. 353, no. 10, 931-936
  40. J. Ben-Artzi, A. C. Hansen, O. Nevanlinna, M. Seidel, The Solvability Complexity Index - Computer science and logic meet scientific computing.
  41. B. Adcock, M. Gataric, A. C. Hansen, Recovering piecewise smooth functions from nonuniform Fourier measurements,
    Springer Lect. Notes in Comp. Sci. and Eng. 2015
  42. A. Bastounis, A. C. Hansen, On random and deterministic compressed sensing and the Restricted Isometry Property in Levels,
    IEEE 2015 Int. Conf. on Samp. Theory and Appl.
  43. B. Adcock, A. C. Hansen, M. Gataric, Weighted frames of exponentials and stable recovery of multidimensional functions from nonuniform Fourier samples,
    Appl. Comput. Harmon. Anal. 42(3):508-535
  44. B. Adcock, M. Gataric, A. C. Hansen, Stable nonuniform sampling with weighted Fourier frames and recovery in arbitrary spaces,
    IEEE 2015 Int. Conf. on Samp. Theory and Appl.
  45. B. Adcock, A. C. Hansen, A. Jones, On asymptotic incoherence and its implications for compressed sensing for inverse problems,
    IEEE Trans. Inf. Theory 62, no. 2, 1020-1032
  46. B. Roman, B. Adcock, A. C. Hansen, On asymptotic structure in compressed sensing.
  47. B. Adcock, G. Kutyniok, A. C. Hansen, J. Ma, Linear Stable Sampling Rate: Optimality of 2D Wavelet Reconstructions from Fourier Measurements,
    SIAM J. Math. Anal. 47(2), 1196-1233
  48. B. Adcock, A. C. Hansen, Generalized Sampling and Infinite Dimensional Compressed Sensing,
    Found. Comp. Math. 16, no. 5, 1263-1323
  49. B. Adcock, A. C. Hansen, B. Roman, The quest for optimal sampling: computationally efficient, structure-exploiting measurements for compressed sensing,
    Springer, 2015
  50. B. Roman, A. Bastounis, B. Adcock, A. C. Hansen, On fundamentals of models and sampling in compressed sensing.
  51. A. Jones, B. Adcock, A. C. Hansen, Analyzing the structure of multidimensional compressed sensing problems through coherence.
  52. B. Adcock, M. Gataric, A. C. Hansen, On stable reconstructions from univariate nonuniform Fourier measurements,
    SIAM Jour. Imag. Scienc. 7(3):1690-1723
  53. B. Adcock, A. C. Hansen, B. Roman, G. Teschke, Generalized sampling: stable reconstructions, inverse problems and compressed sensing over the continuum,
    Adv. in Imag. and Electr. Phys. vol 182, 187-279, Elsevier, 2014
  54. B. Adcock, A. C. Hansen, A. Shadrin, A stability barrier for reconstructions from Fourier samples,
    SIAM Jour. on Num. Anal. 52, no. 1, 125-139
  55. B. Adcock, A. C. Hansen, C. Poon, B. Roman, Breaking the coherence barrier: asymptotic incoherence and asymptotic sparsity in compressed sensing,
    Proc. of the 10th Int. Conf. on Samp. Theory and Appl., 2013
  56. B. Adcock, A. C. Hansen, C. Poon, Optimal wavelet reconstructions from Fourier samples via generalized sampling,
    Proc. of the 10th Int. Conf. on Samp. Theory and Appl., 2013
  57. B. Adcock, A. C. Hansen, C. Poon, Beyond Consistent Reconstructions: Optimality and Sharp Bounds for Generalized Sampling, and Application to the Uniform Resampling Problem,
    SIAM J. Math. Anal. 45, no. 5, 3132-3167
  58. B. Adcock, A. C. Hansen, C. Poon, On optimal wavelet reconstructions from Fourier samples: linearity and universality of the stable sampling rate,
    Appl. Comput. Harmon. Anal. 36, no. 3, 387-415
  59. B. Adcock, A. C. Hansen, Generalized sampling and the stable and accurate reconstruction of piecewise analytic functions from their Fourier coefficients,
    Math. Comp. 84, 237-270
  60. B. Adcock, A. C. Hansen, E. Herrholz, G. Teschke, Generalized Sampling: Extensions to Frames and Inverse and Ill-Posed Problems,
    Inverse Prob. 29, no. 1, 015008
  61. B. Adcock, A. C. Hansen, Reduced Consistency Sampling in Hilbert Spaces,
    Proc. of the 9th Int. Conf. on Samp. Theory and Appl., 2011
  62. B. Adcock, A. C. Hansen, Stable reconstructions in Hilbert spaces and the resolution of the Gibbs phenomenon,
    Appl. Comput. Harmon. Anal. 32, no. 3, 357-388
  63. B. Adcock, A. C. Hansen, A Generalized Sampling Theorem for Stable Reconstructions in Arbitrary Bases,
    J. Fourier Anal. Appl. 18, no. 4, 685-716
  64. A. C. Hansen, A theoretical framework for backward error analysis on manifolds,
    J. Geom. Mech. 3, no. 1, 81-111
  65. A. C. Hansen, On the Solvability Complexity Index, the n-Pseudospectrum and Approximations of Spectra of Operators,
    J. Amer. Math. Soc. 24, no. 1, 81-124
  66. A. C. Hansen, J. Strain, On the order of deferred correction,
    Appl. Numer. Math. 61, no. 8, 961-973
  67. A. C. Hansen, Infinite dimensional numerical linear algebra; theory and applications,
    Proc. R. Soc. Lond. Ser. A. 466, no. 2124, 3539-3559
  68. A. C. Hansen, On the approximation of spectra of linear operators on Hilbert spaces,
    J. Funct. Anal. 254, no. 8, 2092-2126
  69. A. C. Hansen, J. Strain, Convergence theory for spectral deferred correction,
    Preprint, UC Berkeley

Previous Events

  1. Organizing the workshop Computational mathematics in computer assisted proofs  (Sept 12-16, 2022) together with Charles Fefferman and Svetlana Jitomirskaya.
  2. Plenary speaker at Thirty years of Acta Numerica  (26 June - 02 July 2022).
  3. Speaking at King's College London Mathematics Colloquium  (12 May, 2022).
  4. Speaking (online) at University of Oxford Data Science Seminar  (Feb. 21, 2022).
  5. Speaking (online) at IST  (Jan. 20, 2022).
  6. Organizing the workshop Interpretability, safety, and security in AI  (Dec 13-15, 2021) together with Rich Baraniuk , Miguel Rodrigues and Adrian Weller.
  7. Speaking (online) at the University of Leicester  (Oct. 14, 2021).
  8. Speaking at EPFL  (Sept. 21, 2021).
  9. Speaking (online) at the University of Chicago Mathematics Colloquium  (April 7, 2021).
  10. Speaking (online) at the Cambridge Science Festival  (March 29, 2021).
  11. Speaking (online) at the One World IMAGing and INvErse problems (IMAGINE) seminar  (Feb 17, 2021).
  12. Speaking (online) at the Gran Sasso Science Institute Mathematics Colloquium  (Jan 28, 2021).
  13. Speaking (online) at XAI: Explaining what goes on inside DNN/AI  (Oct 20, 2020).
  14. Speaking (online) at the Max Planck Institute of Molecular Cell Biology and Genetics  (Sept 24, 2020).
  15. Speaking (online) at the Mathematics of Machine Learning, LMS-Bath Symposium  (Aug 6, 2020). Watch the talk.
  16. Speaking (online) at the One World Seminar Series on the Mathematics of Machine Learning  (July 5, 2020). Watch the talk.
  17. Speaking at the University of Minnesota, Applied and Computational Math Colloquium  (Feb. 3 2020)
  18. Invited speaker at Computational Harmonic Analysis and Data Science, Banff International Research Station (Nov 2019).
  19. Speaking at EPFL, Imaging in the Age of Machine Learning (Oct 25, 2019)
  20. Speaking at the University of Pittsburgh, Algebra-combinatorics-geometry seminar (Sept 26, 2019)
  21. Invited speaker at Workshop on Harmonic analysis and Machine Learning (Sept 2019).
  22. Invited speaker at Algorithms and Complexity for Continuous Problems, Dagstuhl (Aug 2019).
  23. Plenary speaker at National Academy of Sciences, Arthur M. Sackler Colloquium: The Science of Deep Learning, Washington D.C. (March 2019).
  24. Speaking at Imperial College/University College London, Numerical Analysis Seminar (Feb. 20 2019)
  25. Invited speaker at Variational methods and optimization in imaging, Institut Henri Poincaré (Feb. 2019).
  26. Speaking at Imperial College, Pure Analysis Seminar (Jan. 10 2019).
  27. Invited speaker at Analysis and Computation in High Dimensions, Hausdorff Institute (Oct. 2018).
  28. Invited speaker at Measuring the Complexity of Computational Content: From Combinatorial Problems to Analysis, Dagstuhl (Sept. 2018).
  29. Invited speaker at the Algebraic and geometric aspects of numerical methods for differential equations, Mittag-Leffler Institute (July 5 2018)
  30. Invited speaker at Isaac Newton Institute (May 24 2018)
  31. Speaking at the University of Oslo (May 14-16 2018, slides).
  32. Speaking at the University of Manchester (May 4 2018).
  33. Invited speaker at  Banff Research Station (April 25 2018).
  34. Invited speaker at the Institut Henri Poincaré (Feb 12 2018).
  35. Organizing the program Approximation, sampling and compression in data science, Isaac Newton Institute (Jan-June 2019).
  36. Organizing the workshop Mathematics of data: Structured representations for sensing, approximation and learning, Alan Turing Institute (May 27-May 31, 2019).
  37. Speaking at LMU Munich (Jan 31, 2018).
  38. Organizing the workshop Inverse Problems Network Meeting 2, Isaac Newton Institute (Nov 23-Nov 24, 2017).
  39. Speaking at the University of Warwick (Nov 15, 2017).
  40. Invited speaker at Generative models, parameter learning and sparsity, Isaac Newton Institute (2017).
  41. Plenary speaker at the Fourteenth International Conference on Computability and Complexity in Analysis (2017).
  42. Plenary speaker at SPARS (2017).
  43. Plenary speaker at Structured Regularization for High-Dimensional Data Analysis, Institut Henri Poincaré (2017).
  44. Keynote speaker at FoCM: Approximation Theory Workshop (2017).
  45. Invited speaker at FoCM: Information-Based Complexity Workshop (2017).
  46. Invited speaker at Multiscale and High-Dimensional Problems, Oberwolfach (2017).
  47. Plenary speaker at The 14th International workshop on Quantum Chromodynamics (QCD) in extreme conditions (2016).
  48. Plenary speaker at Strobl16: Time-Frequency Analysis and Related Topics (2016).
  49. Plenary speaker at Computational and Analytic Problems in Spectral Theory (2016).
  50. Invited speaker at Low Complexity Models in Signal Processing, Hausdorff Institute (2016).
  51. Plenary speaker at The Bath/RAL Numerical Analysis Day (2015).
  52. Plenary speaker at UCL-Duke Workshop on Sensing and Analysis of High-Dimensional Data (2014).
  53. Plenary speaker at Pseudospectra of operators: spectral singularities, semiclassics, pencils and random matrices (2014).
  54. Invited speaker at FoCM: Real Number Complexity Workshop (2014).
  55. Plenary speaker at iTWIST'14 (2014).
  56. Plenary speaker at French-German Conference on Mathematical Image Analysis, Institut Henri Poincaré (2014).
  57. Invited speaker at The 5th International Conference on Computational Harmonic Analysis (2014).
  58. Invited speaker at Compressed sensing and its Applications (2013).
  59. Plenary speaker at Sparse Representation of Functions: Analytic and Computational Aspects (2012).
  60. Plenary speaker at Sparsity, Localization and Dictionary Learning (2012).

Thesis

A. C. Hansen, On the approximation of spectra of linear Hilbert space operators, PhD Thesis.

Student Awards

  1. Smith-Knight/Rayleigh-Knight Prize 2007, On the approximation of spectra and pseudospectra of linear operators on Hilbert spaces
  2. John Butcher Award 2007 (joint with T. Schmelzer (Oxford)), A theoretical framework for backward error analysis on manifolds.

  • © Applied Functional and Harmonic Analysis, Centre for Mathematical Sciences, Wilberforce Road, Cambridge CB3 0WA.
    Page last saved: 11th November 2019. Contact Us