
We study optimization in its broad sense: from the mathematical foundations of optimization algorithms, to the development of new methods for solving optimization problems (with and without machine learning), to new frameworks for solving machine learning problems.


Faster PET Reconstruction by Stochastic Optimisation

Researchers: Matthias Ehrhardt and Carola-Bibiane Schönlieb



Geometric Integration Methods for Optimisation

Researchers: Erlend Riis and Carola-Bibiane Schönlieb



Mathematical challenges in electron tomography

Researchers: Willem Diepeveen, Rob Tovey, Tatiana Bubba, Martin Benning, Carola-Bibiane Schönlieb, Ozan Öktem, and C. E. Yarman



Equivariant Neural Networks for Inverse Problems

Researchers: Ferdia Sherry, Christian Etmann, Matthias Ehrhardt, Elena Celledoni, Brynjulf Owren, and Carola-Bibiane Schönlieb



Plug-and-Play Proximal Algorithm for Inverse Imaging Problems

Researchers: Angelica Aviles-Rivero, Jingwei Liang, and Carola-Bibiane Schönlieb



Provably Convergent Plug-and-Play Quasi-Newton Methods

Researcher: Hong Ye Tan



Noise-Free Sampling Algorithms via Regularized Wasserstein Proximals

Researcher: Hong Ye Tan



Accelerated, Stochastic and Equivariant Learned Mirror Descent

Researcher: Hong Ye Tan



Deeply Learned Spectral Total Variation Decomposition

Researcher: Tamara Großmann