Saifuddin Syed

Department of Statistics

University of Oxford


I am a Florence Nightingale Bicentennial Fellow in computational statistics and machine learning at the University of Oxford’s Department of Statistics, where I am also a member of the Algorithms and Inference Working Group for the Next Generation Event Horizon Telescope (ngEHT). Prior to this role, I completed a postdoc under the guidance of Arnaud Doucet and a PhD under the supervision of Alexandre Bouchard-Côté. My doctoral thesis won the Pierre Robillard Award from the Statistical Society of Canada (SSC), the Cecil Graham Doctoral Dissertation Award from the Canadian Applied and Industrial Mathematics Society (CAIMS), and the Savage Award (Honourable Mention) for theory and methods from the International Society for Bayesian Analysis (ISBA).

My research involves designing scalable and robust algorithms for Bayesian inference with scientific applications in mind. If you have a cool, computationally challenging problem, reach out!

Interests

  • Annealing
  • Parallel Tempering
  • Markov Chain Monte Carlo
  • Sequential Monte Carlo
  • Scalable Bayesian inference
  • AI for Science

Education

  • PhD in Statistics, 2022

    University of British Columbia

  • MSc in Mathematics, 2016

    University of British Columbia

  • BMath in Pure & Applied Mathematics, 2014

    University of Waterloo


Many state-of-the-art algorithms in statistics and machine learning utilize a technique called annealing, which makes inferences about an intractable target problem by incrementally deforming solutions from a tractable reference problem. I am interested in using annealing as a tool to understand the interplay between MCMC, SMC, variational inference, diffusion models, normalizing flows, and optimal transport.
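To give a flavour of the idea, here is a minimal, self-contained sketch of annealing via annealed importance sampling: a geometric path of distributions interpolates between a tractable reference (a standard normal) and a toy "intractable" bimodal target, with a random-walk Metropolis move at each intermediate temperature. All names and choices below (the target, the number of temperatures, the proposal scale) are illustrative, not taken from any particular paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_ref(x):
    # Tractable reference: standard normal (unnormalized log-density)
    return -0.5 * x**2

def log_target(x):
    # Toy "intractable" target: bimodal mixture with modes at +/- 3 (unnormalized)
    return np.logaddexp(-0.5 * (x - 3)**2, -0.5 * (x + 3)**2)

def log_path(x, beta):
    # Geometric annealing path: pi_beta ∝ pi_0^(1-beta) * pi_1^beta
    return (1 - beta) * log_ref(x) + beta * log_target(x)

def annealed_importance_sampling(n_particles=2000, n_temps=50):
    betas = np.linspace(0.0, 1.0, n_temps)
    x = rng.standard_normal(n_particles)  # exact draws from the reference
    log_w = np.zeros(n_particles)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Importance-weight update for moving from pi_{b_prev} to pi_b
        log_w += log_path(x, b) - log_path(x, b_prev)
        # One random-walk Metropolis move leaving pi_b invariant
        prop = x + rng.standard_normal(n_particles)
        log_acc = log_path(prop, b) - log_path(x, b)
        accept = np.log(rng.random(n_particles)) < log_acc
        x = np.where(accept, prop, x)
    return x, log_w
```

Averaging the weights estimates the ratio of normalizing constants: here the reference integrates to sqrt(2*pi) and the mixture to 2*sqrt(2*pi), so `logsumexp(log_w) - log(n_particles)` should be close to log 2. The same deform-from-tractable-to-intractable structure underlies SMC samplers and parallel tempering, where the intermediate distributions are explored in parallel rather than sequentially.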

(2023). Pigeons.jl: Distributed sampling from intractable distributions. arXiv preprint.

(2023). Local Exchangeability. Bernoulli.

(2023). A Unified Framework for U-Net Design and Analysis. arXiv preprint.

(2022). Parallel tempering with a variational reference. Conference on Neural Information Processing Systems.

(2021). Non-reversible parallel tempering: a scalable highly parallel MCMC scheme. Journal of the Royal Statistical Society (Series B).

(2021). Parallel tempering on optimized paths. International Conference on Machine Learning.


  • +44 7467 304999
  • 24-29 St Giles', Oxford OX1 3LB, United Kingdom
  • Office 1.18