Nicola Branchini

PhD Student in Statistics at the University of Edinburgh working with Víctor Elvira

ELLIS Student working with Aki Vehtari at Aalto University

📄 Resume
About

Hello! I am a PhD student in Statistics in the School of Mathematics at the University of Edinburgh, advised by Víctor Elvira. I also blog occasionally here.

I am also an ELLIS PhD student, co-supervised by Aki Vehtari at Aalto University.

My research interests are broad, spanning computational statistics and statistical/probabilistic machine learning, with an emphasis on methodology. For my PhD, I have been developing Monte Carlo (MC) methodology, with a particular focus on importance sampling (IS).

IS can be viewed as recasting MC integration as an optimization problem over probability densities. This perspective is relevant in many applications beyond the obvious one of Bayesian computation.
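As a minimal illustration of this viewpoint (just the standard textbook identity, nothing specific to my own work): for a target density $\pi$, a test function $f$, and any proposal density $q$ with $q(x) > 0$ wherever $f(x)\pi(x) \neq 0$,

$$
\int f(x)\,\pi(x)\,\mathrm{d}x
= \int \frac{f(x)\,\pi(x)}{q(x)}\, q(x)\,\mathrm{d}x
\;\approx\; \frac{1}{N}\sum_{n=1}^{N} \frac{\pi(x_n)}{q(x_n)}\, f(x_n),
\qquad x_n \sim q,
$$

and the variance of this estimator is minimized by the classical choice $q^{\star}(x) \propto |f(x)|\,\pi(x)$. Choosing a good $q$ is therefore an optimization over probability densities, which is exactly the sense in which MC integration becomes optimization.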

I like collaborating with people. Feel free to drop me an email (and to ping me again if I do not reply).

News
  • Very excited to be joining the University of Warwick soon as a ProbAI research fellow, where I will work with Gareth Roberts.
  • Three new papers in 2026: How to approximate inference with subtractive mixture models (AISTATS 2026), On the bias of variational resampling (AISTATS 2026), and Multimarginal Flow Matching with Adversarially Learnt Interpolants (ICLR 2026). See more details on the publications page.
  • Our conference paper Towards Adaptive Self-Normalized Importance Samplers is accepted at the Statistical Signal Processing Workshop (SSP), 2025.
Professional Activities

Reviewing

Journals

Statistics and Computing, Statistics and Probability Letters

Conferences

AISTATS 2023, AABI (workshop) 2023, NeurIPS 2023, ICLR 2024, AISTATS 2024, NeurIPS workshop on Bayesian decision making and uncertainty 2024, AISTATS 2025

Talks & Posters

  • Contributed talk, "Generalized Self-Normalized Importance Sampling", 14th International Conference on Monte Carlo Methods and Applications (MCM), 2023.
  • Poster, "Generalized Self-Normalized Importance Sampling", BayesComp 2023.
  • Poster, "Causal Entropy Optimisation", Greek Stochastics.
  • Poster, "Optimized Auxiliary Particle Filters: adapting mixture proposals via convex optimization", 5th Workshop on Sequential Monte Carlo Methods, May 2022.
  • Poster, "Optimized Auxiliary Particle Filters: adapting mixture proposals via convex optimization", "Bayes at CIRM" Winter School, Centre International de Rencontres Mathématiques, Marseille, October 2021.
  • Poster, "Optimized Auxiliary Particle Filters: adapting mixture proposals via convex optimization", 37th Conference on Uncertainty in Artificial Intelligence (UAI), online, 2021.
Awards

🏆 Feuer International Scholarship in Artificial Intelligence

Full funding (declined)

🎓 School of Mathematics Studentship

University of Edinburgh (full funding)

🥇 Dissertation Prize

Artificial Intelligence MSc, University of Edinburgh

⭐ Outstanding Dissertation Award

BSc Computer Science, University of Warwick

Inspiration
Basically, I'm not interested in doing research and I never have been. I'm interested in understanding, which is quite a different thing. And often to understand something you have to work it out yourself because no one else has done it
— David Blackwell
Getting numbers is easy; getting numbers you can trust is hard.
— Ron Kohavi, Diane Tang, Ya Xu (from "Trustworthy Online Controlled Experiments")
Background

Previously, I was a Research Assistant at the Alan Turing Institute, working within the Warwick Machine Learning Group and supervised by Prof. Theo Damoulas.

Before that, I was a Master's student in the School of Informatics at the University of Edinburgh, where I was supervised by Prof. Víctor Elvira, working on auxiliary particle filters.

As an undergraduate, I studied Computer Science at the University of Warwick, where I did my BSc dissertation on reproducing AlphaZero, supervised by Dr. Paolo Turrini.

Recommended Reading