Nicola Branchini

PhD Student in Statistics at the University of Edinburgh working with Víctor Elvira

ELLIS Student working with Aki Vehtari at Aalto University

About

Hello! I am a PhD student in Statistics in the School of Mathematics at the University of Edinburgh, advised by Víctor Elvira. I also blog occasionally here.

I am also an ELLIS PhD student, working with and co-advised by Aki Vehtari at Aalto University.

My interests are broad, spanning computational statistics and statistical/probabilistic machine learning, with a focus on methodology. In my PhD, I have been developing Monte Carlo methodology.

A sample of some specific interests:

  • Importance sampling – Importance sampling (IS) underpins many computational methods whose goal is a consistent estimator of some quantity of interest. I'm interested in many aspects of IS, such as dimension scaling, good choices of proposal densities, robustness to inexact function evaluations, and practical diagnostics (a toy sketch follows this list).
  • Bayesian cross-validation – The Bayesian counterpart to classical cross-validation presents an interesting setup for IS and computational methods, and I'm interested in extensions and theory around Bayesian CV.
  • Heavy tails – Approximating distributions with heavy tails is challenging and still a very active research area. I'm also interested in using concepts from heavy tails to understand the properties of MC estimators.
  • …and adjacent ideas – I enjoy picking up new methods when they help answer interesting questions.
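
As a toy illustration of the kind of estimator these questions revolve around, here is a minimal sketch of ordinary and self-normalized importance sampling in one dimension. The target, proposal, and test function are arbitrary choices for illustration only, not taken from any of the work mentioned on this page.

```python
# Minimal sketch of importance sampling (IS) and self-normalized IS (SNIS).
# Toy setup (illustrative only): estimate E_p[X^2] = 1 for a standard normal
# target p, drawing from a heavier-tailed Student-t proposal q.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
p = stats.norm(0.0, 1.0)                 # target density
q = stats.t(df=5, loc=0.0, scale=1.5)    # proposal density (heavier tails than p)
f = lambda x: x**2                       # test function; true value E_p[f(X)] = 1

n = 10_000
x = q.rvs(size=n, random_state=rng)      # samples from the proposal
w = p.pdf(x) / q.pdf(x)                  # importance weights p(x)/q(x)

is_est = np.mean(w * f(x))               # ordinary IS (requires a normalized p)
snis_est = np.sum(w * f(x)) / np.sum(w)  # SNIS: only density ratios are needed

ess = np.sum(w)**2 / np.sum(w**2)        # a standard effective-sample-size diagnostic
print(f"IS: {is_est:.3f}, SNIS: {snis_est:.3f}, ESS: {ess:.0f} / {n}")
```

The effective sample size printed at the end is one simple diagnostic; sharper diagnostics and better proposal choices are among the questions listed above.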

Please do not hesitate to contact me to talk about research.

News
  • Very excited to soon be joining the University of Warwick as a ProbAI research fellow, where I will work with Gareth Roberts.
  • Three new papers in 2026: How to approximate inference with subtractive mixture models (AISTATS 2026), On the bias of variational resampling (AISTATS 2026), and Multimarginal Flow Matching with Adversarially Learnt Interpolants (ICLR 2026). See more details on the publications page.
  • Our conference paper Towards Adaptive Self-Normalized Importance Samplers was accepted at the Statistical Signal Processing Workshop (SSP) 2025.
Reviewing

Journals

Statistics and Computing, Transactions on Machine Learning Research, Statistics and Probability Letters

Conferences & Workshops

AISTATS 2023, AABI (workshop) 2023, NeurIPS 2023, ICLR 2024, AISTATS 2024, NeurIPS workshop on Bayesian decision-making and uncertainty 2024, AISTATS 2025

Nice quotes
Basically, I'm not interested in doing research and I never have been. I'm interested in understanding, which is quite a different thing. And often to understand something you have to work it out yourself because no one else has done it.
— David Blackwell
Getting numbers is easy; getting numbers you can trust is hard.
— Ron Kohavi, Diane Tang, Ya Xu (from "Trustworthy Online Controlled Experiments")