Research Themes

Dynamical Measure Transport

Dynamical measure transport, the construction of continuous-in-time maps between probability distributions, has become a central topic in contemporary large-scale generative modeling. Part of my work focuses on fleshing out the mathematics of this topic in relation to generative modeling, with special emphasis on how it helps us build performant, controllable tools. We have devised new algorithms, most notably what is known as flow matching and, more broadly, a unifying framework connecting flows and diffusions that we call stochastic interpolants. We are now making progress on completing the picture: deriving the equations and algorithms that make these models more efficient, but also controllable via principled approaches to fine-tuning. This new direction relates these dynamical systems to reinforcement learning algorithms. Along the way, we have been extending the theory of measure transport to arbitrary spaces, e.g. between distributions defined on a finite (discrete) state space, with promising implications for more efficient language modeling and new notions of reasoning.
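To make the idea concrete, here is a minimal toy sketch (not our actual research code): for a linear interpolant between two 1-D Gaussians, the transporting velocity field is known in closed form, and integrating the resulting probability-flow ODE carries source samples onto the target distribution. All parameter values below are illustrative.

```python
import numpy as np

# Toy illustration of dynamical measure transport: carry samples from
# N(0, 1) to N(m, s^2) by integrating the probability-flow ODE of the
# linear interpolant x_t = (1 - t) x0 + t x1. For Gaussian endpoints,
# the marginal velocity field v(x, t) = E[x1 - x0 | x_t = x] is closed-form.
m, s = 2.0, 0.5  # target mean and standard deviation (illustrative)

def velocity(x, t):
    var_t = (1 - t) ** 2 + (t * s) ** 2          # Var(x_t) along the interpolant
    return m + (t * s ** 2 - (1 - t)) / var_t * (x - t * m)

rng = np.random.default_rng(0)
x = rng.standard_normal(20_000)                  # samples from the source N(0, 1)

n_steps = 1_000
dt = 1.0 / n_steps
for k in range(n_steps):                         # forward Euler on dx/dt = v(x, t)
    x = x + velocity(x, k * dt) * dt

final_mean, final_std = x.mean(), x.std()        # should land near (m, s)
```

In a real generative model the velocity field is not available in closed form; flow matching learns it by regressing a neural network onto the conditional velocity x1 - x0 along interpolant samples.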

Sampling and Probabilistic Inference

Algorithmic advances in generative modeling have largely depended on access to an abundance of samples from the target distributions we want to model. In the case of language or image generation, we learn from the endless stream of textual and visual data available to us. In the sciences, however, the opposite circumstance is often encountered: data is scarce and hard to obtain, but inductive structure in a learning problem is available from theory. Another branch of our work is to figure out how to learn a generative model for problems in the sciences where little to no data is available, but an energy function defining a molecule or a physical theory is. How do we adapt our learning algorithms to effectively exploit this information? The same mathematics also plays an important role in probabilistic inference, Monte Carlo methods, and inference-time adaptation of generative models.
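As a point of reference for what "sampling from an energy function" means, here is a classical baseline, unadjusted Langevin dynamics, on a toy quadratic energy (a Gaussian target, so the answer is checkable). The energy, step size, and chain counts are illustrative assumptions for the sketch, not our group's actual methods.

```python
import numpy as np

# Sampling from a distribution known only through an energy function U(x),
# i.e. p(x) proportional to exp(-U(x)), with unadjusted Langevin dynamics:
#   x_{k+1} = x_k - h * grad_U(x_k) + sqrt(2h) * noise_k
# A toy quadratic energy is used here so the target moments are checkable.
mu, sigma2 = 1.0, 0.25                 # target N(mu, sigma2) (illustrative)

def grad_U(x):
    return (x - mu) / sigma2           # gradient of U(x) = (x - mu)^2 / (2 sigma2)

rng = np.random.default_rng(1)
h = 0.01                               # step size; smaller h -> smaller bias
x = np.zeros(5_000)                    # many independent parallel chains
for _ in range(2_000):
    x = x - h * grad_U(x) + np.sqrt(2 * h) * rng.standard_normal(x.shape)

sample_mean, sample_var = x.mean(), x.var()   # should land near (mu, sigma2)
```

Much of the research question is how to do better than such chains when the energy landscape is rugged and high-dimensional, e.g. by combining learned transport with Monte Carlo.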

Biology?

Let's see...

Want to join our research group?

I am seeking highly motivated graduate students and postdocs to join my group starting in Fall 2026. We will work on principled algorithms that make generative models scalable, understandable, and useful across the sciences and beyond.

Importantly, we aim to be interdisciplinary and very collaborative, learning a lot from each other!

mathematicians → computer scientists → wet lab scientists → poets → ... woo!

Nobody is considered too far afield or behind, but you have to want to learn from and teach your peers and me.

The joy of navigating these unprecedented research frontiers is doing it together!

Graduate Students

Please apply at this link and write to me that you are applying.

Postdocs

Please reach out to me directly and consider applying to the Kempner Institute Research Fellowship.

Reach out at either email address, detailing your interests.