Caltech

H.B. Keller Colloquium

Monday, May 4, 2026
4:00pm to 5:00pm
Annenberg 105
Low-Dimensional Generation
Nisha Chandramoorthy, Assistant Professor, Department of Statistics, The University of Chicago

In any generative model, the generated samples have a distribution different from the data distribution, due to inevitable learning errors. Moreover, both this discrepancy and metrics for evaluating the generated samples are hard to characterize in high dimensions, motivating the need to understand how learning errors affect the reproducibility of certain "features" of the generated distributions. A first question is whether generative models produce "physical" samples, i.e., samples whose support is close to that of the true target distribution, despite algorithmic errors. A second question concerns what we term a lazy generative model. Consider a noising process in which, given samples from the target, we apply an arbitrary random dynamical system such that the distribution at finite time is approximately Gaussian. In principle, this noising process cannot be exactly inverted to recover target samples; but if we probe the noised samples, we find that some statistics or observables are still recoverable. Which statistics are recoverable, and how do we exploit this recoverability to perform lazy generative modeling without any denoising?
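As a hypothetical illustration of the idea that some statistics survive noising (this specific process and all parameters are assumptions for the sketch, not the speaker's method), the snippet below drives a non-Gaussian target toward a Gaussian with a discrete Ornstein–Uhlenbeck recursion and then recovers the target mean directly from the noised samples, with no denoising of individual samples:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed noising process (illustrative only): the OU-type recursion
#   x_{k+1} = a * x_k + sqrt(1 - a^2) * xi_k,   xi_k ~ N(0, 1),
# drives any target distribution toward the standard Gaussian.
n, steps, a = 50_000, 200, 0.99

# Target: a skewed, clearly non-Gaussian 1-D distribution.
x0 = rng.exponential(scale=2.0, size=n)

x = x0.copy()
for _ in range(steps):
    x = a * x + np.sqrt(1.0 - a**2) * rng.standard_normal(n)

# The noising cannot be inverted sample by sample, but the target mean is
# a recoverable statistic: E[x_T] = a**steps * E[x_0], so inverting the
# known linear decay of the mean suffices.
recovered_mean = x.mean() / a**steps

print("recovered:", recovered_mean, "  target sample mean:", x0.mean())
```

The recovered mean agrees with the target's sample mean to within Monte Carlo error, even though the noised samples are close to Gaussian; higher moments decay faster under the same recursion and are correspondingly harder to recover.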
The first part is joint work with Adriaan de Clercq (UChicago) and the second with Georg Gottwald (U Sydney).

For more information, please contact Narin Seraydarian by phone at (626) 517-6580 or by email at [email protected].