Research: As a researcher in machine learning, I focus on developing scalable methods for modeling, analyzing, and generating complex, high-dimensional data. My interests span multiple areas, including generative modeling, representation learning, probabilistic inference, AI safety, and AI for science. My ultimate goal is to address problems of wide-ranging significance, develop methods that are both accessible and effective, and build intelligent systems that improve human lives.
Previously: I received my Ph.D. in Computer Science from Stanford University, advised by Stefano Ermon. I was a research intern at Google Brain, Uber ATG, and Microsoft Research. I obtained my Bachelor’s degree in Mathematics and Physics from Tsinghua University, where I worked with Jun Zhu, Raquel Urtasun, and Richard Zemel.
- Aug 30, 2022: We are organizing a workshop on score-based methods at NeurIPS 2022. Check out our website for details!
- Jul 30, 2022: I will be joining the Department of Electrical Engineering (EE) and the Department of Computing + Mathematical Sciences (CMS) at the California Institute of Technology as an Assistant Professor starting in January 2024. I am currently seeking self-motivated Ph.D. students and postdoctoral fellows to join my research group in Fall 2023. Candidates with backgrounds in CS, EE, mathematics, physics, or statistics are especially encouraged to apply. If you are interested in working with me as a Ph.D. student, please apply to Caltech CMS/EE and mention my name in your application. If you are interested in a postdoc position, please contact me directly.
Selected publications [full list]
(*) denotes equal contribution
- ICLR: Solving Inverse Problems in Medical Imaging with Score-Based Generative Models. The 10th International Conference on Learning Representations, 2022. Abridged in the NeurIPS 2021 Workshop on Deep Learning and Inverse Problems.
- NeurIPS (Spotlight): Maximum Likelihood Training of Score-Based Diffusion Models. The 35th Conference on Neural Information Processing Systems, 2021. Spotlight presentation [top 3%].
- ICML: Accelerating Feedforward Computation via Parallel Nonlinear Equation Solving. The 38th International Conference on Machine Learning, 2021.
- ICLR (Oral): Score-Based Generative Modeling through Stochastic Differential Equations. The 9th International Conference on Learning Representations, 2021. Outstanding Paper Award.
- NeurIPS: Improved Techniques for Training Score-Based Generative Models. The 34th Conference on Neural Information Processing Systems, 2020.
- AISTATS: Permutation Invariant Graph Generation via Score-Based Generative Modeling. The 23rd International Conference on Artificial Intelligence and Statistics, 2020.
- NeurIPS (Oral): Generative Modeling by Estimating Gradients of the Data Distribution. The 33rd Conference on Neural Information Processing Systems, 2019. Oral presentation [top 0.5%].
- UAI (Oral): Sliced Score Matching: A Scalable Approach to Density and Score Estimation. The 35th Conference on Uncertainty in Artificial Intelligence, 2019. Oral presentation [top 8.7%].
- NeurIPS: Constructing Unrestricted Adversarial Examples with Generative Models. The 32nd Conference on Neural Information Processing Systems, 2018.
- ICLR: PixelDefend: Leveraging Generative Models to Understand and Defend against Adversarial Examples. The 6th International Conference on Learning Representations, 2018.