Akhil Premkumar

Physics × Generative AI

I'm a postdoc in the Department of Applied Physics at Yale. My work focuses on applying information theory and statistical mechanics to generative AI algorithms, in particular, diffusion models.

I am broadly interested in importing ideas from physics into machine learning, and in cross-pollinating ideas across disciplines more generally. This approach stems from my core belief that the universe does not self-factorize into distinct academic disciplines. Nature is economical in its creativity; the same structural motifs often reappear in problems that, at first glance, seem unrelated.

I earned my PhD in Theoretical Physics at the University of California San Diego under the nurturing guidance of Daniel Green. After that, I spent three eventful years at the University of Chicago under Austin Joyce. It was at Chicago that I became interested in diffusion models. I also benefited from the mentorship of Lorenzo Orecchia during this time.

Contact

  • Affiliation: Yale University
  • Title: Postdoctoral Associate
  • Email: akhil[dot]prem[at]yale.edu
  • GitHub · Scholar · LinkedIn · Twitter

  • Recent Talk
    Information Theory x Machine Learning

Activity Log

December 2, 2025 — NeurIPS 2025: Traveling to San Diego to attend NeurIPS 2025 and present my spotlight paper. [Link]
October 15, 2025 — Radboud Universiteit: Gave a long talk on 'Information Theory x Machine Learning' to the Generative Memory Lab at Radboud Universiteit. [Video]
September 10, 2025 — CUNY: Gave a 30-minute talk on 'Neural Entropy' at the AI + Physics workshop at the City College of New York. [Video]
September 1, 2025 — Moved to Yale: Started work at Yale as a Postdoctoral Associate in the Department of Applied Physics!
July 1, 2025 — Jump Trading: Gave a 1-hour talk on 'Information Theory x Machine Learning' at Jump Trading, New York.
July 1, 2025 — Yale: Gave a 1-hour talk on 'Information Theory x Machine Learning' at the Institute for Foundations of Data Science at Yale. [Event]
November 25, 2024 — MIT: Gave a 1.5-hour talk on 'Neural Entropy' at the Learning on Graphs and Geometry reading group at MIT. [Link]
October 23, 2024 — Caltech: Gave a 1-hour talk on 'An Entropic View of Machine Learning' at the Information, Geometry, and Physics seminar at Caltech.