Michael Hu


I am a fourth-year PhD student at the NYU Center for Data Science, advised by Kyunghyun Cho and Tal Linzen. I am supported by the NSF Graduate Research Fellowship.

I study the science of training language models, with a focus on data.

Lately, I've been thinking about self-improving models and automating ML research.
Relevant: [Neural Neural Scaling Laws].

Previously, I completed a BSE at Princeton CS, where I spent two lovely years working with Karthik Narasimhan and Tom Griffiths. I then spent two years at Yobi AI as its first employee. In my spare time, I enjoy cooking, running, and playing basketball.

News

Feb 2026 New paper: Neural Neural Scaling Laws.
Sep 2025 Gave a talk on data mixing at Genesis Molecular AI.
Aug 2025 Gave talks on Aioli and pre-pretraining at Harvard ML Foundations and the MIT Language & Intelligence Lab.
Jul 2025 Pre-pretraining won an Outstanding Paper Award at ACL 2025! 🏅
Jul 2025 New paper: Scaling Laws Are Unreliable for Downstream Tasks.
Spr 2025 Gave talks on pre-pretraining at École Normale Supérieure CoML, FLaNN, Ryco Lab Reading Group, and CDS Seminar.