About Me

I am a tenure-track Assistant Professor in the Department of Computer Science at the University of Virginia (UVA). Before joining UVA in 2024, I earned my Ph.D. from the University of Illinois Urbana-Champaign (UIUC), where I was advised by Jiawei Han. During my Ph.D., I also spent time as a visiting researcher with the Princeton NLP Group, working with Danqi Chen.

I am looking for self-motivated Ph.D. students and interns! If you are interested in working with me, please fill out this form; after completing it, you are also welcome to reach out via email. I read every submitted form and email, but I apologize in advance that I may not be able to respond to each one!

Research

My research is dedicated to developing more capable, efficient, and aligned Large Language Models (LLMs). I work across the entire LLM lifecycle, including training paradigms, data and inference efficiency, and the foundations of representation learning.

Post-Training: Aligning and Enhancing LLMs

My recent work designs better post-training algorithms to improve reasoning, factuality, preference alignment, and model-based evaluation.

Efficiency: Overcoming Data and Inference Bottlenecks

My research addresses critical bottlenecks in data efficiency and inference efficiency, from synthetic data generation to faster decoding.

Foundations of Representation Learning

My work investigates the core principles of representation learning, uncovers limitations in language model representations, and proposes novel pre-training objectives to build more robust and capable foundation models.

News

Education

Ph.D. (2023) Computer Science, University of Illinois Urbana-Champaign
Thesis: Efficient and Effective Learning of Text Representations
Award: ACM SIGKDD 2024 Dissertation Award
M.S. (2019) Computer Science, University of Illinois Urbana-Champaign
Thesis: Weakly-Supervised Text Classification
B.S. (2017) Computer Engineering, University of Illinois Urbana-Champaign
Graduated with Highest Honors & Bronze Tablet

Contact