I am a first-year PhD student advised by Prof. Rose Yu and Prof. Loris D'Antoni. My research interests lie broadly at the intersection of machine learning and formal methods. Recently, I have been focusing on controllable generation with Large Language Models, with applications including (but not limited to) AI4Science and code generation.
Previously, I worked with Prof. Guy Van den Broeck at UCLA on leveraging Tractable Probabilistic Models for controllable generation.

Ian Li, Zilei Shao, Benjie Wang, Rose Yu, Guy Van den Broeck, Anji Liu
Under review. 2026
We propose Coupled Discrete Diffusion (CoDD), a hybrid framework that replaces the fully-factorized output distribution with a lightweight, tractable probabilistic inference layer.

Gwen Yidou-Weng, Ian Li, Anji Liu, Oliver Broadrick, Guy Van den Broeck, Benjie Wang
ArXiv 2025
We propose Learning to Look Ahead (LTLA), a hybrid approach that pairs the same base language model for rich prefix encoding with a fixed tractable surrogate model that computes exact continuation probabilities.

Ian Li, Philip Chen, Max Huang, Andrew Park, Loris D'Antoni, Rose Yu
FoRLM @ NeurIPS 2025 · ArXiv 2025
We introduce Activation State Machine (ASM), a lightweight dynamic steering mechanism that learns the latent dynamics of ideal reasoning trajectories and applies context-aware interventions at inference time.