
Ian Li, Zilei Shao, Benjie Wang, Rose Yu, Guy Van den Broeck, Anji Liu
Under review. 2026
We propose Coupled Discrete Diffusion (CoDD), a hybrid framework that breaks this barrier by replacing the fully-factorized output distribution with a lightweight, tractable probabilistic inference layer.

Gwen Yidou-Weng, Ian Li, Anji Liu, Oliver Broadrick, Guy Van den Broeck, Benjie Wang
ArXiv 2025
We propose Learning to Look Ahead (LTLA), a hybrid approach that pairs the same base language model for rich prefix encoding with a fixed tractable surrogate model that computes exact continuation probabilities.

Ian Li, Philip Chen, Max Huang, Andrew Park, Loris D'Antoni, Rose Yu
FoRLM @ NeurIPS 2025 · ArXiv 2025
We introduce Activation State Machine (ASM), a lightweight dynamic steering mechanism that learns the latent dynamics of ideal reasoning trajectories and applies context-aware interventions at inference time.