
Assistant Professor 
 
Department of Electrical and Computer Engineering 
Department of Computer Science and Engineering (Cooperating) 
Department of Statistics (Cooperating) 
University of California, Riverside 
Google Scholar | GitHub
Email: yzhu@ucr.edu 
Office: Winston Chung Hall, Room 431
I am an assistant professor in the ECE department at UC Riverside.
My research focuses on machine learning, reinforcement learning, and foundation models, with an emphasis on developing efficient and reliable learning algorithms and systems for large-scale, multimodal problems. My current interests include:
Interactive learning: leveraging foundation models to design efficient and adaptive algorithms and agents for interactive environments.
Test-time training and scaling: enhancing the performance and robustness of pretrained models during inference, especially on novel or unseen tasks.
Previously, I received my Ph.D. in Computer Sciences from the University of Wisconsin–Madison, where I was advised by Robert Nowak. During my Ph.D., I worked on theoretical foundations of interactive machine learning, including active learning and contextual bandits. I now aim to incorporate these algorithmic insights into the design of practical AI systems — see examples below.
Publications

Test-Time Matching: Unlocking Compositional Reasoning in Multimodal Models
Yinglun Zhu, Jiancheng Zhang, and Fuzhi Tang 
Preprint (under review) 2025, [Blog] [Code]
Online Finetuning Decision Transformers with Pure RL Gradients 
Junkai Luo and Yinglun Zhu 
Preprint (under review) 2025
Strategic Scaling of Test-Time Compute: A Bandit Learning Approach 
Bowen Zuo and Yinglun Zhu 
Preprint (under review) 2025
Towards Multimodal Active Learning: Efficient Learning with Limited Data Pairs 
Jiancheng Zhang and Yinglun Zhu 
Preprint (under review) 2025
LeMix: Unified Scheduling for LLM Training and Inference on Multi-GPU Systems 
Yufei Li, Zexin Li, Yinglun Zhu, and Cong Liu
Real-Time Systems Symposium (RTSS) 2025 
★ Outstanding Paper Award
Efficient Sequential Decision Making with Large Language Models 
Dingyang Chen, Qi Zhang, and Yinglun Zhu 
Conference on Empirical Methods in Natural Language Processing (EMNLP) 2024, [Code]
Active Learning with Neural Networks: Insights from Nonparametric Statistics 
Yinglun Zhu and Robert Nowak 
Conference on Neural Information Processing Systems (NeurIPS) 2022
Efficient Active Learning with Abstention 
Yinglun Zhu and Robert Nowak 
Conference on Neural Information Processing Systems (NeurIPS) 2022
Contextual Bandits with Large Action Spaces: Made Practical 
Yinglun Zhu, Dylan Foster, John Langford, and Paul Mineiro 
International Conference on Machine Learning (ICML) 2022, [Code] 
★ Incorporated into the leading machine learning library Vowpal Wabbit; see here for instructions
Contextual Bandits with Smooth Regret: Computational Efficiency in Continuous Action Spaces 
Yinglun Zhu and Paul Mineiro 
International Conference on Machine Learning (ICML) 2022, [Code] 
★ Full Oral Presentation (top 2.1%) 
Near Instance Optimal Model Selection for Pure Exploration Linear Bandits 
Yinglun Zhu, Julian Katz-Samuels, and Robert Nowak 
International Conference on Artificial Intelligence and Statistics (AISTATS) 2022, [Code]
Pareto Optimal Model Selection in Linear Bandits
Yinglun Zhu and Robert Nowak 
International Conference on Artificial Intelligence and Statistics (AISTATS) 2022, [Code]
Pure Exploration in Kernel and Neural Bandits
Yinglun Zhu*, Dongruo Zhou*, Ruoxi Jiang*, Quanquan Gu, Rebecca Willett, and Robert Nowak
Conference on Neural Information Processing Systems (NeurIPS) 2021, [Code]
On Regret with Multiple Best Arms
Yinglun Zhu and Robert Nowak 
Conference on Neural Information Processing Systems (NeurIPS) 2020, [Code]
Robust Outlier Arm Identification
Yinglun Zhu, Sumeet Katariya, and Robert Nowak 
International Conference on Machine Learning (ICML) 2020, [Code]
Interactive Machine Learning: From Theory to Scale 
Yinglun Zhu 
Ph.D. Dissertation, University of Wisconsin–Madison, 2023
Ph.D. Students: Jiancheng Zhang (Fall 2024–present), Bowen Zuo (Fall 2024–present)
Prospective Students: If you are interested in joining my lab, please apply to the ECE or CSE Ph.D. program at UC Riverside and list me as a potential advisor.
Teaching

Spring 2025: EE 260 Large Models and Advances in AI
Winter 2025: EE 114 Probability, Random Variables, and Random Processes in Electrical Engineering
Fall 2024: EE/CS 228 Introduction to Deep Learning
Spring 2024: EE 260 Large Models and Advances in AI
Fall 2023: EE/CS 228 Introduction to Deep Learning