Assistant Professor
Department of Electrical and Computer Engineering
Department of Computer Science and Engineering (Cooperating faculty)
University of California, Riverside
Google Scholar
GitHub
Email: yzhu@ucr.edu
Office: Winston Chung Hall, Room 431
I am an assistant professor in the Department of Electrical and Computer Engineering at the University of California, Riverside. I received my Ph.D. in Computer Sciences from the University of Wisconsin–Madison in 2023, where I was fortunate to be advised by Robert Nowak. In 2021, I spent a wonderful summer at Microsoft Research NYC, working with Paul Mineiro, Dylan Foster, and John Langford.
I work on interactive machine learning (e.g., active learning, bandits, and reinforcement learning), where my goal is to develop efficient human-in-the-loop learning algorithms and systems. Recently, I have become interested in connecting interactive machine learning with foundation models (e.g., large language models), from both algorithmic and systems perspectives. Specifically, some research topics I would like to explore further include:
AI safety and alignment (e.g., using reinforcement learning)
LLM-powered interactive learning agents and systems
Efficient training and inference for large models
Testing and evaluation systems for large models
Theoretical foundations of transformer / attention / interactive ML
\(\bigstar\) I am actively looking for Ph.D. students and visitors (in-person or remote) who are interested in the above-mentioned directions. Please (i) fill out this form and (ii) send me an email with your CV/transcripts if you are interested in working with me!
\(\bigstar\) Prof. Cong Liu and I are jointly hiring Ph.D. students who are generally interested in Machine Learning Systems. Please (i) fill out this form and (ii) send both of us an email with your CV/transcripts if you are interested in working on MLSys with us!
Infinite Action Contextual Bandits with Reusable Data Exhaust
Mark Rucker, Yinglun Zhu, and Paul Mineiro
International Conference on Machine Learning (ICML) 2023
Active Learning with Neural Networks: Insights from Nonparametric Statistics
Yinglun Zhu and Robert Nowak
Conference on Neural Information Processing Systems (NeurIPS) 2022
Efficient Active Learning with Abstention
Yinglun Zhu and Robert Nowak
Conference on Neural Information Processing Systems (NeurIPS) 2022
Contextual Bandits with Large Action Spaces: Made Practical
Yinglun Zhu, Dylan Foster, John Langford, and Paul Mineiro
International Conference on Machine Learning (ICML) 2022, [Code], [Spotlight talk, 6 min]
\(\bigstar\) Now available at the leading machine learning library Vowpal Wabbit (see here for instructions) and commercially incorporated into Microsoft Azure Personalizer!
Contextual Bandits with Smooth Regret: Computational Efficiency in Continuous Action Spaces
Yinglun Zhu and Paul Mineiro
International Conference on Machine Learning (ICML) 2022, [Code]
\(\bigstar\) Selected for a full oral presentation (top 2.1%), [Oral talk, 17 min]
Near Instance Optimal Model Selection for Pure Exploration Linear Bandits
Yinglun Zhu, Julian Katz-Samuels, and Robert Nowak
International Conference on Artificial Intelligence and Statistics (AISTATS) 2022, [Code]
Pareto Optimal Model Selection in Linear Bandits
Yinglun Zhu and Robert Nowak
International Conference on Artificial Intelligence and Statistics (AISTATS) 2022, [Code]
Pure Exploration in Kernel and Neural Bandits
Yinglun Zhu\(^\star\), Dongruo Zhou\(^\star\), Ruoxi Jiang\(^\star\), Quanquan Gu, Rebecca Willett, and Robert Nowak
Conference on Neural Information Processing Systems (NeurIPS) 2021, [Code]
On Regret with Multiple Best Arms
Yinglun Zhu and Robert Nowak
Conference on Neural Information Processing Systems (NeurIPS) 2020, [Code]
Robust Outlier Arm Identification
Yinglun Zhu, Sumeet Katariya, and Robert Nowak
International Conference on Machine Learning (ICML) 2020, [Code]
Co-organizer: SILO Seminar at UW–Madison (2022–2023)
Conference Reviewer:
Conference on Neural Information Processing Systems (NeurIPS)
\(\bigstar\) Outstanding Reviewer Award in 2021
International Conference on Machine Learning (ICML)
International Conference on Learning Representations (ICLR)
International Conference on Artificial Intelligence and Statistics (AISTATS)
International Symposium on Information Theory (ISIT)
Journal Reviewer:
Journal of Machine Learning Research (JMLR)
Transactions on Machine Learning Research (TMLR)
Machine Learning Journal (MLJ)
Transactions on Pattern Analysis and Machine Intelligence (TPAMI)
Instructor, University of California, Riverside:
CS/ECE 228 Introduction to Deep Learning, Fall 2023
Teaching Assistant, University of Wisconsin–Madison:
CS/ECE 761 Mathematical Foundations of Machine Learning, Spring 2020 (Head TA), Spring 2022
CS/ECE/ME 532 Matrix Methods in Machine Learning, Fall 2019
CS 412 Introduction to Numerical Methods, Fall 2018
CS/MATH 513 Numerical Linear Algebra, Spring 2018