Changbin Li

About Me

👋 Hello! I'm Changbin Li. I obtained my Ph.D. in Computer Science (Intelligent Systems) at the University of Texas at Dallas, where I was fortunate to work with Professor Feng Chen.

๐Ÿ” My research focuses on low-resource data-efficient learning, hyperparameter optimization, secure learning, and efficient and robust learning in AI systems. I'm also passionate about generative AI (large language models (LLMs), and diffusion models), Multimodal Foundation Models and probabilistic graphic models.

🚀 I am currently seeking a full-time role (Research Scientist/Engineer, Applied Scientist, Postdoc, etc.) in 2024.

Selected Publications (all)

Hyper Evidential Deep Learning to Quantify Composite Classification Uncertainty

Changbin Li, Kangshuo Li, Yuzhe Ou, Lance Kaplan, Audun Jøsang, Jin-Hee Cho, Dong-Hyun Jeong, and Feng Chen.

International Conference on Learning Representations (ICLR), 2024

PLATINUM: Semi-Supervised Model Agnostic Meta-Learning using Submodular Mutual Information

Changbin Li*, Suraj Kothawade*, Feng Chen, Rishabh Iyer. (* equal contribution)

International Conference on Machine Learning (ICML), 2022

A Nested Bi-Level Optimization for Robust Few Shot Learning

Krishnateja Killamsetty*, Changbin Li*, Chen Zhao, Rishabh Iyer, Feng Chen. (* equal contribution)

AAAI Conference on Artificial Intelligence (AAAI), 2022

Professional Experience

Sunnyvale, CA
  • Worked on knowledge transfer within foundation models, improving the knowledge transfer ratio from 5% to 27%, and subsequently to 40%.

Sunnyvale, CA
  • Developed and optimized efficient training and inference methods, such as parameter-efficient fine-tuning and sparsity techniques, for multimodal foundation models like CLIP.
  • Enhanced zero-shot out-of-distribution (OOD) detection, evaluated on seven in-distribution (ID) datasets and four OOD datasets.

Menlo Park, CA
  • Worked with cross-functional teams (AI Platform, AI Infra, and Ads Ranking) to enhance ads recommendation/ranking systems.
  • Applied pre-trained models in novel ways within Recurring Transfer Learning.
  • Researched and implemented Table Fusion, an advanced multimodal Recurring Transfer Learning framework.

Irving, TX
  • Collaborated with scientists to apply machine learning approaches to sequential power forecasting.
  • Refined gradient boosting models (LightGBM, etc.) for time-series quantile regression, forecasting load, wind, and solar power in the energy market.
  • Achieved performance comparable to established models.
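The quantile-regression setup above can be sketched briefly. This is a hypothetical, minimal example on synthetic data: it uses scikit-learn's GradientBoostingRegressor as a stand-in (LightGBM's `LGBMRegressor(objective="quantile", alpha=...)` follows the same pattern), and the lag features and toy "load" series are illustrative assumptions, not the production setup.

```python
# Minimal sketch: gradient-boosted quantile regression for power forecasting.
# Synthetic data and feature choices are illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Toy hourly "load" series: a daily cycle plus noise.
hours = np.arange(24 * 60)
load = 50 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 2, hours.size)

# Lag features: predict the next hour's load from the previous 24 hours.
lags = 24
X = np.stack([load[i : i + lags] for i in range(load.size - lags)])
y = load[lags:]
X_train, X_test = X[:-100], X[-100:]
y_train, y_test = y[:-100], y[-100:]

# Fitting one model per quantile yields a predictive interval,
# not just a point forecast.
quantiles = {}
for alpha in (0.1, 0.5, 0.9):
    model = GradientBoostingRegressor(loss="quantile", alpha=alpha,
                                      n_estimators=100)
    model.fit(X_train, y_train)
    quantiles[alpha] = model.predict(X_test)

# How often the true load falls inside the 10%-90% interval.
coverage = np.mean((quantiles[0.1] <= y_test) & (y_test <= quantiles[0.9]))
print(f"empirical 10-90% interval coverage: {coverage:.2f}")
```

Training separate models per quantile is the simplest route to prediction intervals with boosted trees; in energy markets the interval itself (e.g., the 10th-90th percentile band) is often the deliverable rather than a single point forecast.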

Beijing, China
  • Developed and deployed deep learning algorithms: AlexNet (on the Caffe framework) to recognize different webpage blocks with 83% accuracy, and LSTMs for part-of-speech tagging and sentiment analysis.

Honors & Awards

  • 2022 Travel Award, Conference on Neural Information Processing Systems (NeurIPS)
  • 2022 Participation Grant, International Conference on Machine Learning (ICML)
  • 2021 Outstanding Teaching Assistant Award, The University of Texas at Dallas
  • Qiuqi Graduate Scholarship (1/600), Beijing Jiaotong University
  • National Scholarship for Encouragement, Ministry of Education China
  • Excellent Student Award (5%), Beijing University of Chemical Technology
  • People's Scholarship, Beijing University of Chemical Technology

Teaching Experience

  • CS6375 Machine Learning (Spring'19, '21, Fall'18, '19, '21)
  • CS6364 Artificial Intelligence (Fall'20)
  • CS5343 Data Structures and Algorithms (Java) (Fall'20)

Academic Services

Program Committee (PC) member/Reviewer

Conference: ICLR '24-'25, ICML '22-'24, NeurIPS '21-'23, CVPR '24-'25, AAAI '23-'24, KDD '20-'24, SDM '22, AISTATS '21, UAI '21, AutoML '21

Journal: Transactions on Pattern Analysis and Machine Intelligence (TPAMI), Transactions on Knowledge Discovery from Data (TKDD), Big Data Research

Last updated on Feb 18, 2024