Changbin Li

About Me

👋 I'm a Ph.D. candidate with years of hands-on experience in AI and ML. My expertise spans machine learning, deep learning, and computer vision.

🔍 My research focuses on low-resource, data-efficient learning, hyperparameter optimization, secure learning, and efficient learning in AI systems. I'm also passionate about generative AI (large language models (LLMs) and diffusion models), multimodal foundation models, and probabilistic graphical models.

🎓 Currently pursuing my Ph.D. at the University of Texas at Dallas under Prof. Feng Chen's guidance.

🚀 I will be a Research Scientist Intern at Meta AI this summer, and I am on the lookout for a full-time role (Research Scientist/Engineer, Postdoc, etc.) in 2024.

Selected Publications (all)

Hyper Evidential Deep Learning to Quantify Composite Classification Uncertainty

Changbin Li, Kangshuo Li, Yuzhe Ou, Lance Kaplan, Audun Jøsang, Jin-hee Cho, Dong-hyun Jeong and Feng Chen.

International Conference on Learning Representations (ICLR), 2024

PLATINUM: Semi-Supervised Model Agnostic Meta-Learning using Submodular Mutual Information

Changbin Li*, Suraj Kothawade*, Feng Chen, Rishabh Iyer. (* equal contribution)

International Conference on Machine Learning (ICML), 2022

A Nested Bi-Level Optimization for Robust Few Shot Learning

Krishnateja Killamsetty*, Changbin Li*, Chen Zhao, Rishabh Iyer, Feng Chen. (* equal contribution)

AAAI Conference on Artificial Intelligence (AAAI), 2022

Professional Experience

Sunnyvale, CA
  • Developed and optimized efficient training and inference methods, such as parameter-efficient fine-tuning and sparsity techniques, for multimodal foundation models like CLIP.
  • Enhanced zero-shot out-of-distribution (OOD) detection, evaluated on seven in-distribution (ID) and four OOD datasets.
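The zero-shot OOD detection mentioned above can be illustrated with a minimal sketch of an MCM-style score: an image is flagged as OOD-like when its embedding matches none of the ID class-name prompts strongly. Note that `mcm_ood_score` and all of the data below are illustrative assumptions (random vectors stand in for CLIP embeddings), not the actual internship method.

```python
import numpy as np

def mcm_ood_score(image_emb, class_text_embs, temperature=1.0):
    """MCM-style zero-shot OOD score: negative max softmax over cosine
    similarities between an image embedding and the text embeddings of
    the ID class prompts. Higher score => more OOD-like."""
    img = image_emb / np.linalg.norm(image_emb)
    txt = class_text_embs / np.linalg.norm(class_text_embs, axis=1, keepdims=True)
    sims = (txt @ img) / temperature          # cosine similarity per ID class
    probs = np.exp(sims - sims.max())
    probs /= probs.sum()                      # softmax over ID classes
    return -probs.max()

# Toy setup: random 512-d vectors stand in for CLIP embeddings of
# seven ID class prompts (mirroring the seven ID datasets above).
rng = np.random.default_rng(0)
class_embs = rng.normal(size=(7, 512))
id_image = class_embs[0] + 0.1 * rng.normal(size=512)   # close to a known class
ood_image = rng.normal(size=512)                        # unrelated to all classes

id_score = mcm_ood_score(id_image, class_embs)
ood_score = mcm_ood_score(ood_image, class_embs)
print(id_score, ood_score)  # the ID image should score lower (less OOD-like)
```

The design choice here is that no OOD training data is needed: the score is computed purely from similarities to ID class names, which is what makes the approach zero-shot.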

Menlo Park, CA
  • Worked with cross-functional teams (AI Platform, AI Infra, and Ads Ranking) to enhance ads recommendation and ranking systems.
  • Applied pre-trained models innovatively in Recurring Transfer Learning.
  • Researched and implemented an advanced multimodal Recurring Transfer Learning framework, Table Fusion.

Irving, TX
  • Collaborated with scientists on applying machine learning approaches to sequential power forecasting.
  • Refined gradient boosting models (LightGBM, etc.) for time-series quantile regression, forecasting load, wind, and solar power in the energy market.
  • Achieved performance comparable to established models.
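The quantile-regression forecasting above boils down to minimizing the pinball (quantile) loss, which is the objective LightGBM optimizes with `objective='quantile'`. A minimal sketch, with a toy load-forecast example (all numbers illustrative):

```python
import numpy as np

def pinball_loss(y_true, y_pred, quantile):
    """Pinball (quantile) loss: per unit of error, under-prediction is
    penalized by `quantile` and over-prediction by `1 - quantile`, so
    minimizing it yields the conditional quantile rather than the mean."""
    diff = np.asarray(y_true) - np.asarray(y_pred)
    return float(np.mean(np.maximum(quantile * diff, (quantile - 1) * diff)))

# Toy example: a 0.9-quantile load forecast should sit above most
# observations, because under-predicting costs 9x more than over-predicting.
y_true = np.array([100.0, 110.0, 120.0])
low = pinball_loss(y_true, y_true - 10, quantile=0.9)   # under-predicts by 10
high = pinball_loss(y_true, y_true + 10, quantile=0.9)  # over-predicts by 10
print(low, high)  # 9.0 vs 1.0: asymmetry pushes the forecast upward
```

Fitting separate models at several quantiles (e.g. 0.1, 0.5, 0.9) yields the prediction intervals used for load, wind, and solar forecasts in energy markets.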

Beijing, China
  • Developed and deployed deep learning algorithms: AlexNet (on the Caffe framework) to recognize different webpage blocks (83% accuracy), and LSTMs for part-of-speech tagging and sentiment analysis.

Honors & Awards

  • 2022 Travel Award, Conference on Neural Information Processing Systems (NeurIPS)
  • 2022 Participation Grant, International Conference on Machine Learning (ICML)
  • 2021 Outstanding Teaching Assistant Award, The University of Texas at Dallas
  • Qiuqi Graduate Scholarship (1/600), Beijing Jiaotong University
  • National Scholarship for Encouragement, Ministry of Education China
  • Excellent Student Award (5%), Beijing University of Chemical Technology
  • People’s Scholarship, Beijing University of Chemical Technology

Teaching Experience

  • CS6375 Machine Learning (Spring’19, ’21, Fall’18, ’19, ’21)
  • CS6364 Artificial Intelligence (Fall’20)
  • CS5343 Data Structures and Algorithms (Java) (Fall’20)

Academic Services

Program Committee (PC) member/Reviewer

Conference: CVPR '24, ICLR '24, ICML '22 '24, NeurIPS '21 '23, AAAI '23-'24, KDD '20-'24, SDM '22, AISTATS '21, UAI '21, AutoML '21

Journal: Transactions on Pattern Analysis and Machine Intelligence (TPAMI), Transactions on Knowledge Discovery from Data (TKDD), Big Data Research

Last updated on Feb 08, 2024