5th-year Ph.D. candidate in EECS at the University of Michigan, Ann Arbor. My research centers on building solid theoretical foundations and designing efficient learning procedures for complex tasks, such as managing label noise, learning from aggregated data, neural stochastic differential equations, and flow-based generative models. I will join Meta in September 2025 as a Machine Learning Research Scientist.
Publications:
- Clayton Scott and Jianxin Zhang. Learning from Label Proportions: A Mutual Contamination Framework. Conference on Neural Information Processing Systems (NeurIPS 2020).
- Jianxin Zhang, Yutong Wang, and Clayton Scott. Learning from Label Proportions by Learning with Label Noise. Conference on Neural Information Processing Systems (NeurIPS 2022).
- Jianxin Zhang and Clayton Scott. Label Embedding via Low-Coherence Matrices. arXiv preprint, 2023. https://arxiv.org/abs/2305.19470.
- Yilun Zhu, Jianxin Zhang, Aditya Gangrade, and Clayton Scott. Label Noise: Ignorance Is Bliss. Conference on Neural Information Processing Systems (NeurIPS 2024).
- Jianxin Zhang, Josh Viktorov, Doosan Jung, and Emily Pitler. Efficient Neural SDE by Matching Finite Dimensional Distributions. International Conference on Learning Representations (ICLR 2025).