The Proceedings of the IEEE posted our review article today on Streaming PCA and Subspace Tracking with Missing Data. It was a great experience to work with Yuejie Chi and Yue Lu on this survey. You can also find a less pretty version on the arXiv.
At the inaugural Conference on Parsimony and Learning (CPAL), my group is presenting three works that have come out of a recent, exciting collaboration with UM Prof. Qing Qu and other colleagues on low-rank learning in deep networks. Prof. Qu’s prior work studying neural collapse in deep networks has opened many exciting directions for us to pursue! All three works study deep linear networks (DLNs), i.e., deep matrix factorization. In this setting (simplified from deep neural networks, which have nonlinear activations), we can prove several fundamental facts about the way DLNs learn from data when trained with gradient descent. Congratulations to SPADA members Soo Min Kwon, Can Yaras, and Peng Wang (all co-advised by Prof. Qu) on these publications!
Yaras, C., Wang, P., Hu, W., Zhu, Z., Balzano, L., & Qu, Q. (2023, December 1). Invariant Low-Dimensional Subspaces in Gradient Descent for Learning Deep Linear Networks. Conference on Parsimony and Learning (Recent Spotlight Track). https://openreview.net/forum?id=oSzCKf1I5N
Wang, P., Li, X., Yaras, C., Zhu, Z., Balzano, L., Hu, W., & Qu, Q. (2023, December 1). Understanding Hierarchical Representations in Deep Networks via Feature Compression and Discrimination. Conference on Parsimony and Learning (Recent Spotlight Track). https://openreview.net/forum?id=Ovuu8LpGZu
Kwon, S. M., Zhang, Z., Song, D., Balzano, L., & Qu, Q. (2023, December 1). Efficient Low-Dimensional Compression of Overparameterized Networks. Conference on Parsimony and Learning (Recent Spotlight Track). https://openreview.net/forum?id=1AVb9oEdK7
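To make the deep linear network setting concrete, here is a minimal NumPy sketch (illustrative only, not code from the papers above): a depth-3 DLN whose end-to-end map is the matrix product W3 @ W2 @ W1, trained by plain gradient descent to fit a low-rank target matrix. All variable names and the near-identity initialization are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 10, 2
# Rank-r ground-truth matrix M = U V^T that the network should recover
M = rng.standard_normal((d, r)) @ rng.standard_normal((r, d))

depth, lr, steps = 3, 0.01, 2000
# Scaled-identity initialization (a common choice in deep matrix
# factorization analyses; illustrative here, not the papers' exact setup)
Ws = [0.5 * np.eye(d) for _ in range(depth)]

def end_to_end(factors):
    """Return the product W_L ... W_2 W_1 of the layer matrices."""
    P = np.eye(d)
    for W in factors:
        P = W @ P
    return P

for _ in range(steps):
    G = end_to_end(Ws) - M  # gradient of 0.5 * ||P - M||_F^2 w.r.t. P
    # Chain rule: grad w.r.t. W_i is (W_L...W_{i+1})^T G (W_{i-1}...W_1)^T
    for i in range(depth):
        before = end_to_end(Ws[:i])      # product of layers below W_i
        after = end_to_end(Ws[i + 1:])   # product of layers above W_i
        Ws[i] = Ws[i] - lr * after.T @ G @ before.T

err = np.linalg.norm(end_to_end(Ws) - M)  # final reconstruction error
```

Even in this toy run, gradient descent drives the end-to-end product toward the low-rank target, which is the kind of training trajectory the three papers above analyze (invariant low-dimensional subspaces, hierarchical feature compression, and low-dimensional compression of overparameterized models).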
Last fall and winter, SPADA PhD students Kyle Gilman and Zhe Du graduated. Kyle’s thesis was titled “Scalable Algorithms Using Optimization on Orthogonal Matrix Manifolds,” and he continues to make fundamental contributions to modern optimization problems. He is currently an Applied AI/ML Senior Associate at JPMorgan Chase. Zhe’s thesis was titled “Learning, Control, and Reduction for Markov Jump Systems,” with lots of interesting work at the intersection of machine learning and control. He is currently a postdoctoral researcher working with Samet Oymak and Fabio Pasqualetti. I am excited to follow their work into the future as they make an impact in optimization, machine learning, and control!
I am honored to have received an MLK Spirit Award from the Michigan College of Engineering. These awards are given to university members who exemplify the leadership and vision of Reverend Dr. Martin Luther King, Jr. through their commitment to social justice, diversity, equity, and inclusion. That commitment is a very high priority for me, so I am grateful that others have felt the impact of my actions. https://ece.engin.umich.edu/stories/laura-balzano-receives-2023-mlk-spirit-award