Productive spring

This spring I’ve had a couple of opportunities to share work with Ravi Sastry Ganti and Rebecca Willett on “Matrix completion under monotonic single index model observations,” which we presented at NIPS last December. In a week I’ll be at AISTATS with my student Dejiao Zhang presenting our work on “Global convergence of a Grassmannian gradient descent algorithm for subspace learning”: we now have a proof that the GROUSE subspace learning algorithm converges globally from any random initialization to the global minimizer, or to a ball around the generating subspace in the case of noisy data. And last but never least, I had a great group of graduate students in my Estimation, Detection, and Filtering class this year; they did some excellent projects on data science and statistical inference and performed very well in the class overall.
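For readers curious what a GROUSE-style update looks like, here is a minimal NumPy sketch of one Grassmannian gradient step, written from the standard published description of the algorithm rather than from our paper's code. It assumes a greedy step size (the rotation angle that exactly fits the current vector) and fully or partially observed vectors indexed by `omega`; the function and variable names are my own for illustration.

```python
import numpy as np

def grouse_step(U, v, omega):
    """One GROUSE update of an orthonormal basis U (n x d) from a vector v
    observed on the index set omega, using the greedy step size."""
    # Least-squares weights for the observed entries of v against U's rows.
    w, *_ = np.linalg.lstsq(U[omega], v[omega], rcond=None)
    p = U @ w                       # current prediction of v in range(U)
    r = np.zeros_like(v)
    r[omega] = v[omega] - p[omega]  # residual, supported on the observed entries
    rn, pn, wn = np.linalg.norm(r), np.linalg.norm(p), np.linalg.norm(w)
    if rn < 1e-12 or pn < 1e-12 or wn < 1e-12:
        return U                    # v is already (numerically) fit; no rotation
    # Greedy rotation angle: move along the geodesic until v is fit exactly.
    theta = np.arctan(rn / pn)
    # Rank-one geodesic update on the Grassmannian (preserves orthonormality).
    return U + np.outer((np.cos(theta) - 1.0) * p / pn
                        + np.sin(theta) * r / rn, w / wn)
```

Running this step repeatedly on random vectors drawn from a fixed subspace drives `U` toward that subspace, which is the fully observed special case of the global convergence result mentioned above.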
