Three papers online

My students and I have posted three very exciting papers in the last few months!

“Towards a Theoretical Analysis of PCA for Heteroscedastic Data” with David Hong and Jeff Fessler studies the behavior of PCA when data points have different noise variances. We published this work at the Allerton conference.
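As a quick illustration of the setting (not the paper's analysis), here is a sketch in which samples share a one-dimensional latent subspace but fall into two groups with different noise levels; the dimensions, noise variances, and recovery metric are all assumptions chosen for the demo.

```python
import numpy as np

# Illustrative heteroscedastic setup: all samples lie near the same
# 1-D subspace, but the second half of the samples is much noisier.
rng = np.random.default_rng(0)
d, n = 50, 1000
u = np.zeros(d)
u[0] = 1.0                              # true subspace direction
z = rng.standard_normal(n)              # latent coefficients
sigmas = np.where(np.arange(n) < n // 2, 0.1, 1.0)  # two noise levels
X = np.outer(u, z) + rng.standard_normal((d, n)) * sigmas

# PCA via the SVD of the data matrix: the top left singular vector
# is the leading principal component.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
u_hat = U[:, 0]
err = 1 - abs(u_hat @ u)                # 0 when the direction is exact
print(f"subspace recovery error: {err:.4f}")
```

Rerunning this with the high-noise group made noisier (or larger) degrades the estimate, which is exactly the regime where a heteroscedasticity-aware analysis matters.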

“Convergence of a Grassmannian Gradient Descent Algorithm for Subspace Estimation From Undersampled Data” with Dejiao Zhang proves expected convergence rates for incremental gradient subspace estimation constrained to the Grassmannian. Despite this being a non-convex problem, our results show locally linear convergence when the observed vectors either have missing entries or are compressively sampled.
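To make the missing-data setting concrete, here is a hedged sketch of a GROUSE-style incremental gradient step on the Grassmannian: each incoming vector is observed only on a random index set, and the subspace estimate moves along a geodesic determined by the residual. The step-size rule, problem sizes, and iteration count are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def grouse_step(U, v, omega, eta=0.1):
    """One rank-one geodesic update of an orthonormal basis U (d x k)
    from a vector v observed only on the index set omega."""
    U_om = U[omega, :]
    w, *_ = np.linalg.lstsq(U_om, v[omega], rcond=None)  # LS weights
    p = U @ w                                # prediction in the subspace
    r = np.zeros(U.shape[0])
    r[omega] = v[omega] - U_om @ w           # residual on observed entries
    p_norm = np.linalg.norm(p)
    r_norm = np.linalg.norm(r)
    w_norm = np.linalg.norm(w)
    if r_norm < 1e-12 or w_norm < 1e-12:
        return U                             # nothing to correct
    t = eta * r_norm * p_norm                # illustrative step size
    step = (np.cos(t) - 1) * p / p_norm + np.sin(t) * r / r_norm
    return U + np.outer(step, w / w_norm)    # geodesic step on Grassmannian

# Tiny demo: estimate a planted 2-D subspace from 50%-observed vectors.
rng = np.random.default_rng(0)
d, k = 40, 2
U_true, _ = np.linalg.qr(rng.standard_normal((d, k)))
U_hat, _ = np.linalg.qr(rng.standard_normal((d, k)))
for _ in range(2000):
    v = U_true @ rng.standard_normal(k)
    omega = rng.choice(d, size=d // 2, replace=False)
    U_hat = grouse_step(U_hat, v, omega, eta=0.2)

# Error: component of the true basis outside the estimated span.
Q, _ = np.linalg.qr(U_hat)
err = np.linalg.norm(U_true - Q @ (Q.T @ U_true))
print(f"subspace error: {err:.4f}")
```

Because the geodesic step shrinks with the residual, the error decays geometrically once the estimate is close, which is the locally linear behavior the paper analyzes.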

“Leveraging Union of Subspace Structure to Improve Constrained Clustering” with John Lipor demonstrates how the assumption of subspace structure can dramatically help active clustering algorithms. By querying users intelligently for pairwise clustering constraints, we drive clustering error on several benchmark datasets down to nearly zero.