MIDAS seminar and new results

Last Friday I gave the MIDAS weekly seminar. You can find the description here, along with a link directly to the recording. I talked about two recent problems I have been working on.

First, I discussed my work with Ravi Ganti and Rebecca Willett on learning a low-rank matrix that is observed through a monotonic function from partial measurements, a setting that is common in calibration and quantization problems. Follow-up work with Nikhil Rao and Rob Nowak generalized this to learning structured single index models.

Second, I talked about the work of my student David Hong, co-advised by Jeff Fessler, on the asymptotic performance of PCA with heteroscedastic data. This setting is common in problems like sensor networks or medical imaging, where different measurements of the same phenomenon are taken with sensing of different quality (e.g., high or low radiation). David has recently posted his paper on arXiv giving predictions of the asymptotic performance; exploiting the structure of these expressions, we also showed that asymptotic recovery for a fixed average noise variance is maximized when the noise variances are equal (i.e., when the noise is in fact homoscedastic). Average noise variance is often a practically convenient measure of the overall quality of data, but our results show that it gives an overly optimistic estimate of the performance of PCA on heteroscedastic data.
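
To get a feel for that last point, here is a small simulation sketch (my illustration, not the paper's analysis), assuming numpy: it plants a rank-one component, draws samples under homoscedastic and heteroscedastic noise with the same average variance, and compares how well the top PCA component aligns with the planted one.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 100                       # samples, ambient dimension
u = rng.standard_normal(d)
u /= np.linalg.norm(u)                # planted principal component

def mean_alignment(noise_vars, trials=20):
    """Average |<u, u_hat>| of the top PCA component over several draws
    of rank-one data with the given per-sample noise variances."""
    vals = []
    for _ in range(trials):
        scores = rng.standard_normal(n)
        noise = rng.standard_normal((n, d)) * np.sqrt(noise_vars)[:, None]
        X = np.outer(scores, u) + noise
        _, _, Vt = np.linalg.svd(X, full_matrices=False)
        vals.append(abs(Vt[0] @ u))   # 1.0 means perfect recovery
    return np.mean(vals)

avg_var = 1.0
homo = np.full(n, avg_var)                                      # equal variances
hetero = np.r_[np.full(n // 2, 0.1), np.full(n - n // 2, 1.9)]  # same average

print("homoscedastic alignment:  ", mean_alignment(homo))
print("heteroscedastic alignment:", mean_alignment(hetero))     # typically lower
```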

AAAI and Simons

This week I head to the Bay Area for two presentations: my collaborator Ravi Ganti will be presenting our work on estimating high-dimensional structured single index models at AAAI. The following week I’ll be at the Simons workshop on Interactive Learning, presenting my work with my student John Lipor on active labeling for union of subspace data. I’m looking forward to sharing our work and hearing about all the other interesting work going on in these areas! (Edit: See here for my Simons talk on active labeling for union of subspace data.)

Fall 2016 DSP Projects

The link to all the awesome projects in my fall DSP class is up. Every year the projects get more creative and more technically interesting. Great work everyone!

Student accomplishments

Congratulations to John Lipor for passing his proposal defense!

Congratulations to David Hong for winning both the Signal and Image Processing, Computer Vision (SIC) session award at the University of Michigan Engineering Graduate Symposium and the award for Most Interesting Methodological Advancement at the Michigan MIDAS Symposium for his work on heteroscedastic PCA.

Congratulations to Chenlan Wang for winning the Rackham International Student Fellowship.

Nice work team!

Three papers online

My students and I have posted three very exciting papers in the last few months!

“Towards a Theoretical Analysis of PCA for Heteroscedastic Data” with David Hong and Jeff Fessler studies the behavior of PCA with data points that have different noise variances. We published that work at the Allerton conference.

“Convergence of a Grassmannian Gradient Descent Algorithm for Subspace Estimation From Undersampled Data” with Dejiao Zhang proves expected convergence rates for incremental gradient subspace estimation constrained to the Grassmannian. Despite the non-convexity of the problem, our results show locally linear convergence rates when the observed vectors either have missing entries or are compressively sampled.
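
For readers who want the mechanics, here is a minimal sketch of a GROUSE-style incremental update on the Grassmannian, assuming numpy; the synthetic data model, the step-size parameter `eta`, and the helper name `grouse_step` are my illustrative choices rather than the paper's exact setup.

```python
import numpy as np

def grouse_step(U, v, omega, eta=0.2):
    """One GROUSE-style geodesic update of an orthonormal basis U (n x k)
    from a vector v observed only on the index set omega."""
    w, *_ = np.linalg.lstsq(U[omega], v[omega], rcond=None)  # weights from observed rows
    p = U @ w                                  # predicted full vector
    r = np.zeros(U.shape[0])
    r[omega] = v[omega] - p[omega]             # residual on observed entries
    rn, pn, wn = np.linalg.norm(r), np.linalg.norm(p), np.linalg.norm(w)
    if rn < 1e-12 or wn < 1e-12:
        return U                               # nothing to correct
    t = eta * rn * pn                          # step along the geodesic
    return U + np.outer((np.cos(t) - 1) * p / pn + np.sin(t) * r / rn, w / wn)

# Toy usage: track a 3-dimensional subspace of R^50 from 50%-sampled vectors.
n, k = 50, 3
rng = np.random.default_rng(1)
Utrue, _ = np.linalg.qr(rng.standard_normal((n, k)))
U, _ = np.linalg.qr(rng.standard_normal((n, k)))   # random initialization
for _ in range(2000):
    v = Utrue @ rng.standard_normal(k)
    omega = rng.choice(n, size=n // 2, replace=False)
    U = grouse_step(U, v, omega)
print(np.linalg.norm(Utrue.T @ U))  # approaches sqrt(k) as the subspaces align
```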

“Leveraging Union of Subspace Structure to Improve Constrained Clustering” with John Lipor demonstrates how the assumption of subspace structure can dramatically help active clustering algorithms. By querying users intelligently for pairwise clustering constraints, we drive clustering error on several benchmark datasets down to near zero.
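
The query-selection intuition can be sketched in a few lines: once candidate subspaces are fit, points whose two best subspace fits are nearly tied are the most ambiguous, so they are the most informative to ask about. A minimal illustration, assuming numpy and orthonormal bases; the function name `query_order` is mine, and the full constraint-handling pipeline is in the paper.

```python
import numpy as np

def query_order(X, bases):
    """Rank points by the margin between their two best subspace fits.
    X: (n, d) data; bases: list of (d, k) orthonormal subspace bases."""
    # Residual distance from each point to each subspace.
    res = np.stack(
        [np.linalg.norm(X - (X @ U) @ U.T, axis=1) for U in bases], axis=1
    )
    best2 = np.sort(res, axis=1)[:, :2]   # two smallest residuals per point
    margin = best2[:, 1] - best2[:, 0]    # small margin = ambiguous point
    return np.argsort(margin)             # query the most ambiguous first
```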

Productive spring

This spring I’ve had a couple of opportunities to share the work with Ravi Sastry Ganti and Rebecca Willett on “Matrix completion under monotonic single index model observations” that we presented at NIPS last December. In a week I’ll be at AISTATS with my student Dejiao Zhang presenting our work on “Global convergence of a Grassmannian gradient descent algorithm for subspace learning”: we now have a proof that the GROUSE subspace learning algorithm converges globally from any random initialization to the global minimizer, or to a ball around the generating subspace in the case of noisy data. And last but never least, I had a great set of grad students in my Estimation, Detection, and Filtering class this year; they did some great projects on data science and statistical inference and did very well in the class overall.
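
To give a flavor of the monotonic observation model, here is a rough sketch of an alternating scheme for it, assuming numpy and scikit-learn; `mmc_sketch`, the step size, and the residual-style gradient are my illustrative choices, not necessarily the paper's exact calibrated algorithm.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

def mmc_sketch(Y, mask, rank, iters=50, lr=0.5):
    """Estimate a low-rank M from Y ~= g(M) observed on the boolean `mask`,
    with g an unknown monotone link: alternate an isotonic-regression fit
    of g with a gradient step on M and a hard low-rank projection."""
    M = np.zeros(Y.shape)
    iso = IsotonicRegression(out_of_bounds="clip")
    for _ in range(iters):
        m_obs = M[mask]
        g_obs = iso.fit(m_obs, Y[mask]).predict(m_obs)  # monotone fit g(M) ~= Y
        grad = np.zeros(Y.shape)
        grad[mask] = g_obs - Y[mask]                    # residual-style gradient
        M -= lr * grad
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        M = (U[:, :rank] * s[:rank]) @ Vt[:rank]        # project back to low rank
    return M, iso
```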

Fall 2015 DSP Projects

The link to all the awesome projects in my fall DSP class is up. Once again the students impressed me with their creative ideas and excitement about advanced topics in signal processing. Great work everyone!

Computational Advances in Multi-Sensor Adaptive Processing

I just returned from a great CAMSAP meeting, where I learned about a probabilistic theory of deep learning, new work in tensor completion, and an importance sampling strategy for non-convex block coordinate descent. My student John and I also presented our new work on active labeling in the subspace clustering problem, where we showed that a small amount of label information helps tremendously in clustering noisy data when each cluster lies near a low-dimensional subspace. Thanks to the organizers for putting together such a great meeting!

Intel Early Career Faculty Honor Program

I am honored to have received the Intel Early Career Faculty Honor Program Award for my research in big data. The purpose of the program is to help Intel connect with early career faculty members who show promise as future academic leaders in disruptive computing technologies.

DSP undergraduate projects highlighted in EECS news

With every new fall semester comes the excitement of teaching EECS 351, Digital Signal Processing. This week the EECS news highlighted my initiative to incorporate data collection and analysis into 351 through a course project. It’s an initiative I began two years ago and will continue for as long as I teach DSP. They mentioned a few of the cool projects students have done, and there are many more at the links. I’m looking forward to seeing what the students will do in my class this fall!