Variety Matrix Completion code

The code for our matrix completion algorithm from the ICML paper "Algebraic Variety Models for High-rank Matrix Completion" can be found here in Greg Ongie's GitHub repository. Using an algebraic variety as a low-dimensional model for data, Greg's algorithm is a kernel method that performs matrix completion in the feature space associated with a polynomial kernel. This allows us to complete a matrix even when it does not have low linear rank, but instead has low dimension in the form of a nonlinear variety. Start with the README for example scripts.
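The key observation can be illustrated with a toy example (my own sketch, not code from Greg's repository): data with full linear rank can become rank-deficient after mapping each column through polynomial-kernel features, and that lifted low-rank structure is what makes completion possible.

```python
import numpy as np

# Toy illustration: columns lie on the parabola y = x^2, a nonlinear
# variety, so the data matrix itself has no low-rank structure.
t = np.linspace(-1.0, 1.0, 50)
X = np.vstack([t, t**2])                 # 2 x 50 data matrix

def lift_degree2(X):
    """Map each column through the degree-2 monomial features
    [1, x, y, x^2, x*y, y^2] -- the feature map of a quadratic
    polynomial kernel."""
    x, y = X
    return np.vstack([np.ones_like(x), x, y, x**2, x * y, y**2])

L = lift_degree2(X)
print(np.linalg.matrix_rank(X))          # 2: full linear rank
print(np.linalg.matrix_rank(L))          # 5 < 6: rank-deficient after lifting
```

The linear relation y - x^2 = 0 satisfied by every column becomes a null vector of the lifted matrix, which is exactly the kind of structure a low-rank method can exploit in the lifted space.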

Congratulations John!

Congratulations to Dr. John Lipor for successfully defending his PhD thesis in September! The title of his work is “Sensing Structured Signals with Active and Ensemble Methods.” In January he will start as Assistant Professor in the Portland State University ECE Department.

Congratulations David!

David Hong was awarded the Richard and Eleanor Towner Prize for Outstanding PhD Research at the Michigan Engineering Graduate Symposium. This prize is awarded annually across the entire College of Engineering to PhD students within about a year of graduation, and the criteria for selection are creativity, innovation, impact on society, and achievement. Congratulations David!

Data-Driven Discovery of Models

Jason Corso and I have been awarded a DARPA D3M grant. Our project is called SPIDER: Subspace Primitives that are Interpretable and DivERse. We will be contributing machine learning software primitives for a system that helps domain experts perform a wide variety of automated data analysis on their datasets. The program has 24 teams and is already off to a great start; we look forward to the final system this group develops over the next few years!

Distance-Penalized Active Learning Using Quantile Search

Active sampling, where one chooses which samples to collect based on the data collected thus far, is an important approach for spatial environmental sampling, where resources are drastically limited compared to the extent of the signals of interest. However, most of the active learning literature studies the case where each sample has equal cost. In spatial sampling, the cost of a sample is often proportional to the distance traveled between samples. John Lipor and I collaborated with our colleagues in the Department of Civil and Environmental Engineering and the Department of Natural Resources to develop active sampling techniques for lake sampling.

The code is available here. You can also find a video about the project here.
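To convey the flavor of distance-penalized search, here is a toy sketch under simplified assumptions (noiseless measurements, a single boundary on a 1-D transect; this is my own illustration, not the algorithm from the papers). Bisection minimizes the number of samples, but each sample may require a long trip; sampling at a more conservative quantile of the uncertainty interval, nearer the sampler's current position, trades a few extra samples for less travel.

```python
def quantile_search(f, lo, hi, q=1/3, tol=1e-3):
    """Toy 1-D boundary localization in the spirit of quantile search.

    f(x) is 0 to the left of an unknown boundary and 1 to its right.
    q = 1/2 recovers bisection (fewest samples); q < 1/2 places each
    sample closer to the sampler's current position, shrinking the
    interval more slowly but reducing travel per step.
    """
    pos = lo                      # sampler's current physical location
    travel = 0.0
    while hi - lo > tol:
        # sample a fraction q into the interval from its nearer end
        if abs(pos - lo) <= abs(pos - hi):
            x = lo + q * (hi - lo)
        else:
            x = hi - q * (hi - lo)
        travel += abs(x - pos)
        pos = x
        if f(x) == 0:
            lo = x                # boundary lies to the right of x
        else:
            hi = x                # boundary lies to the left of x
    return 0.5 * (lo + hi), travel

# locate a step at x = 0.37 on the unit interval
est, travel = quantile_search(lambda x: int(x >= 0.37), 0.0, 1.0)
```

The actual algorithm in the papers handles noisy measurements and comes with guarantees on the tradeoff; this sketch only shows the sample-count-versus-travel tension that motivates it.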

Lipor, J., B. P. Wong, D. Scavia, B. Kerkez, and L. Balzano. 2017. “Distance-Penalized Active Learning Using Quantile Search.” IEEE Transactions on Signal Processing 65 (20): 5453–65.

Lipor, J., L. Balzano, B. Kerkez, and D. Scavia. 2015. “Quantile Search: A Distance-Penalized Active Learning Algorithm for Spatial Sampling.” In 2015 53rd Annual Allerton Conference on Communication, Control, and Computing (Allerton), 1241–48.

ICML Acceptances, SIAM OPT success

Congratulations to postdoc Greg Ongie, whose excellent work on Variety Models for Matrix Completion has been accepted to ICML. We're very excited about the potential applications and open problems that we posed in this work. Congratulations also to John Lipor, whose work on Active Subspace Clustering has also been accepted to ICML. I've spoken about this work before at the Simons Institute workshop on Interactive Learning; it achieves state-of-the-art clustering error on several benchmark datasets using very few pairwise cluster queries.

We also just finished a week at the SIAM Optimization conference, where our mini-symposium on Non-convex Optimization in Data Analysis was a huge hit. We had a full room for each session and 12 outstanding talks. Thanks to my co-organizers Stephen Wright, Rebecca Willett, and Rob Nowak, and thanks to all the speakers and participants.

MIDAS seminar and new results

Last Friday I gave the MIDAS weekly seminar. You can find the description here, along with a link directly to the recording. I talked about two recent problems I have been working on.

First, I discussed my work with Ravi Ganti and Rebecca Willett on learning a low-rank matrix that is observed through a monotonic function from partial measurements, a setting common in calibration and quantization problems. Follow-up work with Nikhil Rao and Rob Nowak generalized this to learning structured single index models.

Second, I talked about the work of my student David Hong, co-advised by Jeff Fessler, on the asymptotic performance of PCA with heteroscedastic data. This setting is common in problems like sensor networks or medical imaging, where different measurements of the same phenomenon are taken with sensing of different quality (e.g., high or low radiation). David has recently posted his paper on arXiv giving predictions of the asymptotic performance; exploiting the structure of these expressions, we also showed that asymptotic recovery for a fixed average noise variance is maximized when the noise variances are equal (i.e., when the noise is in fact homoscedastic). Average noise variance is often a practically convenient measure of the overall quality of data, but our results show that it gives an overly optimistic estimate of the performance of PCA on heteroscedastic data.
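The homoscedastic-is-best phenomenon is easy to see in a small simulation (my own illustration; the dimensions and noise levels here are arbitrary choices, not those from David's paper): generate rank-1 data plus noise with a fixed average variance, and compare subspace recovery when that variance is spread equally versus unequally across samples.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 2000, 50
u = np.zeros(d); u[0] = 1.0              # true 1-D subspace direction
theta = rng.standard_normal(n)           # per-sample signal coefficients

def pca_recovery(noise_std):
    """|<u_hat, u>| for rank-1 data with per-sample noise levels."""
    X = np.outer(theta, u) + noise_std[:, None] * rng.standard_normal((n, d))
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return abs(Vt[0] @ u)                # top principal direction vs. truth

# Same average noise variance (1.0), spread equally vs. unequally.
homo = pca_recovery(np.ones(n))
heter = pca_recovery(np.sqrt(np.concatenate(
    [0.1 * np.ones(n // 2), 1.9 * np.ones(n // 2)])))
print(homo, heter)   # compare recovery for equal vs. unequal noise variances
```

The theory predicts that, holding the average variance fixed, the homoscedastic configuration gives the best asymptotic recovery, which is what repeated runs of a simulation like this display.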

AAAI and Simons

This week I head to the Bay Area for two presentations: at AAAI, my collaborator Ravi Ganti will be presenting our work on estimating high-dimensional structured single index models. The week following, I'll be at the Simons workshop on Interactive Learning presenting my work with student John Lipor on active labeling for union-of-subspaces data. I'm looking forward to sharing our work and hearing about all the other interesting work going on in these areas!

Fall 2016 DSP Projects

The link to all the awesome projects in my fall DSP class is up. Every year the projects get more creative and more technically interesting. Great work everyone!

Student accomplishments

Congratulations to John Lipor for passing his proposal defense!

Congratulations to David Hong for winning both the Signal and Image Processing, Computer Vision (SIC) session award at the University of Michigan Engineering Graduate Symposium and the award for Most Interesting Methodological Advancement at the Michigan MIDAS Symposium for his work on heteroscedastic PCA.

Congratulations to Chenlan Wang for winning the Rackham International Student Fellowship.

Nice work team!