Monotonic Matrix Completion

Ravi Ganti, Rebecca Willett, and I had a paper in NIPS 2015 called “Matrix Completion under Monotonic Single Index Models.” We studied a matrix completion problem in which a low-rank matrix is observed through a monotonic function applied to each entry. We developed a calibrated loss function that allowed a clean implementation and analysis. The code is now publicly available at this bitbucket link.
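For intuition, the observation model can be sketched in a few lines of numpy. The logistic link, dimensions, and sampling rate below are illustrative choices of mine, not the setup from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and link; the paper's experiments may differ.
n, m, r = 50, 40, 3                       # matrix size and rank
g = lambda z: 1.0 / (1.0 + np.exp(-z))    # a monotonic link (logistic)

# Low-rank parameter matrix Z = A B^T
Z = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))

# Entrywise monotonic transform, then a random observation mask
Y = g(Z)
mask = rng.random((n, m)) < 0.5           # observe roughly half the entries
Y_obs = np.where(mask, Y, np.nan)         # missing entries marked NaN
```

The completion task is to recover Z (and hence Y) from the observed entries of Y_obs, exploiting both the low rank of Z and the monotonicity of g.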

Variety Matrix Completion code

The code for our matrix completion algorithm from the ICML paper “Algebraic Variety Models for High-rank Matrix Completion” can be found here in Greg Ongie’s github repository. Using an algebraic variety as a low-dimensional model for data, Greg’s algorithm is a kernel method that performs matrix completion in the feature space associated with a polynomial kernel. This allows matrix completion even when the matrix does not have low linear rank, but instead has low dimension in the form of a nonlinear variety. Start with the README for example scripts.
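A toy numpy experiment illustrates the idea (this sketch is mine, not from Greg’s repository): random points on the unit sphere in R^3 form a data matrix of full linear rank, but after mapping each column to its degree-2 monomial features (the feature map of a polynomial kernel) the lifted matrix is rank-deficient, because the sphere equation imposes a linear relation among the monomials.

```python
import numpy as np

rng = np.random.default_rng(1)

# 30 random points on the unit sphere x^2 + y^2 + z^2 = 1
X = rng.standard_normal((3, 30))
X /= np.linalg.norm(X, axis=0)

def lift(col):
    """Degree-2 monomial features of a point (x, y, z)."""
    x, y, z = col
    return np.array([1, x, y, z, x*x, y*y, z*z, x*y, x*z, y*z])

L = np.column_stack([lift(c) for c in X.T])

print(np.linalg.matrix_rank(X))   # 3: full linear rank
print(np.linalg.matrix_rank(L))   # 9 < 10: the relation x^2+y^2+z^2-1=0
                                  # makes one lifted direction redundant
```

That rank deficiency in the lifted space is the low-dimensional structure the algorithm exploits to complete matrices that are high-rank in the usual linear sense.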

Distance-Penalized Active Learning Using Quantile Search

Active sampling, where one chooses which samples to collect based on the data collected thus far, is an important approach for spatial environmental sampling, where resources are drastically limited compared to the extent of the signals of interest. However, most of the active learning literature studies the case where each sample has equal cost. In spatial sampling, the cost of a sample is often proportional to the distance traveled from the previous sample. John Lipor and I collaborated with our colleagues in the Department of Civil and Environmental Engineering and the Department of Natural Resources to develop active sampling techniques for lake sampling.

The code is available here. You can also find a video about the project here.

Lipor, J., B. P. Wong, D. Scavia, B. Kerkez, and L. Balzano. 2017. “Distance-Penalized Active Learning Using Quantile Search.” IEEE Transactions on Signal Processing 65 (20): 5453–65. https://doi.org/10.1109/TSP.2017.2731323.

Lipor, J., L. Balzano, B. Kerkez, and D. Scavia. 2015. “Quantile Search: A Distance-Penalized Active Learning Algorithm for Spatial Sampling.” In 2015 53rd Annual Allerton Conference on Communication, Control, and Computing (Allerton), 1241–48. https://doi.org/10.1109/ALLERTON.2015.7447150.
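The core idea can be sketched in a few lines: to localize a change point on a line, sample at a fixed quantile m of the remaining interval instead of bisecting, keeping successive samples close together so that total travel is reduced at the cost of extra measurements. This toy deterministic version (noiseless step function, my own parameter choices) only conveys the flavor of the method; see the papers above for the actual algorithm and its analysis.

```python
def quantile_search(f, lo, hi, m=0.33, tol=1e-3):
    """Locate the jump of a step function f on [lo, hi].

    f(x) is 0 to the left of the change point and 1 to the right of it.
    Sampling at the m-quantile (m < 0.5) rather than the midpoint keeps
    successive samples close together, trading more samples for less
    total travel distance than bisection.
    """
    pos, travel, samples = lo, 0.0, 0
    while hi - lo > tol:
        s = lo + m * (hi - lo)
        travel += abs(s - pos)
        pos = s
        samples += 1
        if f(s):
            hi = s      # the jump is to the left of s
        else:
            lo = s      # the jump is to the right of s
    return 0.5 * (lo + hi), travel, samples
```

For example, `quantile_search(lambda x: x >= 0.7, 0.0, 1.0)` localizes the change at 0.7 to within the tolerance while tracking the distance traveled.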

SFM code

Ryan Kennedy put our SAGE GROUSE code for Structure from Motion online. We’ve also updated the preprint of our paper on this topic. Please let us know if you have any questions!

Robust Blind Calibration

My student John Lipor and I have had our work on robust blind calibration published in ICASSP. Previous work showed that it is possible to blindly calibrate sensor gains knowing only the signal subspace. This new algorithm lets you calibrate sensor gains even when your knowledge of the signal subspace is inaccurate. John has shared his code for robust blind calibration here.
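The basic (non-robust) subspace formulation is easy to sketch: if each signal x_i lies in a known subspace with orthonormal basis U and we observe y_i = g ∘ x_i for unknown gains g, then d = 1/g satisfies (I − UU^T) diag(y_i) d = 0 for every snapshot, so the gains can be read off (up to scale) from the nullspace of the stacked system. The sketch below is this baseline with exact subspace knowledge, not John’s robust algorithm; the dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n, r, k = 8, 2, 10            # sensors, subspace dimension, snapshots

# Ground truth: orthonormal basis U, positive gains g, signals in col(U)
U, _ = np.linalg.qr(rng.standard_normal((n, r)))
g = rng.uniform(0.5, 2.0, n)
X = U @ rng.standard_normal((r, k))
Y = g[:, None] * X            # uncalibrated observations

# Stack (I - U U^T) diag(y_i) into one big homogeneous system A d = 0
P = np.eye(n) - U @ U.T
A = np.vstack([P @ np.diag(Y[:, i]) for i in range(k)])

# d = 1/g spans the nullspace: take the smallest right singular vector
_, _, Vt = np.linalg.svd(A)
d = Vt[-1]
d = d / d[0] * (1.0 / g[0])   # fix the inherent scale (and sign) ambiguity

print(np.allclose(d, 1.0 / g, atol=1e-6))  # True
```

When the basis U is only approximately known, this nullspace is perturbed, which is exactly the regime the robust algorithm addresses.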

t-GRASTA

Jun He, Dejiao Zhang, and I are happy to share our preprint and our code on t-GRASTA, a variant of GRASTA that also estimates geometric transformations of the data (such as translations and rotations of images), allowing us to deal with jitter.

GRASTAcam

The GRASTAcam code is available for download here. This code uses OpenCV to run GRASTA on the camera on your computer. It was written by Arthur Szlam (with a makefile by Jia Xu; thanks, Jia!) in C using the Intel MKL library.

GRASTA

The Grassmannian Robust Adaptive Subspace Tracking Algorithm (GRASTA) is an algorithm for subspace identification and tracking in the presence of corrupted and missing data. The algorithm is derived using the augmented Lagrangian of an l1 cost function. See our page on GRASTA for code and a description of the algorithm.
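The flavor of the l1 fit shows up in GRASTA’s inner problem: with the subspace basis held fixed and a vector v observed on a set of entries, find weights w and sparse corruptions s solving min ||s||_1 subject to A w + s = v, via ADMM on the augmented Lagrangian. The sketch below solves only this inner step, with my own parameter choices; the full algorithm alternates it with a gradient step on the Grassmannian (see the GRASTA page).

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding, the prox operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def admm_l1_fit(A, v, rho=1.0, iters=500):
    """Solve min_{w,s} ||s||_1  s.t.  A w + s = v  by ADMM.

    A holds the observed rows of the subspace basis; s absorbs
    sparse outliers so that w is a robust fit of v.
    """
    m, d = A.shape
    w, s, lam = np.zeros(d), np.zeros(m), np.zeros(m)
    pinv = np.linalg.pinv(A)                        # precompute for w-updates
    for _ in range(iters):
        w = pinv @ (v - s - lam / rho)              # least-squares step
        s = soft(v - A @ w - lam / rho, 1.0 / rho)  # shrinkage step
        lam = lam + rho * (A @ w + s - v)           # dual ascent
    return w, s
```

With a handful of gross outliers in v, the recovered w stays close to the clean least-squares fit while s soaks up the corruptions.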


GROUSE

Grassmannian Rank-One Update Subspace Estimation (GROUSE) is an online algorithm for subspace identification and tracking when data vectors are incomplete. See our page on GROUSE for details about the algorithm. GROUSE can be used for matrix completion; download grouse.m for the Matlab function and rungrouse.m to see how to run it.
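The rank-one update itself is short enough to sketch in numpy. Given the current orthonormal basis U and a vector v observed on an index set omega: fit weights on the observed rows, form the residual, and rotate the fitted direction toward the residual. The greedy step angle used here is one common choice; the details in grouse.m may differ.

```python
import numpy as np

def grouse_step(U, v, omega):
    """One GROUSE update of an orthonormal basis U (n x d).

    v[omega] holds the observed entries of the current data vector.
    """
    w, *_ = np.linalg.lstsq(U[omega], v[omega], rcond=None)
    p = U @ w                           # predicted full vector
    r = np.zeros(U.shape[0])
    r[omega] = v[omega] - p[omega]      # residual on observed entries
    if np.linalg.norm(r) < 1e-12:
        return U                        # v already fits; nothing to rotate
    # Greedy step: rotate the fitted direction all the way toward v
    theta = np.arctan(np.linalg.norm(r) / np.linalg.norm(p))
    p_hat = p / np.linalg.norm(p)
    r_hat = r / np.linalg.norm(r)
    w_hat = w / np.linalg.norm(w)
    return U + np.outer((np.cos(theta) - 1.0) * p_hat
                        + np.sin(theta) * r_hat, w_hat)
```

Because U^T r = 0 and ||p|| = ||w|| for orthonormal U, this rank-one update keeps the basis orthonormal, so no re-orthogonalization is needed between steps.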

Generating all rooted trees

Matlab code to enumerate all possible trees on n nodes: I wrote this a while ago when I needed to enumerate all possible rooted trees that could be formed by a set of n labeled nodes. There are n^(n-1) such trees. This code uses the fact that there is a bijection between these trees and Prüfer sequences. Here is tree_enumeration.m, which is a skeleton file for running code on all possible trees, and here is the function for generating the next tree, get_next_tree.m.
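The bijection is easy to state in code. The standard Prüfer correspondence maps sequences of length n-2 over {0, …, n-1} to the n^(n-2) unrooted labeled trees; choosing any of the n nodes as root then accounts for the n^(n-1) rooted trees. This Python sketch (a stand-in for the Matlab files, with my own function names) decodes one sequence into an edge list and enumerates all trees for small n.

```python
from itertools import product

def prufer_to_tree(seq, n):
    """Decode a Prufer sequence (length n-2) into the edge list
    of a labeled tree on nodes 0..n-1."""
    degree = [1] * n
    for x in seq:
        degree[x] += 1
    edges = []
    for x in seq:
        # attach x to the smallest-labeled remaining leaf
        leaf = min(i for i in range(n) if degree[i] == 1)
        edges.append((leaf, x))
        degree[leaf] -= 1
        degree[x] -= 1
    # the last two degree-1 nodes form the final edge
    u, v = [i for i in range(n) if degree[i] == 1]
    edges.append((u, v))
    return edges

def all_labeled_trees(n):
    """Enumerate all n^(n-2) unrooted labeled trees on n nodes."""
    for seq in product(range(n), repeat=n - 2):
        yield prufer_to_tree(seq, n)
```

For n = 4 this yields 16 = 4^2 distinct trees; rooting each at any of the 4 nodes gives the 64 = 4^3 rooted trees the count above refers to.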