We have had many exciting publications in the last several months.
My student Dejiao Zhang and I worked with Mario Figueiredo and two other Michigan students on applying OWL (ordered weighted L1) regularization in deep networks. The intuition is that since OWL can tie correlated regressors, it should be able to do the same in deep nets, which exhibit a high degree of co-adaptation (and correlation) among nodes in the network. Dejiao presented our paper Learning to Share: Simultaneous Parameter Tying and Sparsification for Deep Learning at ICLR last month, and we will present Simultaneous Sparsity and Parameter Tying for Deep Learning using Ordered Weighted L1 Regularization at SSP next month.
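For readers unfamiliar with the penalty, here is a minimal sketch of the OWL norm itself (generic NumPy, not our training code; `owl_norm` is just an illustrative helper):

```python
import numpy as np

def owl_norm(x, w):
    """OWL norm: sum_i w[i] * |x|_(i), where |x|_(1) >= |x|_(2) >= ...
    are the magnitudes of x sorted in decreasing order and w is a
    nonnegative, nonincreasing weight vector."""
    mags = np.sort(np.abs(x))[::-1]  # magnitudes, largest first
    return float(np.dot(w, mags))

x = np.array([3.0, -1.0, 2.0])
w = np.array([3.0, 2.0, 1.0])  # nonincreasing weights
owl_norm(x, w)  # 3*3 + 2*2 + 1*1 = 14.0
```

Because the largest weights always hit the largest magnitudes, the penalty pushes clusters of correlated coefficients toward a common magnitude (OSCAR is the special case of linearly decreasing weights), which is what enables parameter tying.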
With my colleague Johanna Mathieu and her student Greg Ledva, we published a paper in Transactions on Power Systems, Real-Time Energy Disaggregation of a Distribution Feeder’s Demand Using Online Learning. The work leverages recent results in dynamic online learning, in which classes of dynamical models allow online learning to be applied to time-varying signals. The approach can leverage existing sensing infrastructure to improve prediction of distributed energy resources, demand-responsive electric loads, and residential solar generation. We also have a book chapter in Energy Markets and Responsive Grids, written with my student Zhe Du as well.
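As a toy illustration of the online-learning viewpoint (a generic exponentiated-gradient sketch, not the dynamical-model method of the paper), consider tracking an aggregate signal as a weighted combination of predictions from hypothetical component models; all data here is synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_models = 300, 3
# Synthetic one-step predictions from three hypothetical component models
preds = rng.random((n_models, T))
y = preds[0]  # pretend the aggregate measurement follows model 0

eta = 1.0                         # learning rate
w = np.ones(n_models) / n_models  # weights over the candidate models
losses = []
for t in range(T):
    yhat = w @ preds[:, t]        # combined prediction
    err = yhat - y[t]
    losses.append(err ** 2)
    # Exponentiated-gradient update on the squared loss, then renormalize
    w = w * np.exp(-eta * 2.0 * err * preds[:, t])
    w /= w.sum()
```

The weights concentrate on the model that best explains the incoming measurements, and the per-step loss shrinks over time; this is the flavor of guarantee that online learning provides in the time-varying setting.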
Greg Ongie, David Hong, Dejiao Zhang, and I have been working on adaptive sampling for subspace estimation. If a matrix is too large to access easily but you want a low-rank approximation of it, one approach is to sketch it: read only parts of the matrix and compute the approximation from those entries. Our paper Enhanced Online Subspace Estimation Via Adaptive Sensing describes an adaptive sampling scheme that does exactly that; using this scheme along with the GROUSE subspace estimation algorithm, we proved global convergence to the true underlying low-rank matrix. We will also present Online Estimation of Coherent Subspaces with Adaptive Sampling at SSP next month, which constrains the adaptive samples to be entry-wise and sees similar improvements.
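To give a flavor of adaptive sketching (a toy residual-weighted column sampler, not the adaptive sensing scheme or the GROUSE algorithm from the papers):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, r = 50, 200, 3
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, d))  # exactly rank 3

# Seed the estimate with one uniformly chosen column, then adaptively sample
# columns with probability proportional to their residual energy after
# projecting onto the current subspace estimate.
U = np.linalg.qr(A[:, [rng.integers(d)]])[0]
for _ in range(r - 1):
    resid = A - U @ (U.T @ A)                    # residual of every column
    scores = np.linalg.norm(resid, axis=0) ** 2  # residual energies
    j = rng.choice(d, p=scores / scores.sum())   # adaptive column sample
    U = np.linalg.qr(np.column_stack([U, A[:, j]]))[0]

rel_err = np.linalg.norm(A - U @ (U.T @ A)) / np.linalg.norm(A)
```

Sampling where the current estimate explains the data poorly recovers the rank-3 column space from just three columns here; the papers make this kind of adaptivity rigorous for the streaming, entry-wise-sampled setting.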
Rounding it out, Zhe Du will present our work with Necmiye Ozay on A Robust Algorithm for Online Switched System Identification at the SYSID conference in July, and Bob Malinas and David Hong will present our work with Jeff Fessler on Learning Dictionary-Based Unions of Subspaces for Image Denoising at EUSIPCO in September. This spring Amanda Bower presented our work with Lalit Jain on The Landscape of Nonconvex Quadratic Feasibility, studying the minimizers of a nonconvex formulation of the preference-learning problem; and next week Naveen Murthy presents our work with Greg Ongie and Jeff Fessler on Memory-efficient Splitting Algorithms for Large-Scale Sparsity Regularized Optimization at the CT Meeting. Last fall Greg Ongie, Saket Dewangan, Jeff Fessler, and I had a paper, Online Dynamic MRI Reconstruction via Robust Subspace Tracking, at GlobalSIP, pursuing the interesting idea of online subspace tracking for time-varying signals.
So many exciting research directions that we will continue to pursue!