Sparsity and Non-Linearity

A great deal of work in sparse signal processing assumes linear measurements, but many applications involve fundamental nonlinearities that must be modeled explicitly. Our work in this area studies algebraic variety models, monotonic nonlinear measurement functions (single index models), pairwise comparison and ranking data, and sparsity in deep networks.
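As an illustration of the monotonic single-index measurement model studied in the Ganti et al. papers below, here is a minimal sketch. The sigmoid link and the problem dimensions are arbitrary choices for illustration, not taken from the papers:

```python
import numpy as np

rng = np.random.default_rng(0)

# A sparse signal: only k of the d coefficients are nonzero.
d, k, n = 50, 3, 200
beta = np.zeros(d)
beta[:k] = rng.standard_normal(k)

# Linear measurements X @ beta ...
X = rng.standard_normal((n, d))
linear = X @ beta

# ... are passed through an unknown monotonic link g (a sigmoid here,
# purely for illustration), giving the observed nonlinear responses.
def g(t):
    return 1.0 / (1.0 + np.exp(-t))

y = g(linear)

# Because g is monotonic, it preserves the ordering of the linear
# responses -- the key structure single-index estimators exploit.
assert np.array_equal(np.argsort(linear), np.argsort(y))
```

Estimators for this model recover the sparse direction `beta` (and often the link `g` itself) from the pairs `(X, y)` without assuming `g` is known in advance.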

Ongie, Greg, Daniel Pimentel-Alarcón, Laura Balzano, Rebecca Willett, and Robert D. Nowak. 2021. “Tensor Methods for Nonlinear Matrix Completion.” SIAM Journal on Mathematics of Data Science, January, 253–79. https://doi.org/10.1137/20M1323448.
Hong, David, Kyle Gilman, Laura Balzano, and Jeffrey A. Fessler. 2021. “HePPCAT: Probabilistic PCA for Data with Heteroscedastic Noise.” IEEE Transactions on Signal Processing, 1–1. https://doi.org/10.1109/TSP.2021.3104979.
Zhang, Dejiao, Haozhu Wang, Mario Figueiredo, and Laura Balzano. 2018. “Learning to Share: Simultaneous Parameter Tying and Sparsification in Deep Learning.” In International Conference on Learning Representations (ICLR). https://openreview.net/forum?id=rypT3fb0b.
Ongie, Greg, Rebecca Willett, Robert D. Nowak, and Laura Balzano. 2017. “Algebraic Variety Models for High-Rank Matrix Completion.” In Proceedings of the 34th International Conference on Machine Learning, PMLR 70:2691–2700. http://proceedings.mlr.press/v70/ongie17a.html.
Ganti, Ravi, Nikhil Rao, Laura Balzano, Rebecca Willett, and Robert Nowak. 2017. “On Learning High Dimensional Structured Single Index Models.” In Thirty-First AAAI Conference on Artificial Intelligence. https://www.aaai.org/ocs/index.php/AAAI/AAAI17/paper/view/14480.
Ganti, Ravi Sastry, Laura Balzano, and Rebecca Willett. 2015. “Matrix Completion Under Monotonic Single Index Models.” In Advances in Neural Information Processing Systems 28, 1864–72. http://papers.nips.cc/paper/5916-matrix-completion-under-monotonic-single-index-models.