Hanbaek Lyu, Deanna Needell, and I recently had a manuscript published at JMLR: “Online matrix factorization for Markovian data and applications to Network Dictionary Learning.” In this work we show that the well-known online matrix factorization (OMF) algorithm, originally analyzed for i.i.d. data streams, converges almost surely to the set of critical points of the expected loss function even when the data stream is dependent but Markovian. It would be of great interest to show that the algorithm further converges to global minimizers, as has recently been proven for many batch-processing algorithms. We are excited about this important step, which generalizes the theory to the more practical case where the data aren’t i.i.d. Han’s work applying this to network sampling is super cool; in fact, it is impossible to sample a sparse network in an i.i.d. way, so this extension is critical for that application. The code is available here. Han is on the academic job market this year.
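For readers unfamiliar with OMF, here is a minimal sketch of a generic online matrix factorization loop in the surrogate-minimization style of Mairal et al. This is an illustration, not the paper’s exact algorithm: it uses a ridge penalty on the codes (rather than a sparsity penalty) so the coding step has a closed form, and it draws i.i.d. synthetic data; the dimensions, penalty `lam`, and stream length are all arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, T = 20, 5, 200     # ambient dimension, rank, number of stream steps
lam = 0.1                # ridge penalty on the codes (illustrative choice)

W = rng.standard_normal((n, r))       # dictionary estimate
A = lam * np.eye(r)                   # aggregated code statistics
B = np.zeros((n, r))                  # aggregated data-code statistics

W_true = rng.standard_normal((n, r))  # ground truth for the synthetic stream

for t in range(T):
    # one streaming sample (low-rank signal plus small noise)
    x = W_true @ rng.standard_normal(r) + 0.01 * rng.standard_normal(n)
    # coding step: ridge-regularized least squares, closed form
    h = np.linalg.solve(W.T @ W + lam * np.eye(r), W.T @ x)
    # accumulate sufficient statistics of the surrogate loss
    A += np.outer(h, h)
    B += np.outer(x, h)
    # dictionary update: minimizer of the quadratic surrogate
    W = B @ np.linalg.inv(A)

# a fresh in-subspace sample should be reconstructed well
x = W_true @ rng.standard_normal(r)
h = np.linalg.solve(W.T @ W + lam * np.eye(r), W.T @ x)
rel_err = float(np.linalg.norm(x - W @ h) / np.linalg.norm(x))
print(rel_err)
```

The point of the paper is that convergence guarantees for a loop like this survive when the samples `x` come from a Markov chain rather than an i.i.d. source.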
IAS Missing Data Workshop videos online
Bianca Dumitrascu, Boaz Nadler, and I hosted a virtual workshop in early September, supported by the Institute for Advanced Study. We had excellent speakers from across the spectrum of machine learning, statistics, and applications that consider missing data. You can find videos of all the seminars here.
Preference Learning with Salient Features
I am excited that Amanda Bower will have the opportunity to present our new work in preference learning, “Preference Modeling with Context-Dependent Salient Features”, at ICML next week. In this work, we propose a new model for preference learning that accounts for the fact that, when making pairwise comparisons, certain features may play an outsized role in the comparison, making the pairwise comparison result inconsistent with a general preference order. We look forward to hearing people’s questions and feedback! Update post-conference: Her presentation can be viewed here.
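To make the intuition concrete, here is a toy sketch (not the model from the paper) of how salient features can break transitivity: a Bradley–Terry-style comparison where only the `k` features with the largest gap between the two items drive the outcome. The feature dimension, the salience rule, and the weights `w` are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
d, k = 6, 2                       # feature dimension; salient features per comparison
w = rng.standard_normal(d)        # latent preference weights (illustrative)

def prob_i_beats_j(xi, xj, w, k):
    """P(item i beats item j) when only the k most-different
    ('salient') features enter the comparison."""
    diff = xi - xj
    salient = np.argsort(-np.abs(diff))[:k]   # features with the largest gap
    score = w[salient] @ diff[salient]        # score from salient features only
    return 1.0 / (1.0 + np.exp(-score))       # logistic (Bradley-Terry-style) link

xi, xj = rng.standard_normal(d), rng.standard_normal(d)
p_full = 1.0 / (1.0 + np.exp(-(w @ (xi - xj))))   # comparison using all features
p_salient = prob_i_beats_j(xi, xj, w, k)
print(p_full, p_salient)
```

Because the salient feature set changes from pair to pair, the salient-feature probabilities can disagree with the all-features score, which is exactly how pairwise outcomes become inconsistent with any single global ordering.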
Online Tensor Completion and Tracking
Kyle Gilman and I have a preprint out describing a new algorithm for online tensor completion and tracking. We derive and demonstrate an algorithm that operates on streaming tensor data, such as hyperspectral video collected over time or chemo-sensing experiments in space and time. Kyle presented his work at the first fully virtual ICASSP, which you can view here. Anyone can register for free for this year’s virtual ICASSP and watch the videos, post questions, and join the discussion. Kyle’s code is also available here. We think this algorithm will have a major impact in speeding up low-rank tensor processing, especially with time-varying data, and we welcome questions and feedback.
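To give a flavor of the streaming-completion setting (in the simpler matrix case, not the tensor algorithm from the preprint), here is a GROUSE-style sketch: each arriving vector is only partially observed, and a low-dimensional subspace estimate is updated from the observed entries via a rank-one step and re-orthonormalization. All dimensions, the sampling rate, and the step size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n, r, T = 50, 3, 300         # ambient dimension, rank, stream length
step = 0.5                   # gradient step size (illustrative)

U_true, _ = np.linalg.qr(rng.standard_normal((n, r)))  # true subspace
U, _ = np.linalg.qr(rng.standard_normal((n, r)))       # current estimate

for t in range(T):
    v = U_true @ rng.standard_normal(r)     # full low-rank vector
    mask = rng.random(n) < 0.5              # observe roughly half the entries
    # fit weights using observed entries only
    w, *_ = np.linalg.lstsq(U[mask], v[mask], rcond=None)
    resid = np.zeros(n)
    resid[mask] = v[mask] - U[mask] @ w     # residual on observed entries
    # rank-one update toward the residual, then re-orthonormalize
    U = U + step * np.outer(resid, w) / (np.linalg.norm(w) ** 2 + 1e-12)
    U, _ = np.linalg.qr(U)

# subspace error: component of the true basis outside span(U)
err = float(np.linalg.norm(U_true - U @ (U.T @ U_true)))
print(err)
```

The tensor setting in the preprint generalizes this idea: instead of a single subspace, a low-rank tensor decomposition is tracked as incomplete slices stream in over time.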
Education Excellence Award
I have been awarded an Education Excellence award from the U-M College of Engineering. This award recognizes faculty with a demonstrated sustained excellence in curricular development, instruction, and guidance at both the undergraduate and graduate levels. Students are the reason why I love having an academic job, so I am truly honored to have received this award.
Promotion to Associate Professor
I am thrilled that yesterday the Regents of the University of Michigan promoted me to Associate Professor with tenure, effective September 1. It feels like yesterday that I arrived in my faculty office for the first time, and yet it also feels long ago. I appreciate so much the support of my colleagues and mentors. I also would not be here without my outstanding, curious, enthusiastic, and hard-working graduate students — they are incredible. Here is the letter that my dean sent to the provost to recommend me for promotion. I am looking forward to the next phase!
Congratulations Dejiao and David!
SPADA lab is so proud of Dr. Dejiao Zhang and Dr. David Hong. They both successfully defended their PhD dissertations this spring. Dejiao is going to Amazon Web Services next, and David is going to a postdoc at the University of Pennsylvania. We expect you both to go off and do great things! Congratulations!
NSF CAREER Award
I am honored to have received the NSF CAREER award for a proposal on optimization methods and theory for the joint formulation of dimension reduction and clustering. You can read about the award here in the UM press release and also here on the NSF website. Dimension reduction and clustering are arguably the two most critical problems in unsupervised machine learning; they are used universally for data exploration and understanding. Often dimension reduction is applied before clustering (or vice versa) to lend tractability to the modeling algorithm. However, real data more typically contain clusters, each with its own low-dimensional structure, so a joint formulation is of great interest. I look forward to working toward this end in the next stage of my career.
Army Young Investigator
I am very excited that my project “Mathematics for Learning Nonlinear Generalizations of Subspace Models in High Dimensions” has won the Army Young Investigator award! Subspace models are widely used due to their simplicity and ease of analysis. However, while these linear models are very powerful in many high-dimensional data contexts, they often miss important nonlinearities in real data. This project aims to extend recent advances in signal processing to the single-index model and the nonlinear variety model. Read the department’s announcement here.