Congratulations Dr. Bower!

Last fall, my PhD student Amanda Bower defended her thesis, titled “Dealing with Intransitivity, Non-Convexity, and Algorithmic Bias in Preference Learning.” Amanda was in the Applied Interdisciplinary Math program, co-advised by Martin Strauss. She is now moving on to work with Twitter’s ML Ethics, Transparency, and Accountability (META) group. We are so proud that she is going to make her mark on the world. Congratulations Dr. Bower!

Online matrix factorization for Markovian data

Hanbaek Lyu, Deanna Needell, and I recently had a manuscript published at JMLR: “Online matrix factorization for Markovian data and applications to Network Dictionary Learning.” In this work we show that the well-known OMF algorithm for i.i.d. data streams converges almost surely to the set of critical points of the expected loss function, even when the data stream is dependent but Markovian. We are excited about this important step, which generalizes the theory to the more practical case where the data aren’t i.i.d. It would be of further interest to show that this algorithm converges to global minimizers, as has recently been proven for many batch-processing algorithms. Han’s work applying this to network sampling is super cool; in fact, it’s impossible to sample a sparse network in an i.i.d. way, so this extension is critical for that application. The code is available here. Han is on the academic job market this year.
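For readers curious what the online update looks like, here is a minimal NumPy sketch in the spirit of classic online matrix factorization, run on a toy AR(1)-style Markovian stream. This is illustrative only: a ridge coding step stands in for the sparse coding in the literature, and all dimensions and parameters are made up rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, T = 20, 5, 500   # data dimension, dictionary size, stream length
lam = 0.1              # ridge penalty for the coding step (illustrative)

# Hypothetical Markovian data stream: an AR(1)-style chain, so
# consecutive samples are dependent rather than i.i.d.
def markov_stream(T):
    x = rng.standard_normal(n)
    for _ in range(T):
        x = 0.9 * x + 0.1 * rng.standard_normal(n)
        yield x

D = rng.standard_normal((n, k))  # dictionary estimate
A = np.zeros((k, k))             # running sum of alpha alpha^T
B = np.zeros((n, k))             # running sum of x alpha^T

for x in markov_stream(T):
    # Coding step: ridge regression (a stand-in for sparse coding).
    alpha = np.linalg.solve(D.T @ D + lam * np.eye(k), D.T @ x)
    A += np.outer(alpha, alpha)
    B += np.outer(x, alpha)
    # Dictionary update: one pass of block coordinate descent over columns,
    # projecting each column back to the unit ball.
    for j in range(k):
        if A[j, j] > 1e-10:
            u = D[:, j] + (B[:, j] - D @ A[:, j]) / A[j, j]
            D[:, j] = u / max(1.0, np.linalg.norm(u))

print(D.shape)  # (20, 5)
```

The sufficient statistics A and B summarize the entire stream, so memory stays constant no matter how long the stream runs; the paper's result is that this kind of update still finds critical points even though the stream above is dependent.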

IAS Missing Data Workshop videos online

Bianca Dumitrascu, Boaz Nadler, and I hosted a virtual workshop in early September, supported by the Institute for Advanced Study. We had excellent speakers from across the spectrum of machine learning, statistics, and applications that consider missing data. You can find videos of all the seminars here.

Preference Learning with Salient Features

I am excited that Amanda Bower will have the opportunity to discuss our new work in preference learning, “Preference Modeling with Context-Dependent Salient Features,” at ICML next week. In this work, we propose a new model for preference learning that accounts for the fact that, when making pairwise comparisons, certain features may play an outsized role in the comparison, making the pairwise comparison result inconsistent with a general preference order. Her video and the schedule for her “poster session availability” can be found here. We look forward to hearing people’s questions and feedback!
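To see how a single salient feature can break transitivity, here is a toy example of my own (not the model from the paper) in which each comparison attends only to the one feature on which the two items differ the most:

```python
import numpy as np

# Toy illustration: each item has a feature vector, and a pairwise
# comparison looks only at the single most salient feature, i.e. the
# one where the two items differ by the largest amount.
items = {
    "a": np.array([3.0, 1.0, 2.0]),
    "b": np.array([2.0, 3.0, 1.0]),
    "c": np.array([1.0, 2.0, 3.0]),
}

def winner(i, j):
    diff = items[i] - items[j]
    s = np.argmax(np.abs(diff))   # context-dependent salient feature
    return i if diff[s] > 0 else j

print(winner("a", "b"))  # b  (feature 1 is salient, and b is better there)
print(winner("b", "c"))  # c  (feature 2 is salient)
print(winner("c", "a"))  # a  (feature 0 is salient)
```

The three comparisons form a cycle (b beats a, c beats b, a beats c), so no single global ranking can explain them; that is exactly the kind of intransitivity a context-dependent model has to capture.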

Online Tensor Completion and Tracking

Kyle Gilman and I have a preprint out describing a new algorithm for online tensor completion and tracking. We derive and demonstrate an algorithm that operates on streaming tensor data, such as hyperspectral video collected over time or chemo-sensing experiments in space and time. Kyle presented his work at the first fully virtual ICASSP, which you can view here. Anyone can register for free for this year’s virtual ICASSP to watch the videos, post questions, and join the discussion. Kyle’s code is also available here. We think this algorithm will have a major impact in speeding up low-rank tensor processing, especially with time-varying data, and we welcome questions and feedback.

Fulbright Award to study in Portugal

I am so excited to have been awarded a Fulbright to study optimization and machine learning with my colleague Mario Figueiredo in Lisbon, Portugal.

Two U-M stories about it can be found here and here. We will be working on new regularizers for matrix optimization problems.

Education Excellence Award

I have been awarded an Education Excellence award from the U-M College of Engineering. This award recognizes faculty with demonstrated, sustained excellence in curricular development, instruction, and guidance at both the undergraduate and graduate levels. Students are the reason why I love having an academic job, so I am truly honored to have received this award.

Promotion to Associate Professor

I am thrilled that yesterday the Regents of the University of Michigan promoted me to Associate Professor with tenure, effective September 1. It feels like yesterday that I arrived in my faculty office for the first time, and yet it also feels long ago. I appreciate so much the support of my colleagues and mentors. I also would not be here without my outstanding, curious, enthusiastic, and hard-working graduate students — they are incredible. Here is the letter that my dean sent to the provost to recommend me for promotion. I am looking forward to the next phase!

Congratulations Dejiao and David!

SPADA lab is so proud of Dr. Dejiao Zhang and Dr. David Hong. They both successfully defended their PhD dissertations this spring. Dejiao is going to Amazon Web Services next, and David is going to a postdoc at the University of Pennsylvania. We expect you both to go off and do great things! Congratulations!

NSF CAREER Award

I am honored to have received the NSF CAREER award for a proposal on optimization methods and theory for the joint formulation of dimension reduction and clustering. You can read about the award here in the UM press release and also here on the NSF website. Dimension reduction and clustering are arguably the two most critical problems in unsupervised machine learning; they are used universally for data exploration and understanding. Often dimension reduction is applied before clustering (or vice versa) to make the modeling algorithm tractable. In real data, however, it is more typical to see clusters that each have their own low-dimensional structure, so a joint formulation is of great interest. I look forward to working toward this end in the next stage of my career.
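One classical formulation in this joint spirit is K-subspaces clustering, which alternates between assigning each point to its nearest subspace and refitting each subspace by an SVD. Here is a toy sketch on synthetic data; it is illustrative only, and the proposal may pursue quite different formulations.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, k = 10, 2, 2     # ambient dimension, subspace dimension, clusters

# Synthetic data: two clusters, each lying in its own 2-D subspace of R^10.
bases_true = [np.linalg.qr(rng.standard_normal((n, d)))[0] for _ in range(k)]
X = np.vstack([(B @ rng.standard_normal((d, 100))).T for B in bases_true])

# K-subspaces: alternate a cluster assignment (nearest subspace by
# projection residual) with an SVD refit of each cluster's subspace.
bases = [np.linalg.qr(rng.standard_normal((n, d)))[0] for _ in range(k)]
for _ in range(20):
    res = np.stack([np.linalg.norm(X - X @ B @ B.T, axis=1) for B in bases])
    labels = res.argmin(axis=0)
    for j in range(k):
        pts = X[labels == j]
        if len(pts) >= d:
            # Top-d right singular vectors span the best-fit subspace.
            _, _, Vt = np.linalg.svd(pts, full_matrices=False)
            bases[j] = Vt[:d].T

print(np.bincount(labels, minlength=k))
```

Like k-means, this alternation can get stuck in local minima, which hints at why optimization theory for the joint problem is interesting in its own right.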