I am excited that Amanda Bower will have the opportunity to discuss our new work in preference learning, “Preference Modeling with Context-Dependent Salient Features”, at ICML next week. In this work, we propose a new model for preference learning that accounts for the fact that, in a pairwise comparison, certain features may play an outsized role, making the comparison result inconsistent with a general preference order. We look forward to hearing people’s questions and feedback! Update post-conference: Her presentation can be viewed here.
Online Tensor Completion and Tracking
Kyle Gilman and I have a preprint out describing a new algorithm for online tensor completion and tracking. We derive and demonstrate an algorithm that operates on streaming tensor data, such as hyperspectral video collected over time or chemo-sensing experiments in space and time. Kyle presented this work at the first fully virtual ICASSP, which you can view here. Anyone can register for free for this year’s virtual ICASSP to watch the videos, post questions, and join the discussion. Kyle’s code is also available here. We think this algorithm will have a major impact in speeding up low-rank tensor processing, especially for time-varying data, and we welcome questions and feedback.
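Kyle’s linked code is the real implementation. Purely as an illustration of the streaming idea, here is a minimal GROUSE-style sketch (a toy example of subspace tracking with missing data, not the paper’s algorithm) that updates a low-rank subspace estimate from partially observed vectors arriving one at a time — an unfolded tensor slice can be processed the same way. All dimensions, step sizes, and sampling rates below are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, T = 50, 3, 300        # ambient dimension, rank, number of streamed vectors
U_true = np.linalg.qr(rng.standard_normal((d, r)))[0]  # planted subspace
U = np.linalg.qr(rng.standard_normal((d, r)))[0]       # current estimate
step = 0.5

for t in range(T):
    x = U_true @ rng.standard_normal(r)     # new streaming vector (e.g. an unfolded tensor slice)
    mask = rng.random(d) < 0.5              # only about half the entries are observed
    w = np.linalg.lstsq(U[mask], x[mask], rcond=None)[0]  # fit coefficients on observed entries
    resid = np.zeros(d)
    resid[mask] = x[mask] - U[mask] @ w     # residual lives on the observed entries
    U = U + step * np.outer(resid, w) / (w @ w + 1e-12)   # rank-one gradient step
    U = np.linalg.qr(U)[0]                  # re-orthonormalize the basis

# fraction of the true subspace not captured by the final estimate
err = np.linalg.norm(U_true - U @ (U.T @ U_true)) / np.linalg.norm(U_true)
```

Each update costs only a least-squares solve and a rank-one correction, which is what makes this style of algorithm attractive for streaming, time-varying data.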
Education Excellence Award
I have been awarded an Education Excellence Award from the U-M College of Engineering. This award recognizes faculty who have demonstrated sustained excellence in curricular development, instruction, and guidance at both the undergraduate and graduate levels. Students are the reason why I love having an academic job, so I am truly honored to have received this award.
Promotion to Associate Professor
I am thrilled that yesterday the Regents of the University of Michigan promoted me to Associate Professor with tenure, effective September 1. It feels like yesterday that I arrived in my faculty office for the first time, and yet it also feels long ago. I am so grateful for the support of my colleagues and mentors. I also would not be here without my outstanding, curious, enthusiastic, and hard-working graduate students — they are incredible. Here is the letter that my dean sent to the provost to recommend me for promotion. I am looking forward to the next phase!
Congratulations Dejiao and David!
SPADA lab is so proud of Dr. Dejiao Zhang and Dr. David Hong. They both successfully defended their PhD dissertations this spring. Dejiao is going to Amazon Web Services next, and David is going to a postdoc at the University of Pennsylvania. We expect you both to go off and do great things! Congratulations!
NSF CAREER Award
I am honored to have received the NSF CAREER award for a proposal on optimization methods and theory for the joint formulation of dimension reduction and clustering. You can read about the award here in the UM press release and also here on the NSF website. Dimension reduction and clustering are arguably the two most critical problems in unsupervised machine learning; they are used universally for data exploration and understanding. Often dimension reduction is used before clustering (or vice versa) to lend tractability to the modeling algorithm. In real data, however, it is more typical to see clusters that each have their own low-dimensional structure, and so a joint formulation is of great interest. I look forward to working toward this end in the next stage of my career.
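To make the “joint formulation” concrete: a classic baseline in this space is K-subspaces, which alternates between assigning points to subspaces and refitting each subspace by PCA on its assigned points. The sketch below illustrates the problem class, not the methods proposed in the award; all sizes are invented, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
d, r, K, n = 20, 2, 3, 300
# synthetic data: K planted clusters, each lying on its own r-dimensional subspace
bases = [np.linalg.qr(rng.standard_normal((d, r)))[0] for _ in range(K)]
X = np.vstack([(B @ rng.standard_normal((r, n // K))).T for B in bases])

# distance from each point (row of X) to each candidate subspace
resid = lambda U: np.stack([np.linalg.norm(X - X @ B @ B.T, axis=1) for B in U])

# K-subspaces: alternate cluster assignment and per-cluster PCA refits
U = [np.linalg.qr(rng.standard_normal((d, r)))[0] for _ in range(K)]
obj0 = (resid(U).min(axis=0) ** 2).mean()    # initial objective: mean squared residual
for _ in range(20):
    labels = resid(U).argmin(axis=0)         # assign each point to its closest subspace
    for k in range(K):
        Xk = X[labels == k]
        if len(Xk) >= r:                     # refit subspace k by PCA on its points
            U[k] = np.linalg.svd(Xk.T, full_matrices=False)[0][:, :r]
obj = (resid(U).min(axis=0) ** 2).mean()     # alternating minimization never increases this
```

Both steps decrease the same squared-residual objective, so the iteration is monotone — but like K-means, it can get stuck in local minima, which is one reason better-understood joint formulations are of interest.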
Army Young Investigator
I am very excited that my project “Mathematics for Learning Nonlinear Generalizations of Subspace Models in High Dimensions” has won the Army Young Investigator award! Subspace models are widely used because of their simplicity and ease of analysis. While these linear models are powerful in many high-dimensional data contexts, they often miss important nonlinearities in real data. This project aims to extend recent advances in signal processing to the single-index model and the nonlinear variety model. Read the department’s announcement here.
Postdoc Opportunity at the University of Michigan
Optimally Weighted PCA for High-dimensional Heteroscedastic Data
Today I had the opportunity to speak about very recent results by my student David Hong (joint work also with Jeff Fessler) in analyzing asymptotic recovery guarantees for weighted PCA for high-dimensional heteroscedastic data. In the paper we recently posted online, we have asymptotic analysis (as both the number of samples and the ambient dimension grow to infinity, with their ratio converging to a fixed constant) of the recovery for weighted PCA components, amplitudes, and scores. Those recovery expressions allow us to find weights that give optimal recovery, and the optimal weights turn out to be a very simple expression involving only the noise variances and the PCA amplitudes. To learn more, watch my talk here, and let us know if you have any questions!
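The paper derives the exact optimal weights from the asymptotic analysis; for intuition only, here is a small sketch showing why weighting samples in PCA matters under heteroscedastic noise. It uses inverse-noise-variance weights — a standard heuristic, not the paper’s optimal formula — and all sizes and variances below are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(2)
d, n, theta = 100, 400, 3.0
u = np.linalg.qr(rng.standard_normal((d, 1)))[0]   # true component
v = np.where(np.arange(n) < n // 2, 0.1, 9.0)      # two groups with different noise variances
# rank-one signal plus heteroscedastic noise, one sample per column
X = theta * u * rng.standard_normal(n) + rng.standard_normal((d, n)) * np.sqrt(v)

def weighted_pca(X, w, k=1):
    # leading eigenvectors of the weighted sample covariance sum_i w_i x_i x_i^T
    C = (X * w) @ X.T / w.sum()
    return np.linalg.eigh(C)[1][:, -k:]           # eigh sorts eigenvalues ascending

# sin of the angle between the estimate and the true component u
err = lambda uh: np.sqrt(np.clip(1 - (u.T @ uh) ** 2, 0, None)).item()

err_unif = err(weighted_pca(X, np.ones(n)))  # uniform weights: ordinary PCA
err_inv = err(weighted_pca(X, 1.0 / v))      # downweight the noisy samples
```

Downweighting the noisy half of the samples typically recovers the component much better than treating all samples equally, which is the phenomenon the paper quantifies exactly.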