EECS 551 F24 Class Topics - Jeff Fessler
This list will be updated regularly (online) over the course of the semester.
--- below here from F23, except dates are updated

Class# topic

@01 8/27
*** Course policies (0: c00pol.pdf)
*** Intro to book (1: c01int.pdf)
*** Intro to Matrices (2: c02mat.pdf)
    motivation
    notation
    2.3 matrix shapes and classes (circulant, Toeplitz, ...)
        block matrices
        transpose, (Hermitian) symmetry
    2.4 vector multiplication: inner/dot product, outer product

@02 8/29
    matrix-vector multiplication
    matrix multiplication: four forms, properties, array memory
    matrix versions of 2D plotting in Julia (Read)
    invertibility (Read)
    2.5 orthogonal/orthonormal vectors
    DT discussion task: nearest angle classifier

@03 9/03
    Cauchy-Schwarz inequality
    orthogonal/unitary matrices
    unitary invariance of 2-norm
    2.6 determinant / properties
    2.7 eigenvalues via characteristic equation / properties

@04 9/05
    2.8 trace
    X demo: applying convolution to image denoising (moved to HW1)
*** Matrix decompositions (3: c03eig.pdf)
    3.2 eigendecomposition
        spectral theorem for Hermitian matrices
        geometric interpretation / eigshow1 demo
    3.3 SVD
        geometric interpretation / eigshow2 demo
        basic properties and non-uniqueness
HW1 due

@05 9/10
    3.4 matrix 2-norm as |Ax| maximization
        Application: maximize MIMO SNR (Read)
    3.5 relating SVD and eigendecomposition
        condition for equivalence of SVD and eigendecomposition
    3.6 positive (semi)definite matrices

@06 9/12
*** Subspaces and rank (4: c04sub.pdf)
    brief overview
    4.2 subspaces
        definition
        span
        linear independence
        basis
        dimension
        subspace sum
        subspace intersection
HW2 due

@07 9/17
        direct sum
        orthogonal complement
        linear transformations
        range
    4.3 rank
        row rank = col rank
        rank bounds
        unitary invariance of rank
        vs eigenvalues and singular values

@08 9/19
    4.4 nullspace / nullity
    4.5 four fundamental subspaces
        SVD "anatomy"
    4.6 orthogonal bases
    4.7 spot eig/SVD by hand for simple cases
    4.8 Application: nearest-subspace classification
        projection onto a set
HW3 due

@09 9/24
        projection onto a subspace
        practical implementation
    4.9 Optimization preview
        convex sets
        convex functions
        properties
        nearest point in a subspace
*** Linear equations and LS (5: c05ls.pdf)
    5.2 Linear equations
    5.3 Linear least-squares (LLS) estimation

@10 09/26
        normal equations
        backslash solution for full-rank case
        compact SVD solution
        uniqueness
    5.4 Moore-Penrose pseudo-inverse
        product properties
        relation to SVD
    DT discussion task: subspace-based classifier
HW4 due

@11 10/01
        LLS solution in full-rank case via pinv
    5.5 LLS for under-determined case
        orthogonality principle
        minimum-norm LS via pseudo-inverse
    5.6 Truncated SVD solution
        condition number
        low-rank interpretation
        noise effects (Read)
        Tikhonov regularization / ridge regression

@12 10/03
    5.7 Summary of LLS solutions using SVD
    5.8 Frames / tight frames / Parseval tight frames (skip in f23)
    5.9 Projection / idempotent matrix
        orthogonal projection matrix
        Projection onto convex sets (POCS)
        Projection onto a subspace
            SVD-based implementation
        Orthogonality principle for subspaces
        Projection onto a subspace's orthogonal complement
        Projectors and the four fundamental subspaces for a matrix
        Binary classifier using LS (read / in discussion)
    5.10 recursive LS (skip)
HW5 due
DT discussion task: LS classifier
todo: p296 Boyd, Tikhonov? extra features?
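The LLS topics above (normal equations, full-rank uniqueness) can be seen concretely on a tiny line-fitting problem. This is a minimal pure-Python sketch, not part of the course materials (the course demos use Julia, where this is just `A \ y`); the function name and data are illustrative only.

```python
# Minimal least-squares sketch: fit y ~ c0 + c1*x by solving the
# 2x2 normal equations A'A c = A'y by hand (no external packages).
# A is the n-by-2 matrix with columns [1, x]; full column rank
# (i.e., at least two distinct x values) makes the solution unique.

def lls_line_fit(xs, ys):
    """Return (c0, c1) minimizing sum_i (c0 + c1*x_i - y_i)^2."""
    n = len(xs)
    # Entries of A'A and A'y:
    sx = sum(xs); sxx = sum(x * x for x in xs)
    sy = sum(ys); sxy = sum(x * y for x, y in zip(xs, ys))
    # Solve [[n, sx], [sx, sxx]] [c0; c1] = [sy; sxy] by Cramer's rule:
    det = n * sxx - sx * sx  # > 0 iff A has full column rank
    c0 = (sy * sxx - sx * sxy) / det
    c1 = (n * sxy - sx * sy) / det
    return c0, c1

# Noiseless data on the line y = 1 + 2x, so LS recovers it exactly:
c0, c1 = lls_line_fit([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
print(c0, c1)  # 1.0 2.0
```

Forming A'A explicitly is fine at this size, but it squares the condition number; the SVD-based and backslash solutions covered in class are the numerically preferred routes.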
@13 10/08
*** Norms (6: c06norm.pdf)
    6.2 Vector norms
        properties
        unitary invariance
        inner products
            properties, Cauchy-Schwarz, angle
    6.3 Matrix norms
        defining properties, sub-multiplicative
        examples: Frobenius, l_p,q (Read)

@14 10/10
        induced matrix norms
            operator norms defined by singular values
            Julia implementation with opnorm
        Properties of matrix norms
            norm equivalence
            singular value inequalities [Read]
            unitary invariance
            spectral radius
            practical step size for GD
    6.4 Convergence of sequences of vectors & matrices (using norms) [Read]
    6.5 Generalized inverse of a matrix / Frobenius norm [Read]
    6.6 Procrustes analysis (SVD application using Frobenius norm)
HW6 due (with sample exam problem)

@xx 10/15 fall study break

@15 10/17
    SVD solution generalizations
*** Low-rank approximation (7: c07lr.pdf)
    7.2 Problem statement / solution / proof sketch
        Implementation: choosing rank via scree plot
No HW due (due to fall break and midterm)

@xx 10/21 Midterm on Ch1-6, HW1-6 (tentative coverage)

@16 10/22
        MRI coil compression application (Read)
        Uniqueness
        1D example cf LLS
        Generalization to other unitarily invariant norms
        Proof for spectral norm (Read)
        Bases for MxN matrices (Read)
        Summary / generalizations
        Stability of rank
    7.3 Sensor localization application (multidimensional scaling)
        derivation
        practical implementation
        examples (demo)
        extensions

@17 10/24
    7.4 Proximal operators:
        l1 example
        l0 example
    7.5 Alternative low-rank formulations: UI-norm regularized cost functions
        rank regularizer: singular value hard thresholding (SVHT)
        nuclear norm: singular value soft thresholding (SVST)
No HW due (due to fall break and midterm)

@18 10/29
        Frobenius norm squared as a rank regularizer
    7.6 Choosing the rank or regularization parameter
        NRMSE vs NRMSD behavior
        SURE
        OptShrink
        optshrink demo
    7.7 Relate LR to autoencoders (Read)
        relate to PCA (Read)
    7.8 Subspace learning via low-rank approximation / SVD

@19 10/31
        Subspace clustering (Read)
    7.9 Subspace tracking (Read)
*** Special matrices (8: c08spec.pdf)
    8.2 Companion matrices
        minimal polynomial
        Vandermonde matrices
        Kronecker sum to find common polynomial roots (Read/HW)
    8.3 Circulant matrices (cf DFT)
        G_N: companion matrix for z^N - 1
HW7 due

@20 11/05
        diagonalization of circulant matrices
    8.4 Toeplitz matrices (Read)
    8.5 power iteration (Read)
        Geršgorin disk theorem
        spectral radius bounds

@21 11/07
    8.6 nonnegative / positive / primitive matrices
        power test (Read)
        weighted directed graphs
        reachability
        adjacency matrix
        Strongly connected graphs
        Irreducible matrix
        matrix period / aperiodic matrix
    8.7 Perron-Frobenius theorems for nonnegative matrices
HW8 due

@22 11/12
        primitive matrices
        irreducible matrices
        stochastic matrices
    8.8 Markov chains (Optional reading in F23)
        transition matrix
        equilibrium distribution
        limiting distribution
        Markov chain simulation example notebook
        Google PageRank application
    8.9 Graph Laplacian / spectral clustering (optional reading)
    8.10 Summary
        Venn diagram [in discussion]
        clicker questions
*** Optimization (9: c09opt.pdf)
    9.4 GD for convex functions
        Lipschitz continuity
        convexity and Hessian

@23 11/14 [f23 via f21 recording, due to hospitalization]
        convergence theorem for GD
        Gradient projection (GP) method
    9.5 Machine learning for binary classification: logistic regression
        Hessian of training loss
        Lipschitz constant of training loss gradient (Read)
        example (Read)
    DT discussion task 6: logistic regression classifier
HW9 due

@24 11/19 [f23 via f21 recording, per doctor's orders]
    9.2 PGD iteration for LS
        Tool: Matrix square root
        Convergence rate analysis of PGD: first steps
        Tool: Matrix powers
        Classical GD: step size bounds
        Optimal step size for GD
        Practical step size for GD
        Ideal preconditioner for PGD

@25 11/21
        Tool: Positive (semi)definiteness properties
        General preconditioners for PGD
        Matrix majorizers
        Diagonal majorizer diag(|A'A|1)
        preconditioning demo (skip in f20, f23)
        Convergence rates
        Tool: commuting matrices (skip in f20, f23)
        Monotonicity of PGD (skip in f20, f23)
    9.3 Preconditioned steepest descent (PSD) (Read / [HW only])
*** Matrix completion (10: c10mc.pdf)
    Low-rank matrix completion (LRMC) motivation
    10.2 measurement model / sampling set / mask
        sampling conditions

@26 11/26
    10.3 noiseless case
        alternating projection method / Jupyter demo
    10.4 noisy case [not covered in f20]
        rank-constrained formulation
        rank-regularized formulation
        convex regularized formulation with nuclear norm
        Majorize-minimize (MM) iterations

@xx 11/28 Thanksgiving day
no HW due to break

@27 12/03
        MM methods (brief in f20, not in f23)
        iterative low-rank approximation (alt proj)
        ISTA
            iterative SVST (alternate fill / SVST)
            iterative SVHT
        FISTA for LRMC / SVST - Jupyter demo
        ADMM for LRMC / SVST - Jupyter demo
        ISTA derivation, extend GD to composite cost (skip)
HW10 due

@28 12/05
    Course review / practice questions
HW11 due (not in F23)

@xx 12/?? (Mon.) final ??
todo: 1:30-3:30PM is official RO time
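The soft-thresholding step that appears above in 7.4 (prox of the l1 norm) and again in the ISTA/SVST iterations for matrix completion is simple enough to sketch directly. This is an illustrative pure-Python fragment, not course code (the course notebooks use Julia); the function names are made up for this sketch.

```python
# Soft thresholding: the proximal operator of beta*|.| shrinks each
# value toward zero by beta and zeroes anything smaller than beta.
# In SVST for low-rank problems, the same shrinkage is applied to
# the singular values of a matrix instead of to vector entries.

def soft(x, beta):
    """Prox of beta*|.| at scalar x."""
    if x > beta:
        return x - beta
    if x < -beta:
        return x + beta
    return 0.0

def soft_vec(v, beta):
    """Elementwise soft threshold: prox of beta*||.||_1 at vector v."""
    return [soft(x, beta) for x in v]

print(soft_vec([3.0, -0.5, 1.5, -4.0], 1.0))  # [2.0, 0.0, 0.5, -3.0]
```

Note the contrast with the l0 / SVHT case: hard thresholding keeps large entries unchanged and zeroes small ones, while soft thresholding also shrinks the survivors, which is what makes the nuclear-norm-regularized LRMC cost convex.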