CMDA 3606: Mathematical Modeling: Tools and Techniques II

Virginia Tech · Spring 2019

CMDA 3606 is a course devoted to variations on the equation Ax=b. We will see how such equations arise from static equilibrium problems in circuits and structures, and we will also consider a variety of inverse problems, for example those associated with image deblurring. Course topics include modeling linear systems; the singular value decomposition; matrix approximation, PCA, and recommender systems; regularized least squares problems and imaging applications; solving large-scale linear systems via Krylov methods; and unconstrained nonlinear optimization.

CRN 12561      

Lectures: Monday/Wednesday, 2:30-3:45pm, McBryde 230
Instructor: Mark Embree, embree@vt.edu, 575 McBryde
Office Hours:   Monday 4-6pm; Wednesday 1-2pm; Thursday 1-2pm

CRN 12562      

Lectures: Tuesday/Thursday, 2:00-3:15pm, McBryde 240
Instructor: Christopher Beattie, beattie@vt.edu, 552 McBryde
Office Hours:   Tuesday 4-6pm, Wednesday 4-6pm

Piazza

The course Piazza page provides a forum for class discussions (both sections).

Lecture 29

Overview of CMDA 3606.


Lecture 28

Line search methods and step length conditions.
Stochastic gradient descent.
Reading: Chapter 3 of Nocedal and Wright, Numerical Optimization
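A minimal backtracking line search sketch in MATLAB (the objective, gradient, and parameter values below are illustrative assumptions, not course code):

  % Backtrack until the Armijo (sufficient decrease) condition holds.
  phi  = @(x) 0.5*x'*[2 0; 0 10]*x;      % sample quadratic objective
  grad = @(x) [2 0; 0 10]*x;             % its gradient
  x = [1; 1];
  g = grad(x); p = -g;                   % steepest-descent direction
  alpha = 1; c = 1e-4; rho = 0.5;        % typical Armijo parameters
  while phi(x + alpha*p) > phi(x) + c*alpha*(g'*p)
      alpha = rho*alpha;                 % shrink until sufficient decrease
  end
  x = x + alpha*p;                       % accept the step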

Lecture 27

Necessary and sufficient conditions for optimality.
Gradient descent: x_{k+1} = x_k - \alpha_k \nabla\phi(x_k).
Reading: Chapter 2 of Nocedal and Wright, Numerical Optimization
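A minimal MATLAB sketch of gradient descent with a fixed step length (the quadratic objective and step length are illustrative assumptions, not course code):

  % Gradient descent on phi(x) = 0.5 x'Ax - b'x, whose minimizer solves Ax = b.
  A = [3 1; 1 2]; b = [1; 1];
  gradphi = @(x) A*x - b;                % gradient of phi
  x = zeros(2,1); alpha = 0.2;           % fixed step length alpha_k = 0.2
  for k = 1:100
      x = x - alpha*gradphi(x);          % x_{k+1} = x_k - alpha_k grad phi(x_k)
  end
  norm(A*x - b)                          % near zero: x has converged to A\b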

Lecture 26

Preconditioning: Solve (M_1AM_2)(M_2^{-1}x) = M_1b instead of Ax=b.
Optimization intro: Taylor's Theorem in n dimensions.
Basic line search methods and Newton's method.
Reading: Chapter 2 of Nocedal and Wright, Numerical Optimization
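A MATLAB sketch of Newton's method for unconstrained minimization (the objective and starting point are illustrative assumptions, not course code):

  % Minimize phi(x) = (x1^2 - x2)^2 + (x1 - 1)^2, whose minimizer is (1,1).
  grad = @(x) [4*x(1)*(x(1)^2-x(2)) + 2*(x(1)-1); -2*(x(1)^2-x(2))];
  hess = @(x) [12*x(1)^2 - 4*x(2) + 2, -4*x(1); -4*x(1), 2];
  x = [2; 0];
  for k = 1:10
      x = x - hess(x) \ grad(x);         % Newton step: solve (Hessian) p = -grad
  end
  x                                      % approximately [1; 1]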

Lecture 25

More details about GMRES, and basic convergence theory.
Krylov methods for regularization.
Preconditioning: Solve (M_1AM_2)(M_2^{-1}x) = M_1b instead of Ax=b.
Reading: course notes, Chapter 8: Iterations for Large Linear Systems (draft)
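As a concrete illustration (not course code), MATLAB's built-in gmres accepts preconditioner factors M1, M2 with M1*M2 approximating A, and works with the preconditioned system internally; the sparse test problem below is an assumption:

  n = 500;                               % illustrative sparse test problem
  A = spdiags([-ones(n,1) 4*ones(n,1) -ones(n,1)], -1:1, n, n);
  b = ones(n,1);
  [M1, M2] = ilu(A);                     % incomplete LU: M1*M2 approximates A
  x = gmres(A, b, [], 1e-10, 100, M1, M2);   % preconditioned GMRES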

Lecture 24

Introduction to the GMRES algorithm.
Reading: course notes, Chapter 8: Iterations for Large Linear Systems (draft)

Lecture 23

Introduction to the solution of large Ax=b problems.
Richardson's method: x_{k+1} = x_k + cr_k, where r_k = b - Ax_k is the residual.
Lewis Fry Richardson (Wikipedia)
Reading: course notes, Chapter 8: Iterations for Large Linear Systems (draft)
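A minimal MATLAB sketch of Richardson's method (the matrix and the parameter c are illustrative assumptions, not course code):

  A = diag([1 2 3 4]); b = ones(4,1);    % toy SPD system
  x = zeros(4,1);
  c = 2/(1 + 4);                         % c = 2/(lambda_min+lambda_max) is optimal for SPD A
  for k = 1:200
      r = b - A*x;                       % residual r_k
      x = x + c*r;                       % x_{k+1} = x_k + c r_k
  end
  norm(b - A*x)                          % small: the iteration has converged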

Lecture 22

Deblurring in two dimensions.
Reading: course notes, Chapter 7: Inverse Problems and Regularization.
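A small separable-blur sketch in MATLAB (an illustration, not the course's blur2d.m; the blur, test image, and parameter are assumptions). For a separable blur B = AXA^T, i.e. vec(B) = kron(A,A)vec(X), the singular values of the blur operator are the products s_is_j, which Tikhonov regularization filters:

  n = 100;
  c = exp(-((0:n-1).^2)/50);             % Gaussian point spread function
  A = toeplitz(c)/sum(c);                % symmetric 1D blur matrix
  X = zeros(n); X(40:60, 30:70) = 1;     % a simple test "image"
  B = A*X*A' + 1e-3*randn(n);            % blurred image plus noise
  [U, S, V] = svd(A);
  Sig = diag(S)*diag(S)';                % singular values s_i*s_j of kron(A,A)
  lambda = 1e-4;                         % Tikhonov parameter
  Xlam = V*((Sig./(Sig.^2 + lambda)).*(U'*B*U))*V';   % regularized reconstruction
  imagesc(Xlam), colormap gray           % compare with imagesc(B)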

Lecture 21

PCA and clustering.
Review for Midterm 2 (Thursday night).
Reading: course notes, Chapter 7: Inverse Problems and Regularization.

Lecture 20

Principal Component Analysis via the SVD
Reading: course notes, Chapter 6: The Singular Value Decomposition.
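A PCA-via-SVD sketch in MATLAB (the data matrix is an illustrative assumption, not course code):

  X = randn(200, 5) * randn(5, 5);       % 200 observations of 5 correlated variables
  Xc = X - mean(X, 1);                   % center each variable (column)
  [U, S, V] = svd(Xc, 'econ');
  pcs    = V;                            % principal directions (loadings)
  scores = U*S;                          % data coordinates in the principal axes
  varexp = diag(S).^2/sum(diag(S).^2)    % fraction of variance per component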

Lecture 19

Using the L-curve to choose a regularization parameter.
Statistics of least squares and regularization.
Reading: course notes, Chapter 7: Inverse Problems and Regularization.
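An L-curve sketch in MATLAB (the ill-posed test problem and parameter sweep are illustrative assumptions, not course code):

  A = hilb(20); b = A*ones(20,1) + 1e-6*randn(20,1);   % ill-conditioned test problem
  lambdas = logspace(-12, 0, 50);
  rnorm = zeros(size(lambdas)); xnorm = zeros(size(lambdas));
  for j = 1:numel(lambdas)
      xl = [A; sqrt(lambdas(j))*eye(20)] \ [b; zeros(20,1)];  % Tikhonov solution
      rnorm(j) = norm(A*xl - b); xnorm(j) = norm(xl);
  end
  loglog(rnorm, xnorm, '-o')             % choose lambda near the corner of the "L"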

Lecture 18

Regularization via truncated SVD and Tikhonov regularization.
You can solve the Tikhonov regularization problem as a standard least squares problem by augmenting the matrix A.
Reading: course notes, Chapter 7: Inverse Problems and Regularization.
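The augmented form in MATLAB (illustrative data): minimizing \|Ax-b\|^2 + \lambda\|x\|^2 is exactly the standard least squares problem below.

  A = randn(30, 10); b = randn(30, 1); lambda = 1e-2;
  Aaug = [A; sqrt(lambda)*eye(10)];      % stack sqrt(lambda) I beneath A
  baug = [b; zeros(10, 1)];
  xlam = Aaug \ baug;                    % equals (A'*A + lambda*eye(10)) \ (A'*b)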

Lecture 17

When A has small singular values, the pseudoinverse solution x=A^+b can have large norm \|x\|.
When rank(A) < min(m,n), small changes to A can change the pseudoinverse significantly.
Reading: course notes, Chapter 7: Inverse Problems and Regularization.

Lecture 16

Solving all varieties of Ax=b problems using the SVD.
The matrix A = \sum_{j=1}^r \sigma_j u_j v_j^T has pseudoinverse A^+ = \sum_{j=1}^r {1\over \sigma_j} v_j u_j^T.
Reading: course notes, Chapter 7: Inverse Problems and Regularization.
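A MATLAB sketch of the dyadic pseudoinverse formula (the rank-deficient matrix is an illustrative assumption):

  A = randn(5, 3) * randn(3, 4);         % a 5x4 matrix of rank 3
  [U, S, V] = svd(A);
  s = diag(S); r = sum(s > 1e-10*s(1));  % numerical rank
  Aplus = V(:,1:r)*diag(1./s(1:r))*U(:,1:r)';  % sum_j (1/sigma_j) v_j u_j'
  norm(Aplus - pinv(A))                  % agrees with MATLAB's pinv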

Lecture 15

Low-rank matrix approximation
Given the dyadic form of the SVD A = \sum_{j=1}^r \sigma_j u_j v_j^T, the best rank-k approximation to A is A_k = \sum_{j=1}^k \sigma_j u_j v_j^T.
Illustration: image compression with low-rank matrices.
Reading: course notes, Chapter 6: The Singular Value Decomposition.
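A rank-k compression sketch in MATLAB (using the clown demo image that ships with MATLAB; the choice k = 20 is an illustrative assumption):

  load clown                             % provides the image matrix X and colormap map
  [U, S, V] = svd(X);
  k = 20;
  Xk = U(:,1:k)*S(1:k,1:k)*V(:,1:k)';    % best rank-k approximation A_k
  imagesc(Xk), colormap(map)             % visually close to imagesc(X)
  norm(X - Xk) - S(k+1, k+1)             % ~0: the 2-norm error equals sigma_{k+1}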

Lecture 14

The SVD three ways:
The compact SVD, the full SVD, and the dyadic form of the SVD.
Reading: course notes, Chapter 6: The Singular Value Decomposition.
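All three forms in MATLAB (an arbitrary tall matrix, for illustration):

  A = randn(6, 3);
  [U, S, V] = svd(A);                    % full SVD: U is 6x6, S is 6x3, V is 3x3
  [Uc, Sc, Vc] = svd(A, 'econ');         % compact SVD: Uc is 6x3, Sc is 3x3
  A2 = zeros(6, 3);
  for j = 1:3                            % dyadic form: A = sum_j sigma_j u_j v_j'
      A2 = A2 + Sc(j,j)*Uc(:,j)*Vc(:,j)';
  end
  norm(A - A2)                           % ~ machine precision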

Lecture 13

Construction of the Singular Value Decomposition (SVD).
Reading: course notes, Chapter 6: The Singular Value Decomposition.

Lecture 12

Eigenvalues and eigenvectors of symmetric matrices.
The eigenvalues of a symmetric matrix are real.
The eigenvectors of a symmetric matrix associated with distinct eigenvalues are orthogonal.
Exam 1 returned and discussed.
Reading: course notes, Chapter 6: The Singular Value Decomposition.
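A quick MATLAB check of both facts (an illustrative random symmetric matrix):

  B = randn(5); A = B + B';              % a symmetric matrix
  [V, D] = eig(A);
  isreal(diag(D))                        % true: the eigenvalues are real
  norm(V'*V - eye(5))                    % ~0: the eigenvectors are orthonormal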

Lecture 11

Review of the first third of the course in advance of Exam 1.
Exam 1 will be held on Thursday night (28 Feb), 7-10pm.
(MW Section: McBryde 113; TR Section: McBryde 129).
Take the exam during whichever contiguous 100-minute window between 7 and 10pm is most convenient for you.

Lecture 10

Solve Least Squares problems using QR factorization
The Normal Equations A^TAx = A^Tb always have a solution.
If A is full rank, the pseudoinverse A^+ = (A^TA)^{-1}A^T gives the LS solution x = A^+b,
and the matrix AA^+ is a projector onto {\cal R}(A).
Given A=QR, one can solve the least squares problem via the (small!) system Rx = Q^Tb.
Reading: course notes, Chapter 5: Orthogonality.
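A MATLAB sketch of the QR route (illustrative data, not course code):

  A = randn(8, 3); b = randn(8, 1);
  [Q, R] = qr(A, 0);                     % "economy" QR: Q is 8x3, R is 3x3
  x = R \ (Q'*b);                        % the small triangular system Rx = Q'b
  norm(x - A\b)                          % matches backslash's LS solution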

Lecture 9

Gram-Schmidt orthogonalization and the QR decomposition
The Gram-Schmidt process can be summarized as A = QR, where Q^TQ=I and R is upper triangular.
Reading: course notes, Chapter 5: Orthogonality.
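A modified Gram-Schmidt sketch in MATLAB (a standard textbook formulation shown for illustration; not the course's code):

  A = randn(6, 4);
  [m, n] = size(A);
  Q = zeros(m, n); R = zeros(n, n);
  for j = 1:n
      v = A(:, j);
      for i = 1:j-1
          R(i,j) = Q(:,i)'*v;            % component of v along q_i
          v = v - R(i,j)*Q(:,i);         % remove it immediately (modified GS)
      end
      R(j,j) = norm(v);
      Q(:,j) = v/R(j,j);
  end
  norm(A - Q*R), norm(Q'*Q - eye(n))     % both ~ machine precision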

Lecture 8

Orthogonality and projectors
A square matrix P is a projector provided P^2=P.
A projector that is also symmetric (P^T=P) is called an orthogonal projector.
If v is a nonzero vector, then P = vv^T/v^Tv is an orthogonal projector onto {\rm span}\{v\}.
Reading: course notes, Chapter 5: Orthogonality.
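A quick MATLAB check (an illustrative vector):

  v = [1; 2; 2];
  P = v*v'/(v'*v);                       % orthogonal projector onto span{v}
  norm(P*P - P)                          % ~0: P^2 = P, so P is a projector
  norm(P' - P)                           % ~0: P is symmetric, hence orthogonal
  P*[3; 0; 0]                            % the component of [3;0;0] along v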

Lecture 7

Introduction to least squares problems and their solution
The solution to the least squares problem satisfies the Normal Equations A^TAx=A^Tb.
Note that {\cal N}(A^TA) = {\cal N}(A): hence the Normal Equations have a unique solution if {\cal N}(A)=\{0\}.
Reading: course notes, Chapter 4: Fundamentals of Subspaces.
Reading: course notes, Chapter 5: Orthogonality.
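A MATLAB sketch of the Normal Equations (illustrative data; forming A^TA squares the condition number, so the QR approach of Lecture 10 is preferred in practice):

  A = randn(10, 4); b = randn(10, 1);
  x = (A'*A) \ (A'*b);                   % solve A'Ax = A'b
  norm(A'*(b - A*x))                     % ~0: the residual is orthogonal to R(A)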

Lecture 6

Fundamental Theorem of Linear Algebra
{\cal R}(A)\oplus {\cal N}(A^T) = \mathbb{R}^m and {\cal R}(A^T)\oplus {\cal N}(A) = \mathbb{R}^n.
Reading: course notes, Chapter 4: Fundamentals of Subspaces.
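A numerical check in MATLAB (an illustrative rank-deficient matrix):

  A = randn(5, 3) * randn(3, 4);         % a 5x4 matrix of rank 3
  Ur = orth(A); Wn = null(A');           % orthonormal bases for R(A) and N(A^T)
  norm(Ur'*Wn)                           % ~0: the two subspaces are orthogonal
  size(Ur, 2) + size(Wn, 2)              % = 5 = m: together they span R^m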

Lecture 5

Fundamentals of subspaces
Definition of subspace; the null space {\cal N}(A) and range {\cal R}(A) are subspaces.
MATLAB codes: plotline2.m, plotplane2.m, plotline3.m, plotplane3.m.
Reading: course notes, Chapter 4: Fundamentals of Subspaces.

Lecture 4

Modeling two-dimensional structures with oblique supports.
Reading: course notes, Chapter 3: Simple Structures at Equilibrium.

Lecture 3

Modeling two-dimensional structures.
Using linear approximations of the elongation, we again obtain A^TKAx=f.
Reading: course notes, Chapter 3: Simple Structures at Equilibrium.

Lecture 2

Introduction to modeling simple (static) structures.
Different physics, same equation as the circuit model: A^TKAx = f.
Reading: course notes, Chapter 3: Simple Structures at Equilibrium.

Lecture 1

Review of the syllabus and discussion of notes/book options.
Be sure to note the evening midterm times: 28 February and 11 April from 7-10pm. (Exams last 100 minutes each.)
Reading: course notes, Chapter 1: Introduction.
Reading: course notes, Chapter 2: Linear Systems from Resistor Networks (draft).
Introduction to circuit modeling: A^TKAx = A^T K v
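A three-node resistor loop in MATLAB (a hypothetical example, not a course code; A is the edge-node incidence matrix, K the diagonal matrix of conductances, v the battery voltages):

  A = [1 -1  0;                          % edge 1: node 1 to node 2
       0  1 -1;                          % edge 2: node 2 to node 3
       1  0 -1];                         % edge 3: node 1 to node 3
  K = diag([1 2 3]);                     % edge conductances
  v = [1; 0; 0];                         % a battery on edge 1
  Ag = A(:, 1:2);                        % ground node 3 (A'*K*A alone is singular)
  x = (Ag'*K*Ag) \ (Ag'*K*v)             % potentials at nodes 1 and 2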

Problem Set 11

Posted 24 April 2019. Due 2 May 2019.
hw11.pdf: assignment

Problem Set 10

Posted 17 April 2019. Due 25 April 2019.
hw10.pdf: assignment

Problem Set 9

Posted 12 April 2019. Due 18 April 2019.
hw9.pdf: assignment
MATLAB codes: blur2d.m, hokiebird.mat, mystery_plate.mat.

Problem Set 8

Posted 27 March 2019. Due 4 April 2019.
hw8.pdf: assignment
MATLAB codes: coke_upc.m, mystery_upc.p.

Problem Set 7

Posted 20 March 2019. Due 28 March 2019.
hw7.pdf: assignment

Problem Set 6

Posted 13 March 2019. Due 21 March 2019.
hw6.pdf: assignment
MATLAB codes: hello.m, cow.mat, planck.mat.

Problem Set 5

Posted 1 March 2019. Due 7 March 2019.
hw5.pdf: assignment
MATLAB code: hw5.mat.

Problem Set 4

Posted 13 February 2019. Due 21 February 2019.
hw4.pdf: assignment
MATLAB codes: bridge.p, bridge_A.mat.

Problem Set 3

Posted 7 February 2019. Due 14 February 2019.
hw3.pdf: assignment
MATLAB codes: plotline2.m, plotplane2.m, plotline3.m, plotplane3.m.

Problem Set 2

Posted 30 January 2019. Due 7 February 2019.
hw2.pdf: assignment

Problem Set 1

Posted 23 January 2019. Due 31 January 2019.
hw1.pdf: assignment


Full Syllabus

Download the electronic version of the syllabus and tentative schedule.

Grade Policy

Final course grades will be allocated as follows:
     40%: problem sets
     40%: midterm exams (evenings of 28 February and 11 April, 7-10pm)
     20%: final exam (13 May: 2:05-4:05pm (MW section); 4:25-6:25pm (TR section))

Honor Code

Virginia Tech's Honor Code applies to all work in this course. Students must uphold the highest ethical standards, abiding by our Honor Code: "As a Hokie, I will conduct myself with honor and integrity at all times. I will not lie, cheat, or steal, nor will I accept the actions of those who do." From the Office for Undergraduate Academic Integrity: "Students enrolled in this course are responsible for abiding by the Honor Code. A student who has doubts about how the Honor Code applies to any assignment is responsible for obtaining specific guidance from the course instructor before submitting the assignment for evaluation. Ignorance of the rules does not exclude any member of the University community from the requirements and expectations of the Honor Code. For additional information about the Honor Code, please visit: www.honorsystem.vt.edu."

Textbooks

We will not closely follow any one textbook; the instructors will provide notes for much of the course. The following books, all freely available online to Virginia Tech students, provide helpful background information.

Accommodations

Any student with special needs or circumstances requiring accommodation in this course is encouraged to contact the instructor during the first week of class, as well as Virginia Tech's SSD Office. We will ensure that these needs are appropriately addressed.