Math 510: Numerical Methods for Linear Algebra

Catalog Information

Title

Numerical Methods for Linear Algebra.

Credit Hours

3

Offered

F

Prerequisite

Math 314, CS 142; or equivalent.

Description

Numerical matrix algebra, orthogonalization and least squares methods, unsymmetric and symmetric eigenvalue problems, iterative methods, advanced solvers for partial differential equations.

Desired Learning Outcomes

This course is designed to prepare students to solve linear algebra problems arising from many applications, such as mathematical models of physical or engineering processes. Students are introduced to modern concepts and methodologies in numerical linear algebra, with particular emphasis on methods that can be used to solve very large-scale problems. In-depth discussion of theoretical aspects such as stability and convergence will be used to enhance students' understanding of the numerical methods. Students will also be required to do some programming and computation to gain experience in implementing the methods and observing their numerical performance.

The course addresses the University goal of developing the skills of sound thinking, effective communication, and quantitative reasoning. The course also allows students, especially undergraduate students, to develop some depth, and consequently competence, in an important area of applied mathematics.

This course requires knowledge from higher-level courses in mathematics. It also serves as an introductory graduate-level course that prepares students to apply the methods they learn in their research projects.

Prerequisites

Mastery of the material in an undergraduate course in linear algebra. Knowledge of MATLAB.

Minimal learning outcomes

(Must know)

Know properties of unitary matrices

Know general and practical definitions and properties of norms

Know definition and properties of SVD

Know definitions of and properties of projectors and orthogonal projectors

Be able to state and apply classical Gram-Schmidt and modified Gram-Schmidt algorithms
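
For illustration, a minimal NumPy sketch (my own, not taken from the course materials; function names and the random test matrix are arbitrary) contrasting the two algorithms. Classical Gram-Schmidt projects each column against the original columns, while the modified variant updates the remaining columns as it goes.

 import numpy as np
 def classical_gram_schmidt(A):
     m, n = A.shape
     Q = np.zeros((m, n))
     R = np.zeros((n, n))
     for j in range(n):
         v = A[:, j].astype(float)
         for i in range(j):
             R[i, j] = Q[:, i] @ A[:, j]     # coefficients computed from the *original* column
             v -= R[i, j] * Q[:, i]
         R[j, j] = np.linalg.norm(v)
         Q[:, j] = v / R[j, j]
     return Q, R
 def modified_gram_schmidt(A):
     m, n = A.shape
     Q = np.zeros((m, n))
     R = np.zeros((n, n))
     V = A.astype(float)
     for j in range(n):
         R[j, j] = np.linalg.norm(V[:, j])
         Q[:, j] = V[:, j] / R[j, j]
         for k in range(j + 1, n):
             R[j, k] = Q[:, j] @ V[:, k]     # coefficients computed from the *updated* columns
             V[:, k] -= R[j, k] * Q[:, j]
     return Q, R
 # example usage: both reproduce A; for ill-conditioned A the loss of orthogonality in classical GS becomes visible
 A = np.random.rand(6, 4)
 for qr in (classical_gram_schmidt, modified_gram_schmidt):
     Q, R = qr(A)
     print(np.linalg.norm(Q.T @ Q - np.eye(4)), np.linalg.norm(Q @ R - A))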

Know definition and properties of a Householder reflector

Know how to construct Householder QR factorization
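
A hedged sketch of Householder triangularization (again a simplification of my own, not the course's reference code): each reflector zeroes a column below the diagonal, and the reflectors are accumulated into Q only so that the factorization can be checked.

 import numpy as np
 def householder_qr(A):
     A = A.astype(float).copy()
     m, n = A.shape
     Q = np.eye(m)
     for k in range(n):
         x = A[k:, k]
         v = x.copy()
         v[0] += np.copysign(np.linalg.norm(x), x[0])   # sign chosen to avoid cancellation
         v /= np.linalg.norm(v)
         A[k:, k:] -= 2.0 * np.outer(v, v @ A[k:, k:])  # apply the reflector to the trailing block
         Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)    # accumulate Q = H_1 H_2 ... H_n
     return Q, np.triu(A)
 # example usage
 A = np.random.rand(5, 3)
 Q, R = householder_qr(A)
 print(np.linalg.norm(Q @ R - A))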

Know how least squares problems arise from a polynomial fitting problem

Know how to solve least squares problems using

 (1) normal equations/pseudoinverse, 
 (2) QR factorization and 
 (3) SVD
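
The three approaches can be compared on a small random problem; the sketch below is illustrative only (the variable names and test data are mine), using NumPy's built-in QR and SVD.

 import numpy as np
 rng = np.random.default_rng(0)
 A = rng.standard_normal((20, 4))
 b = rng.standard_normal(20)
 # (1) normal equations / pseudoinverse: solve (A^T A) x = A^T b
 x_normal = np.linalg.solve(A.T @ A, A.T @ b)
 # (2) reduced QR factorization: A = QR, then solve R x = Q^T b
 Q, R = np.linalg.qr(A)
 x_qr = np.linalg.solve(R, Q.T @ b)
 # (3) SVD: A = U diag(s) V^T, x = V diag(1/s) U^T b
 U, s, Vt = np.linalg.svd(A, full_matrices=False)
 x_svd = Vt.T @ ((U.T @ b) / s)
 print(x_normal, x_qr, x_svd, sep="\n")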

Be able to define the condition of a problem and the related condition number

Know how to calculate the condition number of a matrix
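
A small illustrative check (not from the course materials) that the 2-norm condition number computed from norms agrees with the ratio of extreme singular values:

 import numpy as np
 A = np.array([[1.0, 2.0], [3.0, 4.0]])
 kappa_norms = np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2)   # ||A|| ||A^{-1}||
 s = np.linalg.svd(A, compute_uv=False)
 kappa_svd = s[0] / s[-1]                                                    # sigma_max / sigma_min
 print(kappa_norms, kappa_svd, np.linalg.cond(A, 2))                         # all three agree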

Understand the concepts of well-conditioned and ill-conditioned problems

Know how to derive the conditioning bounds

Know the precise definition of stability and backward stability

Be able to apply the fundamental axiom of floating point arithmetic to determine stability

Know the difference between stability and conditioning

Know the four condition numbers of a least squares problem

Know how to construct LU and PLU factorizations
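
A compact sketch of LU factorization with partial pivoting, PA = LU (my own simplification; production codes store the factors more carefully):

 import numpy as np
 def plu(A):
     A = A.astype(float).copy()
     n = A.shape[0]
     piv = np.arange(n)
     for k in range(n - 1):
         p = k + np.argmax(np.abs(A[k:, k]))    # partial pivoting: largest entry in column k
         if p != k:
             A[[k, p], :] = A[[p, k], :]
             piv[[k, p]] = piv[[p, k]]
         A[k+1:, k] /= A[k, k]                  # multipliers stored below the diagonal
         A[k+1:, k+1:] -= np.outer(A[k+1:, k], A[k, k+1:])
     L = np.tril(A, -1) + np.eye(n)
     U = np.triu(A)
     P = np.eye(n)[piv]
     return P, L, U
 # example usage: P @ A should equal L @ U
 A = np.random.rand(4, 4)
 P, L, U = plu(A)
 print(np.linalg.norm(P @ A - L @ U))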

Know how PLU is related to Gaussian elimination

Understand Cholesky decomposition
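
A short sketch of the Cholesky factorization A = L L^T of a symmetric positive definite matrix (illustrative only; the SPD test matrix is constructed artificially):

 import numpy as np
 def cholesky(A):
     n = A.shape[0]
     L = np.zeros((n, n))
     for j in range(n):
         L[j, j] = np.sqrt(A[j, j] - L[j, :j] @ L[j, :j])
         L[j+1:, j] = (A[j+1:, j] - L[j+1:, :j] @ L[j, :j]) / L[j, j]
     return L
 # example usage: compare with NumPy's built-in factorization
 B = np.random.rand(4, 4)
 A = B @ B.T + 4 * np.eye(4)
 L = cholesky(A)
 print(np.linalg.norm(L @ L.T - A), np.linalg.norm(L - np.linalg.cholesky(A)))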

Know properties of eigenvalues and eigenvectors under similarity transformation and shift

Know various matrix decompositions related to eigenvalue calculation:

 (1) spectral decomposition
 (2) unitary diagonalization
 (3) Schur decomposition

Understand why and how matrices can be reduced to Hessenberg form

Know the various forms of the power method and Rayleigh quotient iteration, along with their properties and convergence rates
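
A hedged sketch (simplified, with no convergence tests) of power iteration and Rayleigh quotient iteration for a real symmetric matrix; the iteration counts are arbitrary:

 import numpy as np
 def power_iteration(A, v, steps=50):
     for _ in range(steps):
         v = A @ v
         v /= np.linalg.norm(v)
     return v @ A @ v, v                    # Rayleigh quotient estimates the dominant eigenvalue
 def rayleigh_quotient_iteration(A, v, steps=6):
     v = v / np.linalg.norm(v)
     mu = v @ A @ v
     for _ in range(steps):
         v = np.linalg.solve(A - mu * np.eye(A.shape[0]), v)   # inverse iteration step with shift mu
         v /= np.linalg.norm(v)
         mu = v @ A @ v                     # converges cubically for symmetric A
     return mu, v
 # example usage
 B = np.random.rand(5, 5)
 A = (B + B.T) / 2
 v0 = np.random.rand(5)
 print(power_iteration(A, v0)[0], rayleigh_quotient_iteration(A, v0)[0])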

Know the QR algorithm with shifts
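
A bare-bones sketch of the shifted QR iteration on a real symmetric matrix, using the (n,n) entry as the shift; real implementations first reduce to tridiagonal form and deflate converged eigenvalues, both of which this sketch omits:

 import numpy as np
 def shifted_qr(A, steps=100):
     A = A.astype(float).copy()
     n = A.shape[0]
     for _ in range(steps):
         mu = A[-1, -1]                     # Rayleigh quotient shift (a Wilkinson shift is more robust)
         Q, R = np.linalg.qr(A - mu * np.eye(n))
         A = R @ Q + mu * np.eye(n)         # a similarity transform, so eigenvalues are preserved
     return np.diag(A)                      # approximate eigenvalues on the diagonal
 # example usage: compare with LAPACK's symmetric eigensolver
 B = np.random.rand(5, 5)
 A = (B + B.T) / 2
 print(np.sort(shifted_qr(A)), np.sort(np.linalg.eigvalsh(A)), sep="\n")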

Understand that simultaneous iteration and the QR algorithm are mathematically equivalent

Understand the Arnoldi algorithm and its properties
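
A minimal sketch of n steps of the Arnoldi iteration (the indexing conventions are mine; breakdown is not handled), producing orthonormal Krylov vectors Q and an upper Hessenberg matrix H with A Q_n = Q_{n+1} H:

 import numpy as np
 def arnoldi(A, b, n):
     m = A.shape[0]
     Q = np.zeros((m, n + 1))
     H = np.zeros((n + 1, n))
     Q[:, 0] = b / np.linalg.norm(b)
     for k in range(n):
         v = A @ Q[:, k]
         for j in range(k + 1):             # orthogonalize against all previous Krylov vectors
             H[j, k] = Q[:, j] @ v
             v -= H[j, k] * Q[:, j]
         H[k + 1, k] = np.linalg.norm(v)
         Q[:, k + 1] = v / H[k + 1, k]
     return Q, H
 # example usage: verify the Arnoldi relation
 A = np.random.rand(8, 8)
 b = np.random.rand(8)
 Q, H = arnoldi(A, b, 4)
 print(np.linalg.norm(A @ Q[:, :4] - Q @ H))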

Know the polynomial approximation problem associated with Arnoldi method

Be able to state the GMRES algorithm
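
A minimal GMRES sketch (no restarts, no Givens rotations, no breakdown handling; this is my own simplification of the textbook statement): run Arnoldi, then minimize ||beta e_1 - H y|| over y and set x = Q_n y.

 import numpy as np
 def gmres_minimal(A, b, n):
     m = A.shape[0]
     Q = np.zeros((m, n + 1))
     H = np.zeros((n + 1, n))
     beta = np.linalg.norm(b)
     Q[:, 0] = b / beta                     # x0 = 0, so the initial residual is b
     for k in range(n):
         v = A @ Q[:, k]
         for j in range(k + 1):
             H[j, k] = Q[:, j] @ v
             v -= H[j, k] * Q[:, j]
         H[k + 1, k] = np.linalg.norm(v)
         Q[:, k + 1] = v / H[k + 1, k]
     e1 = np.zeros(n + 1)
     e1[0] = beta
     y, *_ = np.linalg.lstsq(H, e1, rcond=None)   # small (n+1)-by-n least squares problem
     return Q[:, :n] @ y
 # example usage on a diagonally dominant test matrix
 A = np.random.rand(30, 30) + 30 * np.eye(30)
 b = np.random.rand(30)
 x = gmres_minimal(A, b, 20)
 print(np.linalg.norm(A @ x - b))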

Be able to state the three-term recurrence of the Lanczos iteration for real symmetric matrices
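
For a real symmetric matrix the Arnoldi recurrence collapses to three terms; a short sketch follows (no reorthogonalization, so orthogonality degrades in finite precision; the test problem is arbitrary):

 import numpy as np
 def lanczos(A, b, n):
     alpha = np.zeros(n)
     beta = np.zeros(n - 1)
     q_prev = np.zeros_like(b, dtype=float)
     q = b / np.linalg.norm(b)
     for k in range(n):
         v = A @ q
         alpha[k] = q @ v
         v = v - alpha[k] * q
         if k > 0:
             v = v - beta[k - 1] * q_prev   # only the two most recent vectors are needed
         if k < n - 1:
             beta[k] = np.linalg.norm(v)
             q_prev, q = q, v / beta[k]
     return alpha, beta
 # example usage: Ritz values of the tridiagonal matrix approximate extreme eigenvalues of A
 B = np.random.rand(12, 12)
 A = (B + B.T) / 2
 alpha, beta = lanczos(A, np.random.rand(12), 6)
 T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
 print(np.sort(np.linalg.eigvalsh(T)), np.sort(np.linalg.eigvalsh(A)), sep="\n")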

Understand the CG algorithm and its properties
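
A compact sketch of the conjugate gradient method in its textbook form (my transcription; the SPD test matrix is artificial):

 import numpy as np
 def conjugate_gradient(A, b, steps):
     x = np.zeros_like(b, dtype=float)
     r = b.astype(float)                    # residual r = b - A x for x = 0
     p = r.copy()                           # first search direction
     rs = r @ r
     for _ in range(steps):
         Ap = A @ p
         alpha = rs / (p @ Ap)              # step length from the line search along p
         x += alpha * p
         r -= alpha * Ap
         rs_new = r @ r
         p = r + (rs_new / rs) * p          # new search direction, A-conjugate to the previous ones
         rs = rs_new
     return x
 # example usage
 B = np.random.rand(20, 20)
 A = B @ B.T + 20 * np.eye(20)
 b = np.random.rand(20)
 print(np.linalg.norm(A @ conjugate_gradient(A, b, 20) - b))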

Know the v-cycle multigrid algorithm and the full multigrid algorithm
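
A small two-grid sketch for the 1D Poisson problem -u'' = f (a single level of a V-cycle; the grid sizes, smoother parameters, and transfer operators are illustrative choices of mine, not the course's prescription):

 import numpy as np
 def poisson_matrix(n, h):
     return (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
             - np.diag(np.ones(n - 1), -1)) / h**2
 def weighted_jacobi(A, u, f, sweeps=3, omega=2/3):
     d = np.diag(A)
     for _ in range(sweeps):
         u = u + omega * (f - A @ u) / d    # relaxation damps high-frequency error
     return u
 def two_grid(A_h, A_2h, f, u, R, P):
     u = weighted_jacobi(A_h, u, f)                  # pre-smoothing
     e = np.linalg.solve(A_2h, R @ (f - A_h @ u))    # coarse-grid correction (exact coarse solve)
     return weighted_jacobi(A_h, u + P @ e, f)       # prolong, correct, post-smooth
 # example usage: fine grid of 31 interior points, coarse grid of 15
 n, h = 31, 1.0 / 32
 nc = (n - 1) // 2
 A_h, A_2h = poisson_matrix(n, h), poisson_matrix(nc, 2 * h)
 R = np.zeros((nc, n))
 for i in range(nc):
     R[i, 2*i:2*i+3] = [0.25, 0.5, 0.25]    # full-weighting restriction
 P = 2.0 * R.T                              # linear-interpolation prolongation
 f, u = np.ones(n), np.zeros(n)
 for _ in range(10):
     u = two_grid(A_h, A_2h, f, u, R, P)
 print(np.linalg.norm(f - A_h @ u))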

Know how to construct (block) diagonal, incomplete LU, and incomplete Cholesky preconditioners
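
A hedged sketch of preconditioned CG with the simplest of these choices, a diagonal (Jacobi) preconditioner; an incomplete LU or incomplete Cholesky factorization would replace the apply_Minv function below (the names here are my own):

 import numpy as np
 def preconditioned_cg(A, b, apply_Minv, steps):
     x = np.zeros_like(b, dtype=float)
     r = b.astype(float)
     z = apply_Minv(r)                      # preconditioned residual z = M^{-1} r
     p = z.copy()
     rz = r @ z
     for _ in range(steps):
         Ap = A @ p
         alpha = rz / (p @ Ap)
         x += alpha * p
         r -= alpha * Ap
         z = apply_Minv(r)
         rz_new = r @ z
         p = z + (rz_new / rz) * p
         rz = rz_new
     return x
 # example usage with M = diag(A)
 B = np.random.rand(30, 30)
 A = B @ B.T + np.diag(np.arange(1.0, 31.0))
 b = np.random.rand(30)
 d = np.diag(A)
 print(np.linalg.norm(A @ preconditioned_cg(A, b, lambda r: r / d, 30) - b))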

Know CGN and BCG and other Krylov space methods

Textbooks

Possible textbooks for this course include (but are not limited to):

Lloyd N. Trefethen and David Bau III, Numerical Linear Algebra, SIAM; 1997; ISBN: 0898713617, 978-0898713619

James W. Demmel, Applied Numerical Linear Algebra, SIAM; 1997; ISBN: 0898713897, 978-0898713893

Gene H. Golub and Charles F. Van Loan, Matrix Computations, 3rd Ed., Johns Hopkins University Press, 1996; ISBN: 0801854148, 978-0801854149

William L. Briggs, Van Emden Henson and Steve F. McCormick, A Multigrid Tutorial, 2nd Ed., SIAM, 2000; ISBN: 0898714621, 978-0898714623

Barry Smith, Petter Bjorstad, William Gropp, Domain Decomposition: Parallel Multilevel Methods for Elliptic Partial Differential Equations, Cambridge University Press, 2004; ISBN: 0521602866, 978-0521602860

Additional topics

Multigrid method; Domain decomposition method; Freely available linear algebra software; Fast multipole method for linear systems; Parallel processing

Courses for which this course is prerequisite
