Numerical Linear Algebra
ENM/TMA265 and GU/MMA600
7.5 credit points
September - October 2011
See you at the course start on Wednesday, August 31, at 13.15 in room MVF21 in the Mathematical Sciences - Physics building.
Some words about the course
Numerical linear algebra problems arise in many different fields of science, such as computational fluid dynamics, solid
mechanics, electrical networks, signal analysis, and optimisation.
In this course we
study basic linear algebra concepts such as matrix algebra, the theory of linear systems of equations, spectral theory, and vector and matrix norms, as well as numerical aspects such as efficiency, reliability, error
analysis, and condition numbers.
We consider three linear algebra building blocks in computation:

For solving linear systems of equations we present
Gaussian elimination with different pivoting strategies, and blocking algorithms for higher performance using BLAS (Basic Linear Algebra Subprograms).

For least-squares problems we
study QR factorisation and the singular value decomposition.

The methods for eigenvalue problems are based on transformation techniques for symmetric and nonsymmetric matrices.
For all three building blocks above we discuss numerical algorithms with respect to applicability, reliability, accuracy, and efficiency.
Through computer exercises the students gain experience in implementing and evaluating numerical algorithms for linear algebra problems.
For some learning outcomes of this course, see outcomes.
Course Literature
Lectures
Mondays 13-15 (except the first week, when this lecture is on Wednesday instead) in room MVF33 (Mathematical Sciences - Physics building).
Thursdays 9-12 in MVF33 (Mathematical Sciences - Physics building).
Computer Exercises, without supervision
Wednesdays 13-15 (except the first week) in computer room MVF24 (Mathematical Sciences - Physics building).
Fridays 13-15 in computer room MVF24 (except the first week, when it is in room MVF25).
Outline
The following sections of the textbook will be covered. Chapters 6 and 7 are saved for the course given in the third quarter, Large Sparse Matrix Problems.
Chapter 1: mainly repetition, I hope. I will also give a brief review of basic linear algebra.
Chapter 2: 1-6, 7 except 2.7.4 and 2.7.5
Chapter 3: 1-5
Chapter 4: 1-4
Chapter 5: 1-4
The following questions in the textbook are recommended:
Chapter 1: 1, 2, 3, 4, 5, 7, 13 (6 as homework assignment and 20 as computer exercise)
Chapter 2: 3, 6, 7, 10, 11, 12, 17, 19 (18 and 20 as homework assignments)
Chapter 3: 1, 2, 3 (parts 1, 2, 3), 4, 5, 6, 8, 9, 11, 14, 15, 18 (7 as homework assignment)
Chapter 4: 1, 3, 4, 5, 6, 7, 10, 11, 12, 13 (2 and 8 as homework assignments and 15 as computer exercise)
Chapter 5: 1, 3 (for a symmetric matrix), 6, 7, 14, 15, 16, 17, 18, 22, 27, 28 (2 and 5 as homework assignments and 13 as computer exercise)
Each chapter has an optional homework assignment and a compulsory computer exercise.
The homework assignments are similar to the questions in the textbook and to those you can expect at the examination.
In the computer exercises you will be trained in using different algorithms in numerical linear algebra. You will either use existing MATLAB programs or write your own small MATLAB codes.
The homework assignments give bonus points for the examination: one point per well-performed assignment. The examination is worth 25 points and I usually require 13 points to pass.
Since the sessions reserved for the computer exercises are unsupervised, you may ask me questions about these exercises at the lectures. To some extent I can also help you with the homework assignments.
Passed computer exercises are graded 3, 4, or 5; see Examination below.
Detailed program online
August 31
Introduction:
 Read Chapter 1 in the textbook yourselves. It is an introduction to numerical analysis and computer arithmetic. The first computer exercise is based on this material.
 My introduction to linear algebra and numerical linear algebra. If this looks unfamiliar, you should consult your former linear algebra literature and refresh your knowledge. The first homework assignment is based on this material.
I talked about the three building blocks in NLA: (1) linear systems, (2) overdetermined systems by least squares, and (3) eigenproblems.
The building blocks also serve as subproblems in nonlinear computations, through the idea of linearization. Examples: Newton's method for solving nonlinear systems of equations, and projection onto linearized constraints in nonlinear optimization.
We introduced some basic concepts and notation: transpose, inverse, scalar (inner) product, outer product, and LU factorization.
I warned against computing the explicit inverse when solving a linear system of equations.
We discussed calculating the determinant via LU factorization, solving a linear system by Gaussian elimination (backslash in MATLAB), and computing the characteristic equation by first computing the eigenvalues (see the sketch below).
We introduced the concepts symmetric and orthogonal matrix, positive (negative) definite and positive (negative) semidefinite matrix, indefinite matrix.
Bandwidth of a matrix, in particular tridiagonal matrices and upper Hessenberg matrices.
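A minimal MATLAB sketch of the points above, on a made-up 3-by-3 system: backslash versus the explicit inverse, and the determinant computed from the LU factorization.

  A = [4 1 0; 1 4 1; 0 1 4];  b = [1; 2; 3];   % made-up example data
  x1 = A \ b;                     % Gaussian elimination with partial pivoting (preferred)
  x2 = inv(A) * b;                % explicit inverse: more work and usually less accurate
  norm(x1 - x2)                   % the two agree here, but avoid inv in general
  [L, U, P] = lu(A);              % P*A = L*U
  detA = det(P) * prod(diag(U))   % determinant from the LU factorization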
September 1
We discussed the first HA and CE, and I presented examples of the requested hand-ins.
I continued the introduction:
submatrices, principal submatrices and leading principal submatrices.
eigenvalues and eigenvectors, characteristic equation, spectral radius.
diagonalization, spectral theorem.
orthogonality, orthogonal complement, orthogonal projection, orthogonal components.
Basic theory for linear systems: nullspace, rangespace, rank, consistency, uniqueness.
Block matrices.
Matrix multiplication by inner and outer products.
Introduction to HA1b: 2-by-2 block triangular matrices and their inverses, low-rank corrections, the Sherman-Morrison formula (see the sketch at the end of this day's notes).
I gave you five exercises on my introduction so far. If you want the solutions to these, consult my lecture notes, available on request.
We discussed why we need norms: (1) measuring errors, (2) measuring stability, (3) convergence of iterative methods, and (4) convergence properties of fixed-point iteration.
Vector norms
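A minimal MATLAB sketch of the Sherman-Morrison formula mentioned above: solving (A + u*v')x = b using only two solves with A. All data below is made up.

  A = [4 1 0; 1 3 1; 0 1 2];  u = [1; 0; 2];  v = [0; 1; 1];  b = [1; 1; 1];
  y = A \ b;  z = A \ u;                 % two solves with A only
  x = y - z * (v' * y) / (1 + v' * z);   % Sherman-Morrison update
  norm((A + u*v') * x - b)               % residual should be near machine precision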
September 5
Matrix norms.
I demonstrated the solutions of questions Q1.7 and Q1.13.
Chapter 2: Systems of linear equations.
Gaussian elimination, LU factorization.
Gaussian elimination (and LU factorization) by outer products (elementary transformations); see the sketch below.
Please find the overhead slides I showed: slide 1, slide 2, slide 3, slide 4.
Pivoting, partial and total.
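A minimal MATLAB sketch of LU factorization by outer-product updates with partial pivoting, in the spirit of the lecture. The function and variable names are my own illustration, not the textbook's code.

  function [L, U, p] = lu_outer(A)       % L*U = A(p,:) for square nonsingular A
  n = size(A, 1);  p = (1:n)';  L = eye(n);  U = A;
  for k = 1:n-1
      [~, m] = max(abs(U(k:n, k)));  m = m + k - 1;              % partial pivoting
      U([k m], :) = U([m k], :);  p([k m]) = p([m k]);
      L([k m], 1:k-1) = L([m k], 1:k-1);
      L(k+1:n, k) = U(k+1:n, k) / U(k, k);                       % multipliers
      U(k+1:n, k:n) = U(k+1:n, k:n) - L(k+1:n, k) * U(k, k:n);   % rank-one (outer product) update
  end
  end

Saved as lu_outer.m and run on a random test matrix, norm(L*U - A(p,:)) should be of the order of machine precision.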
September 8
We discussed HA2 and CE2.
More on pivoting.
Growth factor, small numerical example.
Special matrices:
 Diagonally dominant,
 Symmetric positive definite
 Symmetric indefinite by 2x2 pivots
 Band matrices
 General sparse matrices, treated in the course "Large Sparse Matrix Problems", given next spring in quarter 3.
Perturbation theory, condition number.
The smallest distance to a singular matrix, a small example.
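A small made-up MATLAB example illustrating the condition number and the 2-norm distance to the nearest singular matrix:

  A = [1 1; 1 1+1e-10];  b = [2; 2+1e-10];   % hypothetical, nearly singular example
  cond(A)                     % very large condition number
  min(svd(A))                 % 2-norm distance from A to the nearest singular matrix
  x = A \ b;                  % exact solution is [1; 1]
  db = [1e-8; 0];             % tiny perturbation of the right-hand side
  dx = A \ (b + db) - x;
  norm(dx) / norm(x)          % large relative change, bounded by roughly cond(A)*norm(db)/norm(b)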
September 12
Componentwise error estimates, the Bauer-Skeel condition number, a small numerical example.
Error bound using the residual.
Backward analysis.
"A posteriori" error estimates.
Estimating condition number.
"A priori" error estimates.
Reading advice regarding perturbation theory and error analysis:
You should understand the ideas, not remember details.
Iterative refinement.
I demonstrated the solution of Q2.10.
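A minimal MATLAB sketch of the iterative refinement mentioned above, reusing one LU factorization; the test matrix is a made-up ill-conditioned example.

  A = hilb(8);  b = ones(8, 1);      % hypothetical ill-conditioned test problem
  [L, U, P] = lu(A);
  x = U \ (L \ (P*b));               % initial solution from the factorization
  for k = 1:3
      r = b - A*x;                   % residual (ideally computed in higher precision)
      d = U \ (L \ (P*r));           % correction, reusing the same factorization
      x = x + d;
  end
  norm(b - A*x)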
September 15
Section 2.6. Blocking algorithms:
The importance of considering both the algorithm AND the computer for high-performance computing.
LAPACK, BLAS routines.
The memory hierarchy.
Examples from the implementation of matrix addition and matrix multiplication.
Gaussian elimination with blocking, essentially with BLAS3 routines, and delayed updating (see the sketch below).
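A minimal MATLAB sketch of block matrix multiplication, the kind of BLAS3-style blocking discussed above. The matrix size and block size are made up, and the block size is assumed to divide the matrix size.

  n = 8;  nb = 4;                          % hypothetical sizes; nb divides n
  A = rand(n);  B = rand(n);  C = zeros(n);
  for i = 1:nb:n
      for j = 1:nb:n
          for k = 1:nb:n                   % each update is a small matrix-matrix (BLAS3) operation
              C(i:i+nb-1, j:j+nb-1) = C(i:i+nb-1, j:j+nb-1) + ...
                  A(i:i+nb-1, k:k+nb-1) * B(k:k+nb-1, j:j+nb-1);
          end
      end
  end
  norm(C - A*B)                            % should be near machine precision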
Chapter 3: Linear least squares problems.
Introduction and applications.
Linear models and nonlinear models. Statistical models.
A comment on norms other than the 2-norm.
The normal equations, geometric interpretation.
The generalized inverse.
September 19
The normal equations for least squares problems (cont.)
Normal equations of the second kind.
Numerical aspects of computability and stability.
Perturbation theory. Discussion of condition numbers.
QR factorization for linear least squares problems.
Underdetermined problems by QR factorization.
A comparison between the normal equations and QR factorization in a concrete application: polynomial fitting (see the sketch below).
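A minimal MATLAB sketch of the polynomial-fitting comparison: the normal equations square the condition number of the design matrix, while QR works with the matrix itself. The data below is made up.

  t = linspace(0, 1, 20)';  y = exp(t);        % hypothetical data to be fitted
  A = [t.^0, t, t.^2, t.^3];                   % cubic polynomial model
  c_ne = (A'*A) \ (A'*y);                      % normal equations: cond(A'*A) = cond(A)^2
  [Q, R] = qr(A, 0);                           % economy-size QR factorization
  c_qr = R \ (Q'*y);                           % QR-based least squares solution
  norm(c_ne - c_qr)                            % small here, but QR is the more stable choice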
September 23
QR factorization and its relation to the fundamental subspaces.
I demonstrated the solution of Q3.3, parts 1 and 3.
Singular value decomposition (SVD).
SVD and the fundamental subspaces.
SVD for linear least squares problems.
Numerical rank, truncated SVD, condition number, best approximation of a matrix, truncated pseudoinverse, truncated least squares.
Applications: image compression, signal processing, and model reduction (see the truncated-SVD sketch at the end of this day's notes).
Least squares with linear equality constraints.
Total least squares
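A minimal MATLAB sketch of the truncated SVD as the best low-rank approximation; the matrix and the rank k are made up.

  A = rand(50, 30);  k = 5;                      % hypothetical data matrix and target rank
  [U, S, V] = svd(A, 'econ');
  Ak = U(:,1:k) * S(1:k,1:k) * V(:,1:k)';        % best rank-k approximation of A in the 2-norm
  norm(A - Ak) - S(k+1, k+1)                     % the 2-norm error equals the (k+1)st singular value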
September 26
I gave important comments on backslash vs inversion and on measuring errors by the residual.
QR factorization by Householder transformations (see the sketch at the end of this day's notes).
I demonstrated the solution of Question 3.9.
Chapter 4: Introduction to spectral theory, eigenvalues, right and left eigenvectors.
Similar matrices.
Spectral decomposition
Right invariant subspaces, deflation.
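A minimal MATLAB sketch of QR factorization by Householder reflections, my own illustration of the method mentioned above (it assumes no column block is entirely zero):

  A = rand(6, 4);  [m, n] = size(A);             % made-up test matrix
  R = A;  Q = eye(m);
  for k = 1:n
      x = R(k:m, k);  s = norm(x);
      if x(1) < 0, s = -s; end                   % choose the sign to avoid cancellation
      v = x;  v(1) = v(1) + s;  v = v / norm(v);                 % Householder vector
      R(k:m, k:n) = R(k:m, k:n) - 2 * v * (v' * R(k:m, k:n));    % reflect the trailing block
      Q(:, k:m) = Q(:, k:m) - 2 * (Q(:, k:m) * v) * v';          % accumulate Q
  end
  norm(Q*R - A), norm(Q'*Q - eye(m))             % both should be near machine precision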
September 29
Multiplicity, algebraic and geometric, defective eigenvalues.
Canonical forms: Jordan form and Schur form
Bounds on eigenvalues, Gerschgorin's theorem.
I demonstrated the solution of Q1.19 a.
Perturbation theory, the Bauer-Fike theorem, Theorems 4.4, 4.5, and 4.6.
Algorithms for nonsymmetric eigenproblems:
The power method.
Deflation for symmetric problems.
Inverse iteration, inverse iteration with shift (see the sketch below).
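A minimal MATLAB sketch of the power method and of inverse iteration with a fixed shift, on a small made-up symmetric matrix (the shift value is a guess chosen for illustration):

  A = [2 1 0; 1 3 1; 0 1 4];  x = ones(3, 1);    % hypothetical test matrix and start vector
  for k = 1:50                                   % power method: dominant eigenpair
      x = A*x;  x = x / norm(x);
  end
  lambda = x' * A * x                            % Rayleigh quotient estimate of the largest eigenvalue
  sigma = 1.9;  y = ones(3, 1);                  % shift near a wanted eigenvalue (a guess)
  [L, U, P] = lu(A - sigma*eye(3));              % factor once, reuse in every iteration
  for k = 1:20                                   % inverse iteration with fixed shift
      y = U \ (L \ (P*y));  y = y / norm(y);
  end
  mu = y' * A * y                                % estimate of the eigenvalue closest to sigma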
October 3
Rayleigh quotient iteration
Orthogonal (simultaneous) iteration.
QR iteration, QR iteration with shift.
Transformation to Hessenberg form by Householder reflections or Givens rotations.
QR iteration on Hessenberg matrices, preservation of Hessenberg form.
Choice of shift in the Hessenberg QR iteration, a comment on the double shift, stopping criterion (see the sketch below).
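A minimal MATLAB sketch of the shifted QR iteration with deflation. For simplicity the test matrix is symmetric, so all eigenvalues are real; a practical code uses the implicit (double) shift treated in the next lecture. The data and the tolerance are made up.

  B = rand(5);  B = B + B';                     % made-up symmetric test matrix
  T = hess(B);                                  % reduce to Hessenberg (here tridiagonal) form
  ev = [];  m = size(T, 1);
  while m > 1
      sigma = T(m, m);                          % simple (Rayleigh) shift
      [Q, R] = qr(T(1:m, 1:m) - sigma*eye(m));
      T(1:m, 1:m) = R*Q + sigma*eye(m);         % similarity step; Hessenberg form is preserved
      if abs(T(m, m-1)) < 1e-12                 % deflate when the last subdiagonal is negligible
          ev = [ev; T(m, m)];  m = m - 1;
      end
  end
  ev = [ev; T(1, 1)];
  sort(ev) - sort(eig(B))                       % should be near machine precision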
October 6
Hessenberg implicit shift QR method (the ultimate method).
Bulge chasing with Givens rotations.
Computation of eigenvectors in the Hessenberg QR method.
I demonstrated the solution of Question 4.12.
Chapter 5: Symmetric eigenproblems.
Perturbation theory: Weyl's theorem
A corollary giving the corresponding result for singular values.
Application of Weyl's theorem: error bounds for eigenvalues computed by a stable method.
The Courant-Fischer (CF) theorem.
Application of the CF theorem: the Cauchy interlace theorem.
Application of the Bauer-Fike theorem to a symmetric matrix:
A posteriori error bounds using the residual.
A result showing that the Rayleigh quotient is a good approximation to an eigenvalue (see the sketch below).
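A small MATLAB illustration of the last two points: for a symmetric matrix, every unit-norm approximate eigenvector has a true eigenvalue within the residual norm of its Rayleigh quotient. The matrix and vector below are made up.

  A = rand(6);  A = (A + A')/2;             % made-up symmetric matrix
  x = rand(6, 1);  x = x / norm(x);         % some unit-norm approximate eigenvector
  mu = x' * A * x;                          % Rayleigh quotient
  r = norm(A*x - mu*x);                     % residual norm
  min(abs(eig(A) - mu)) <= r                % always true for symmetric A (returns logical 1)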
Algorithms for symmetric eigenproblems:
1. Tridiagonal Householder implicit QR iteration,
single shift and Wilkinson's shift
October 10
Algorithms for symmetric matrices (cont.)
2. Rayleigh quotient iteration
3. Divide and conquer method,
secular equation
4. Spectrum slicing - bisection,
congruence transformations, inertia (see the sketch after this list).
5. Jacobi's method
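A minimal MATLAB sketch of spectrum slicing: counting the eigenvalues below a shift via the inertia of A - sigma*I and Sylvester's law of inertia. All data is made up, and the shift is assumed not to be an eigenvalue.

  A = rand(6);  A = (A + A')/2;  sigma = 0.5;     % hypothetical symmetric matrix and shift
  [L, D, P] = ldl(A - sigma*eye(6));              % symmetric indefinite factorization with 2-by-2 pivots
  sum(eig(D) < 0)                                 % number of eigenvalues of A below sigma (by inertia)
  sum(eig(A) < sigma)                             % check directly against the eigenvalues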
October 13
Classical and cyclic versions of Jacobi's method.
Convergence proof for Jacobi's method.
Example of Jacobi's method (see the sketch at the end of this day's notes).
Methods for the singular value decomposition.
Reduction to bidiagonal form.
Relating the bidiagonal SVD to tridiagonal eigenproblems in three ways.
LR iteration for a bidiagonal matrix.
One-sided Jacobi's method.
I solved questions 5.22 and 5.18.
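A minimal MATLAB sketch of the cyclic Jacobi method on a small made-up symmetric matrix. It uses a fixed number of sweeps, no convergence test, and computes the rotation angle with atan2 rather than the textbook's tangent formula.

  n = 4;  A = rand(n);  A = (A + A')/2;  A0 = A;  V = eye(n);   % hypothetical test matrix
  for sweep = 1:10                                    % a few cyclic sweeps
      for p = 1:n-1
          for q = p+1:n
              phi = 0.5 * atan2(2*A(p,q), A(q,q) - A(p,p));
              c = cos(phi);  s = sin(phi);
              J = eye(n);  J([p q], [p q]) = [c s; -s c];       % Jacobi rotation in the (p,q) plane
              A = J' * A * J;  V = V * J;                       % annihilates A(p,q)
          end
      end
  end
  sort(diag(A)) - sort(eig(A0))                       % should be near machine precision
  norm(V' * A0 * V - A)                               % V accumulates the eigenvector basis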
Summary of the course
Note! Topics in parentheses will not be examined.
Chapter 1: Basic concepts in linear algebra
Matrix algebra, scalar product, inverse, symmetric, orthogonal, positive definite, bandwidth, submatrices, orthogonal projection, orthogonal complement, eigenvalues, left and right eigenvectors, characteristic equation, spectral radius, Rayleigh quotient.
Diagonalization, spectral theorem, nullspace, rangespace, rank.
Linear systems, consistency, uniqueness, homogeneous systems.
Block matrices, block factorization.
Schur complement, low-rank corrections.
Matrix and vector norms.
Complex matrices, Hermitian, unitary.
Multiplicity of eigenvalues, algebraic and geometric, defective eigenvalues.
Canonical forms: (Jordan) and Schur.
Similarity transformations.
Invariant subspaces, deflation.
Bounds for eigenvalues, Gerschgorin.
Triangular matrices, tridiagonal matrices, Hessenberg form.
Chapter 2: Systems of linear equations
Gaussian elimination with elementary transformations - the outer product formulation.
LU factorization, Jordan elimination.
Partial pivoting, growth factor.
Special matrices:
Diagonally dominant
Symmetric positive definite, Cholesky factorization.
Symmetric indefinite, 2-by-2 pivot blocks.
Perturbation theory, absolute and relative, condition number, backward analysis, stability of an algorithm, estimating condition number, iterative refinement.
Blocking algorithms, block matrix multiplication
BLAS, Gaussian elimination with BLAS3-level algorithms.
Chapter 3: Linear least squares problems
Overdetermined systems, normal equations of first and second kind, underdetermined systems.
QR decomposition.
SVD, also for the rank-deficient case. Truncated SVD, numerical rank, truncated least squares, pseudoinverse.
Least squares with linear equality constraints.
How to perform the QR factorization: Gram-Schmidt, Householder reflections, Givens rotations.
(Perturbation theory)
Chapter 4: Nonsymmetric eigenproblems
Perturbation theory, condition numbers for eigenvalues, the Bauer-Fike theorem.
Power method, inverse iteration, Rayleigh quotient iteration.
Orthogonal iteration, QR iteration with shift. QR on Hessenberg form with explicit and implicit shift.
Preprocessing to Hessenberg form by Householder reflections or Givens rotations.
Chapter 5: Symmetric eigenproblems
Perturbation theory, Weyl's theorem, (Courant-Fischer).
Congruence transformations, inertia, spectrum slicing - bisection.
Tridiagonal QR iteration.
(Rayleigh quotient iteration)
Divide and conquer.
Jacobi's method.
Algorithms for SVD.
Reduction to bidiagonal form B.
The SVD of B related to the eigenvalues of a tridiagonal matrix T in three ways.
LR iteration on a bidiagonal matrix.
One-sided Jacobi's method.
Material for the course
Examination
The date for the written examination will be decided by me and the students.
Means of assistance at the examination: none.
A passed examination is graded 3, 4, or 5, and this grade is weighted together with the grades of the computer exercises to give the final grade for the course.
Examination:
Fri October 21, 9.00-13.00,
Room MVF33.
Please find the examination and the solution.
Mon October 24: The examination is marked. The result will be sent to you.
It takes some days for the student office to settle everything, so if you want your result right away, you had better contact me.
Reexamination
If you are interested in reexamination (omtenta), please contact me as soon as possible so we can decide on a date.
Reexamination: Fri December 9, 9.00-13.00 in MVL21.
Please find the examination and the solution.
Reexamination: Mon January 16, 13.15-17.15 in MVL14.
Please find the examination and the solution.
Former examinations:
October 23, 2008: examination and the solution.
November 17, 2008: examination and the solution.
October 23, 2009: examination and the solution.
December 2, 2009: examination and the solution.
October 22, 2010: examination and the solution.
December 17, 2010: examination and the solution.
Follow-up course:
Large Sparse Matrix Problems
Runs in the third quarter.
The course home page (last time the course was given, spring 2011).