Latest news
Welcome to the course! The schedule for the course can be found in TimeEdit.
Problems of numerical linear algebra arise in many different fields of science, such as computational fluid dynamics, solid mechanics, electrical networks, signal analysis, and optimisation. In this course we study basic linear algebra concepts such as matrix algebra, theory for linear systems of equations, spectral theory, and vector and matrix norms, as well as numerical aspects such as efficiency, reliability, error analysis and condition numbers. We will study the following topics:
 For solving linear systems of equations, we present Gaussian elimination with different pivoting strategies and blocking algorithms.
 For least-squares problems, we study the method of normal equations, QR factorisation, and the singular value decomposition (SVD).
 Methods for the solution of eigenvalue problems based on transformation techniques for symmetric and nonsymmetric matrices.
 The basic iterative methods (Jacobi, Gauss-Seidel, and successive over-relaxation (SOR)) for the solution of linear systems.
 Introduction to Krylov subspace methods.
By the completion of this course the students will be able to:
 use numerical linear algebra as building blocks in computation;
 formulate linear algebra models of problems from physical reality;
 derive and use the numerical techniques needed for a professional solution of a given linear algebra problem;
 use computer algorithms, programs and software packages to compute solutions to current problems;
 critically analyze and give advice regarding different choices of models, algorithms, and software with respect to efficiency and reliability;
 critically analyze the accuracy of the obtained numerical results and present them in a visualized way.
 Here are some available projects for Master's theses:
 Master Project "Efficient implementation of Helmholtz equation with applications in medical imaging"
 Master Project "Optimal control of drugs in the mathematical model of dynamics of a tumor-immune system"
 Main paper for the project "Optimal control of drugs in the mathematical model of dynamics of a tumor-immune system"
 Master Project "Determination of parameters in kinetic modelling in positron emission tomography (PET)"
 Main paper for the project "Determination of parameters in kinetic modelling in positron emission tomography (PET)"
 Material for the project "Determination of parameters in kinetic modelling in positron emission tomography (PET)"
 Demmel, Applied Numerical Linear Algebra, SIAM 1997. List of Errata for the Textbook
 L. Beilina, E. Karchevskii, M. Karchevskii, Numerical Linear Algebra: Theory and Applications, Springer, 2017.
 Readme for Matlab programs
 Matlab programs
 PETSc libraries: a suite of data structures and routines for the scalable (parallel) solution of scientific applications;
user manual
Program
Day          Time         Place         Remarks
MON          13:15-15:00  Euler         Lecture
WED          13:15-15:00  MVF24, MVF25  Computer exercises
THU          10:00-11:45  Pascal        Lecture
FRI          13:15-15:00  MVF24, MVF25  Computer exercises
30.10.2018   14.00-18.00  SB            Examination
?.01.2019    14.00-18.00  SB            Examination

 Lecture 1 Introduction and organization of the course. Introduction to linear algebra and numerical linear algebra. If this looks unfamiliar, you should consult your former linear algebra literature and refresh your knowledge. We will concentrate on the three building blocks of Numerical Linear Algebra (NLA): (1) linear systems, (2) overdetermined systems by least squares, and (3) eigenproblems. These building blocks also serve as subproblems in nonlinear computations through the idea of linearization. We consider examples of applications of linear systems: image compression using the SVD, and image deblurring. We introduce some basic concepts and notation: transpose, lower and upper triangular matrices, singular matrices, symmetric and positive definite matrices, the conjugate transpose, row echelon form, rank, and cofactors.
 Lecture 2 The IEEE system and floating-point numbers. We will discuss computer exercise 1 and perturbation theory in polynomial evaluation.
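To make the floating-point discussion concrete, here is a small Python sketch (illustrative only; the course exercises use Matlab or PETSc) showing the IEEE double-precision machine epsilon and the cancellation that perturbation theory predicts when evaluating the expanded form of (x - 1)^9 near x = 1:

```python
import math

# Machine epsilon for IEEE double precision: the gap between 1.0
# and the next representable number is 2^-52.
eps = 2.0**-52
assert 1.0 + eps > 1.0
assert 1.0 + eps / 2 == 1.0  # rounds back down to 1.0

# p(x) = (x - 1)^9 via its expanded coefficients (highest degree first).
coeffs = [math.comb(9, k) * (-1)**(9 - k) for k in range(9, -1, -1)]

def horner(c, x):
    """Evaluate a polynomial with coefficients c (highest degree first)."""
    result = 0.0
    for a in c:
        result = result * x + a
    return result

x = 1.0001
expanded = horner(coeffs, x)   # dominated by rounding errors of size
                               # roughly eps times the largest coefficient
factored = (x - 1.0)**9        # about 1e-36, essentially exact
```

The factored form gives a tiny, accurate value, while the expanded evaluation is swamped by cancellation; this is exactly the sensitivity that perturbation theory for polynomial evaluation quantifies.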
 Lecture 3 We will discuss why we need norms: (1) to measure errors, (2) to measure stability, (3) for convergence criteria of iterative methods. The Sherman-Morrison formula. Systems of linear equations. Gaussian elimination and LU factorization. Gaussian elimination (and LU factorization) by outer products (elementary transformations). Pivoting, partial and total. Repetition: eigenvalues and eigenvectors, the characteristic equation, spectral radius, diagonalization, orthogonality, invertible matrices.
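The idea of LU factorization by outer-product (rank-one) updates can be sketched in a few lines; the following is an illustrative Python version without pivoting (pivoting is the subject of Lecture 4), not course code:

```python
def lu_factor(A):
    """LU factorization A = L U by Gaussian elimination without pivoting.
    A is a square list of lists; returns (L, U). Assumes nonzero pivots."""
    n = len(A)
    U = [row[:] for row in A]
    L = [[float(i == j) for j in range(n)] for i in range(n)]
    for k in range(n - 1):
        for i in range(k + 1, n):
            m = U[i][k] / U[k][k]       # multiplier l_ik
            L[i][k] = m
            for j in range(k, n):       # rank-one (outer product) update
                U[i][j] -= m * U[k][j]
    return L, U

A = [[4.0, 3.0], [6.0, 3.0]]
L, U = lu_factor(A)
# L = [[1, 0], [1.5, 1]], U = [[4, 3], [0, -1.5]], and L @ U reproduces A
```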
 Lecture 4 We will discuss the need for pivoting and the uniqueness of the factorization of A. Different versions of the Gaussian elimination algorithm. Error bounds for the solution of Ax=b. Roundoff error analysis in LU factorization. Estimating condition numbers. Hager's algorithm.
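A minimal sketch of Gaussian elimination with partial pivoting (illustrative Python, not course code) shows why pivoting matters: with the tiny pivot 1e-20 in the (1,1) position, elimination without row exchanges would destroy the answer, while with pivoting the correct solution is recovered:

```python
def solve_gepp(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix [A | b]
    for k in range(n):
        # Partial pivoting: bring the largest |entry| in column k to row k.
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            m = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= m * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):  # back substitution
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

# Exact solution is very close to x = (1, 1).
x = solve_gepp([[1e-20, 1.0], [1.0, 1.0]], [1.0, 2.0])
```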
 Notes from Lecture 4
 Lecture 5 Componentwise error estimates. Improving the accuracy of a solution of Ax=b: Newton's method and the equilibration technique. Convergence of Newton's method. Real symmetric positive definite matrices. The Cholesky algorithm. Example: how to solve Poisson's equation on a unit square using LU factorization.
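The Cholesky algorithm from this lecture admits a very short implementation. The following Python sketch (illustrative, not course code) computes the lower triangular factor G in A = G G^T column by column:

```python
import math

def cholesky(A):
    """Cholesky factorization A = G G^T of a symmetric positive definite
    matrix A (list of lists); returns the lower triangular factor G."""
    n = len(A)
    G = [[0.0] * n for _ in range(n)]
    for j in range(n):
        s = A[j][j] - sum(G[j][k]**2 for k in range(j))
        G[j][j] = math.sqrt(s)  # s <= 0 would mean A is not SPD
        for i in range(j + 1, n):
            G[i][j] = (A[i][j] - sum(G[i][k] * G[j][k] for k in range(j))) / G[j][j]
    return G

A = [[4.0, 2.0], [2.0, 5.0]]
G = cholesky(A)
# G = [[2, 0], [1, 2]], since [[2,0],[1,2]] times its transpose is [[4,2],[2,5]]
```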
 Lecture 6 Band matrices. Example (continued over two sessions): application of the Cholesky decomposition to the solution of an ODE. Linear least squares problems: introduction and applications. The normal equations. Example: polynomial fitting to a curve. Solution of nonlinear least squares problems: examples.
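For the method of normal equations, fitting a line y = c0 + c1*x reduces to the 2x2 system A^T A c = A^T y, which can be solved in closed form. An illustrative Python sketch (not course code; the lectures also explain why QR or the SVD is preferred for ill-conditioned problems):

```python
def fit_line(xs, ys):
    """Least-squares fit y = c0 + c1*x via the 2x2 normal equations
    A^T A c = A^T y, where the rows of A are (1, x_i)."""
    n = len(xs)
    sx = sum(xs); sxx = sum(x * x for x in xs)
    sy = sum(ys); sxy = sum(x * y for x, y in zip(xs, ys))
    # Solve [[n, sx], [sx, sxx]] c = [sy, sxy] by Cramer's rule.
    det = n * sxx - sx * sx
    c0 = (sy * sxx - sx * sxy) / det
    c1 = (n * sxy - sx * sy) / det
    return c0, c1

# Data lying exactly on y = 1 + 2x is recovered exactly.
c0, c1 = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```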
 Lecture 7 QR factorization and the singular value decomposition (SVD). QR and SVD for linear least squares problems. Example of an application of linear systems: image compression using the SVD in Matlab. Least squares and classification algorithms. We discussed computer exercises 2 and 3.
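As a sketch of how a thin QR factorization can be computed, here is an illustrative Python version using modified Gram-Schmidt, one standard approach (the lectures also cover Householder-based QR, which is the numerically preferred method):

```python
import math

def mgs_qr(A):
    """Thin QR factorization A = Q R by modified Gram-Schmidt.
    A is m x n (list of rows) with linearly independent columns.
    Returns Q (m x n, orthonormal columns) and R (n x n, upper triangular)."""
    m, n = len(A), len(A[0])
    V = [[A[i][j] for i in range(m)] for j in range(n)]  # columns of A
    R = [[0.0] * n for _ in range(n)]
    Q = []
    for j in range(n):
        R[j][j] = math.sqrt(sum(v * v for v in V[j]))
        q = [v / R[j][j] for v in V[j]]
        Q.append(q)
        for k in range(j + 1, n):
            # Orthogonalize the remaining columns against q immediately.
            R[j][k] = sum(q[i] * V[k][i] for i in range(m))
            V[k] = [V[k][i] - R[j][k] * q[i] for i in range(m)]
    return [[Q[j][i] for j in range(n)] for i in range(m)], R

A = [[1.0, 1.0], [0.0, 1.0], [0.0, 1.0]]
Q, R = mgs_qr(A)
```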
 Lecture 8 Householder transformations and Givens rotations. QR factorization by Householder transformations and Givens rotations. Examples of performing a Householder transformation for the QR decomposition and tridiagonalization of a matrix. The Moore-Penrose pseudoinverse. Rank-deficient least squares problems. We discussed computer exercise 4.
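A Givens rotation is determined by just two numbers, c = a/r and s = b/r with r = sqrt(a^2 + b^2). The following illustrative Python sketch (not course code) zeroes the second component of a vector, which is the building block of QR by Givens rotations:

```python
import math

def givens(a, b):
    """Return (c, s) such that the rotation [[c, s], [-s, c]] maps
    (a, b)^T to (r, 0)^T with r = sqrt(a^2 + b^2)."""
    if b == 0.0:
        return 1.0, 0.0
    r = math.hypot(a, b)  # avoids overflow in sqrt(a*a + b*b)
    return a / r, b / r

a, b = 3.0, 4.0
c, s = givens(a, b)
r = c * a + s * b        # first component: r = 5
zero = -s * a + c * b    # second component is annihilated
```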
 Lecture 9 Introduction to spectral theory, eigenvalues, right and left eigenvectors. Similar matrices. Defective eigenvalues. Canonical forms: Jordan form and Schur form, real Schur form.
 Lecture 10 Canonical forms: Jordan form, Schur form, and real Schur form. Gerschgorin's theorem. Perturbation theory: the Bauer-Fike theorem. Discussed algorithms for nonsymmetric eigenproblems: the power method, inverse iteration, inverse iteration with shift, and orthogonal iteration.
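The power method from this lecture can be sketched in a few lines of Python (illustrative only, not course code):

```python
def power_method(A, x0, iters=100):
    """Power method: x_{k+1} = A x_k / ||A x_k||. For a diagonalizable
    matrix with a unique eigenvalue of largest modulus, the Rayleigh
    quotient of the iterates converges to that dominant eigenvalue."""
    x = x0[:]
    for _ in range(iters):
        y = [sum(a * xi for a, xi in zip(row, x)) for row in A]
        norm = max(abs(v) for v in y)  # normalize in the infinity norm
        x = [v / norm for v in y]
    # Rayleigh quotient x^T A x / x^T x approximates the eigenvalue.
    Ax = [sum(a * xi for a, xi in zip(row, x)) for row in A]
    lam = sum(xi * yi for xi, yi in zip(x, Ax)) / sum(xi * xi for xi in x)
    return lam, x

# Eigenvalues of [[2, 1], [1, 2]] are 3 and 1; the dominant one is 3,
# with eigenvector proportional to (1, 1).
lam, x = power_method([[2.0, 1.0], [1.0, 2.0]], [1.0, 0.0])
```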
 Lecture 11 Discussed algorithms for nonsymmetric eigenproblems: orthogonal iteration, QR iteration, and QR iteration with shifts. Hessenberg matrices and preservation of Hessenberg form. Hessenberg reduction. Tridiagonal and bidiagonal reduction. Regular matrix pencils and the Weierstrass canonical form.
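One step of unshifted QR iteration, A_{k+1} = R_k Q_k where A_k = Q_k R_k, can be written down explicitly for a 2x2 matrix, since there Q is a single Givens rotation. An illustrative Python sketch (not course code):

```python
import math

def qr_step(A):
    """One unshifted QR step A -> R Q for a 2x2 matrix. The factor Q in
    A = Q R is the Givens rotation zeroing the (2,1) entry."""
    a, b = A[0][0], A[1][0]
    r = math.hypot(a, b)
    c, s = (1.0, 0.0) if r == 0.0 else (a / r, b / r)
    # R = Q^T A, with Q = [[c, -s], [s, c]].
    R = [[c * A[0][0] + s * A[1][0], c * A[0][1] + s * A[1][1]],
         [-s * A[0][0] + c * A[1][0], -s * A[0][1] + c * A[1][1]]]
    # Next iterate R Q: similar to A, and closer to triangular form.
    return [[R[0][0] * c + R[0][1] * s, -R[0][0] * s + R[0][1] * c],
            [R[1][0] * c + R[1][1] * s, -R[1][0] * s + R[1][1] * c]]

# Eigenvalues of [[2, 1], [1, 2]] are 3 and 1; the iterates converge to
# a diagonal matrix with the eigenvalues on the diagonal.
A = [[2.0, 1.0], [1.0, 2.0]]
for _ in range(30):
    A = qr_step(A)
```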
 Lecture 12 Regular matrix pencils and the Weierstrass canonical form. Singular matrix pencils and the Kronecker canonical form. Application of the Jordan and Weierstrass forms to differential equations. Symmetric eigenproblems. Perturbation theory: Weyl's theorem and a corollary giving a similar result for singular values. Application of Weyl's theorem: error bounds for eigenvalues computed by a stable method. The Courant-Fischer (CF) theorem. Inertia and Sylvester's inertia theorem. Definite pencils.
 Lecture 13 Theorem that the Rayleigh quotient is a good approximation to an eigenvalue. Algorithms for the symmetric eigenproblem: tridiagonal QR iteration, Rayleigh quotient iteration, and the divide-and-conquer algorithm. QR iteration with Wilkinson's shift. Divide-and-conquer, bisection and inverse iteration, and different versions of Jacobi's method.
 Lecture 14 Algorithms for symmetric matrices (continued): different versions of Jacobi's method. Algorithms for the SVD: QR iteration and its variations for the bidiagonal SVD. The basic iterative methods (Jacobi, Gauss-Seidel, and successive over-relaxation (SOR)) for the solution of linear systems. Jacobi, Gauss-Seidel, and SOR for the solution of Poisson's equation in two dimensions. Study of the convergence of Jacobi, Gauss-Seidel, and SOR. Introduction to Krylov subspace methods. The conjugate gradient algorithm. Preconditioning for linear systems. The preconditioned conjugate gradient algorithm. Common preconditioners. Example of the construction of a lower triangular matrix using a Givens rotation.
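One sweep of the Jacobi iteration, x_{k+1} = D^{-1}(b - (L + U)x_k) where A = D + L + U, is easy to sketch in Python (illustrative only; the computer exercises implement these methods in Matlab or PETSc):

```python
def jacobi(A, b, x0, iters=50):
    """Jacobi iteration x_{k+1} = D^{-1} (b - (L + U) x_k), which
    converges e.g. when A is strictly diagonally dominant."""
    n = len(A)
    x = x0[:]
    for _ in range(iters):
        # Each component uses only values from the previous iterate,
        # unlike Gauss-Seidel, which uses already-updated components.
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
    return x

# Strictly diagonally dominant system with exact solution x = (1, 1).
x = jacobi([[4.0, 1.0], [2.0, 5.0]], [5.0, 7.0], [0.0, 0.0])
```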
 Computer Exercises
 Paper with real values of parameters for computer exercise 2 (see Table II, page 345)
 Template for the report, pdf file
 Template for the report, tex file
 Poisson2D_LU.m
 DiscretePoisson2D.m
 BackSub.m
 ForwSub.m
 LU_factor.m
Templates for programs written in PETSc:
 Template for computing of discretized Laplacian
 Template for the main program in PETSC for LU decomposition
 Example of a Makefile to compile PETSc programs at Chalmers
 Template for a PETSc function which removes zeros from the original matrix; the resulting matrix can then be used in the same way as the original one.
 Example of the main program in PETSC.
 Example of a Makefile to compile the above main program (run: make hello)
 Homeworks and computer exercises
Link to the homeworks:
 Homeworks
Link to the computer exercises:
 Computer Exercises
 Paper with real values of parameters for computer exercise 2 (see Table II, page 345)
Deadlines for the homeworks and computer exercises:
 Homework 1 and computer exercise 1: 14 September
 Homework 2: 21 September
 Homework 3 and computer exercise 2: 5 October
 Homework 4 and computer exercise 3: 12 October
 Computer exercises 4 and 5: 19 October
 1. Pass 2 computer assignments with written reports; see the description of the computer assignments above.
 Template for the report, pdf file
 Template for the report, tex file
 2. Pass 2 compulsory home assignments.
 Written examination

 The final exam is compulsory and written.
 The theory questions will be chosen from the following list:
 List of questions
 You should be able to state and explain all definitions and theorems given in the course and also apply them in problem solving.
 Grades are set according to the table below.
Grades Chalmers  Points   Grades GU  Points
-                <15      -          <15
3                15-20    G          15-27
4                21-27    VG         >27
5                >27

Solutions to the exam will be published at a link on this homepage. You will be notified of the result of your exam by email from LADOK (this is done automatically as soon as the exams have been marked and the results are registered).
The exams will then be kept at the student office in the Mathematical Sciences building.
Check that the number of points and the grade given on the exam and registered in LADOK coincide.
Complaints about the marking should be submitted in writing at the office. There is a form you can use; ask the person in the office. The following link will tell you all about the examination room rules at Chalmers: Examination room instructions
Examination procedures
In the Chalmers Student Portal you can read about when exams are given and what rules apply to exams at Chalmers. In addition, there is a schedule of when exams are given for courses at the University of Gothenburg.
Before the exam, it is important that you sign up for the examination. If you study at Chalmers, you can do this from the Chalmers Student Portal, and if you study at University of Gothenburg, you sign up via GU's Student Portal.
At the exam, you should be able to show valid identification.
After the exam has been graded, you can see your results in Ladok by logging on to your Student portal.
At the annual (regular) examination:
When it is practical, a separate review is arranged. The date of the review will be announced here on the course homepage. Anyone who cannot participate in the review may afterwards retrieve and review their exam at the Mathematical Sciences student office. Check that you have the right grade and score. Any complaints about the marking must be submitted in writing at the office, where there is a form to fill out.
At re-examination:
Exams are reviewed and retrieved at the Mathematical Sciences student office. Check that you have the right grade and score. Any complaints about the marking must be submitted in writing at the office, where there is a form to fill out.
Old exams
Old exams given before 2012/2013 are presented at the following link: Old exams
Year       Answers
2012/2013  Exam 1, Exam 2
2013/2014  Exam 1, Exam 2
2014/2015  Exam 1, Exam 2
2015/2016  Exam 1
2016/2017  Exam 1, Exam 2
2017/2018  Exam 1, Exam 2
Teachers
Course coordinator: Larisa Beilina
larisa@chalmers.se, room 2089, homepage
Course literature
Computer labs
Wednesdays 13:15-15:00 (except the first week) in computer rooms MVF24, MVF25 (Mathematical Sciences at the Physics building).
Fridays 13:15-15:00 in computer rooms MVF24, MVF25.
Computer labs in Matlab/PETSc can be found below.
Useful Matlab programs for the computer exercises:
Additional templates of a PETSc Makefile and a main program in PETSc:
Reference literature:
Learning MATLAB, Tobin A. Driscoll, ISBN: 9780898716832 (published by SIAM).
Course requirements
The learning goals of the course can be found in the course plan. The following chapters of the textbook L. Beilina, E. Karchevskii, M. Karchevskii, Numerical Linear Algebra: Theory and Applications, Springer, 2017, will be considered during the course: 6-12. The homework assignments are of the kind you can expect at the examination. In the computer exercises you will be trained in using different algorithms of numerical linear algebra in MATLAB or with the PETSc libraries. You can either use existing MATLAB and PETSc programs, available for free download from the homepage of the book, or write your own MATLAB or PETSc codes. Both the homework and the computer assignments give bonus credit points for the final written examination. Since the occasions reserved for the computer exercises are without supervision, you may ask questions about these exercises at the lectures or come directly to my office during the computer exercise sessions.
Assignments
Wednesdays 13:15-15:00 in computer rooms MVF24, MVF25 (Mathematical Sciences at the Physics building).
Fridays 13:15-15:00 in computer rooms MVF24, MVF25.
Examination
To pass this course you must pass the written examination. Before the final written examination you should: