Göteborgs universitet

Matematisk statistik

#### TIME SERIES (registration code: MSN09)

**The spring 2001 course is taught by Catalin Starica.**

## The spring 1999 course

A 5-point undergraduate statistics course based on the textbook "Introduction to Time Series and Forecasting"
by P. Brockwell and R. Davis (Springer, 1996).
A time series is a set of (dependent) observations recorded over some period
of time.
The need to analyse and forecast such data
sets arises in economics, engineering, and the natural and social
sciences. This course is intended to give a working knowledge of basic
methods for linear model selection and forecasting.

**Prerequisite:** basic courses on probability and statistics, calculus and linear algebra.

**Instructor:** Serik Sagitov (room
1420; tel. 772 53 51; http://www.math.chalmers.se/~serik/)

**Grading system:** Home assignments give at most 5 units and the final
written exam at most 25 units. A "godkänd" (pass) grade requires at
least 12 units; the "väl godkänd" (pass with distinction) threshold is
22 units.

**Formula list** in Acrobat (PDF) format.

**Lectures**

**Lecture 1.** Course plan.

**Lecture 2.** Chapter 1. Introduction.

1. Examples of stationary processes.

2. Linear filters.

3. IID noise testing.
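
The IID-noise test in item 3 can be illustrated with a short Python sketch (the simulated series, seed, and helper name `sample_acf` are illustrative, not course material): under the IID hypothesis roughly 95% of the sample autocorrelations should fall within the bounds ±1.96/√n.

```python
import math
import random

def sample_acf(x, max_lag):
    """Sample autocorrelations rho_hat(1..max_lag) of a series x."""
    n = len(x)
    mean = sum(x) / n
    gamma0 = sum((v - mean) ** 2 for v in x) / n
    acf = []
    for h in range(1, max_lag + 1):
        gamma_h = sum((x[t] - mean) * (x[t + h] - mean) for t in range(n - h)) / n
        acf.append(gamma_h / gamma0)
    return acf

random.seed(1)
x = [random.gauss(0.0, 1.0) for _ in range(500)]   # simulated IID noise
bound = 1.96 / math.sqrt(len(x))                   # 95% bounds under the IID hypothesis
acf = sample_acf(x, 20)
inside = sum(abs(r) <= bound for r in acf)         # lags within the bounds
```

If far fewer than 95% of the lags fall inside the bounds, the IID hypothesis is rejected.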

**Home assignment.** Problems: 1.1, 1.4, 1.7, 1.13, 1.14, 1.15.

**Lecture 3.** Chapter 2. Stationary processes.

1. Strict and weak stationarity.

2. The autocovariance and sample autocovariance matrices.

3. Linear processes.

4. Estimation of the process mean and autocorrelation function.
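
The sample autocovariance estimator in item 2 can be sketched as follows (the helper name `sample_acvf` and the toy series are illustrative):

```python
def sample_acvf(x, h):
    """Sample autocovariance gamma_hat(h) = (1/n) * sum (x_t - xbar)(x_{t+h} - xbar),
    with the process mean estimated by the sample mean xbar."""
    n = len(x)
    xbar = sum(x) / n
    return sum((x[t] - xbar) * (x[t + h] - xbar) for t in range(n - abs(h))) / n

x = [1.0, 2.0, 3.0, 4.0]
g0 = sample_acvf(x, 0)   # 1.25
g1 = sample_acvf(x, 1)   # 0.3125
```

Note the divisor n (not n - h), which keeps the estimated autocovariance matrix nonnegative definite.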

**Lecture 4.** Chapter 2. Stationary processes.

1. Linear prediction.

2. The Durbin-Levinson algorithm.

3. The innovations algorithm.

4. Wold's decomposition.
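
As a sketch of item 2, the Durbin-Levinson recursions compute the one-step linear prediction coefficients from the autocovariances (the function name and the AR(1) test case below are illustrative):

```python
def durbin_levinson(gamma):
    """Durbin-Levinson recursions: from autocovariances gamma(0..p), compute the
    one-step prediction coefficients phi_{p,1..p} and the mean squared error v_p.
    The last coefficient at each stage is the partial autocorrelation."""
    phi = [gamma[1] / gamma[0]]
    v = gamma[0] * (1.0 - phi[0] ** 2)
    for n in range(2, len(gamma)):
        # phi_{n,n} = (gamma(n) - sum_j phi_{n-1,j} gamma(n-j)) / v_{n-1}
        a = (gamma[n] - sum(phi[j] * gamma[n - 1 - j] for j in range(n - 1))) / v
        # phi_{n,j} = phi_{n-1,j} - phi_{n,n} * phi_{n-1,n-j}
        phi = [phi[j] - a * phi[n - 2 - j] for j in range(n - 1)] + [a]
        v *= (1.0 - a * a)
    return phi, v

# AR(1) with phi = 0.5, sigma^2 = 1:  gamma(h) = 0.5**h / (1 - 0.25)
gamma = [0.5 ** h / 0.75 for h in range(4)]
phi, v = durbin_levinson(gamma)
```

For an AR(1) process the partial autocorrelations vanish beyond lag 1, so `phi` comes out (up to rounding) as [0.5, 0, 0] with prediction error v = 1.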

**Home assignment.** Problems: 2.1, 2.2, 2.11, 2.14, 2.21.

**Lecture 5.** Chapter 3. ARMA models.

1. ARMA models. Causality and invertibility.

2. The autocorrelation and partial autocorrelation
functions of ARMA.

3. ARMA innovations.
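
The causality criterion in item 1 (the AR polynomial must have no zeros in the closed unit disc) can be checked directly for low orders; the helper `ar2_is_causal` below is an illustrative sketch, not course code:

```python
import cmath

def ar2_is_causal(phi1, phi2):
    """An ARMA process with AR polynomial phi(z) = 1 - phi1*z - phi2*z^2 is causal
    iff phi(z) has no zeros inside or on the unit circle."""
    if phi2 == 0.0:
        return abs(phi1) < 1.0            # AR(1): the single zero is at 1/phi1
    # zeros of phi2*z^2 + phi1*z - 1 = 0 via the quadratic formula
    disc = cmath.sqrt(phi1 ** 2 + 4.0 * phi2)
    z1 = (-phi1 + disc) / (2.0 * phi2)
    z2 = (-phi1 - disc) / (2.0 * phi2)
    return abs(z1) > 1.0 and abs(z2) > 1.0
```

For example, (phi1, phi2) = (0.5, 0.3) gives a causal model, while (1.0, 0.5) does not.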

**Lecture 6.** Chapter 4. Spectral analysis.

1. Spectral representations.

2. Linear filters and transfer functions.

3. Spectral density estimation.
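
A basic ingredient of item 3 is the periodogram; the sketch below (illustrative naming and test signal) computes one ordinate at a Fourier frequency and shows that a pure cosine concentrates all its power at its own frequency:

```python
import cmath
import math

def periodogram(x, j):
    """Periodogram ordinate I(lambda_j) = |sum_t x_t e^{-i t lambda_j}|^2 / n
    at the Fourier frequency lambda_j = 2*pi*j/n."""
    n = len(x)
    lam = 2.0 * math.pi * j / n
    s = sum(x[t] * cmath.exp(-1j * t * lam) for t in range(n))
    return abs(s) ** 2 / n

n = 64
x = [math.cos(2.0 * math.pi * 8 * t / n) for t in range(n)]  # cosine at frequency index 8
peak = periodogram(x, 8)        # equals n/4 = 16 for this signal
off = periodogram(x, 5)         # essentially zero away from the signal frequency
```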

**Home assignment.** Problems: 3.1 (a-c), 3.5, 3.11; 4.9, 4.10.

**Lecture 7.** Chapter 5. Modelling and forecasting with ARMA processes.

1. Yule-Walker estimation.

2. Burg's algorithm.

3. The innovations algorithm.

4. The Hannan-Rissanen algorithm.
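
For item 1, the AR(1) case of Yule-Walker estimation reduces to two moment equations; the simulation and function name below are an illustrative sketch:

```python
import random

def yule_walker_ar1(x):
    """Yule-Walker estimates for AR(1), X_t = phi*X_{t-1} + Z_t:
    phi_hat = gamma_hat(1)/gamma_hat(0),  sigma2_hat = gamma_hat(0)*(1 - phi_hat^2)."""
    n = len(x)
    xbar = sum(x) / n
    g0 = sum((v - xbar) ** 2 for v in x) / n
    g1 = sum((x[t] - xbar) * (x[t + 1] - xbar) for t in range(n - 1)) / n
    phi_hat = g1 / g0
    return phi_hat, g0 * (1.0 - phi_hat ** 2)

# simulate an AR(1) path with phi = 0.6 and unit noise variance
random.seed(7)
phi = 0.6
x = [0.0]
for _ in range(5000):
    x.append(phi * x[-1] + random.gauss(0.0, 1.0))
phi_hat, sigma2_hat = yule_walker_ar1(x)
```

With 5000 observations the estimates land close to the true values phi = 0.6 and sigma^2 = 1.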

**Lecture 8.** Chapter 5. Modelling and forecasting with ARMA processes.

1. Gaussian likelihood and the maximum likelihood estimation.

2. Diagnostic checking and forecasting.

3. Order selection. AICC criterion.
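
The AICC criterion in item 3 is the bias-corrected AIC; a minimal sketch (assuming the Gaussian log-likelihood of each fitted model has already been computed):

```python
def aicc(loglik, p, q, n):
    """Bias-corrected AIC for ARMA(p,q) order selection:
    AICC = -2*ln(likelihood) + 2*(p+q+1)*n / (n - p - q - 2)."""
    return -2.0 * loglik + 2.0 * (p + q + 1) * n / (n - p - q - 2)

# with equal fitted likelihoods the penalty grows with the order p+q,
# so AICC prefers the more parsimonious model
penalty_ar1 = aicc(0.0, 1, 0, 100)
penalty_ar2 = aicc(0.0, 2, 0, 100)
```

The model minimizing AICC over candidate orders (p, q) is selected.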

**Home assignment.** Problems: 5.1, 5.2, 5.3, 5.4.

**Lecture 9.** Chapter 6. Nonstationary and seasonal time series
models.

1. ARIMA models.

2. Identification techniques.

3. Unit roots in time series models.
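
The ARIMA idea in item 1 rests on the differencing operator; a small illustrative sketch showing that differencing once removes a linear trend (and twice, a quadratic one):

```python
def difference(x, d=1):
    """Apply the differencing operator nabla^d, where nabla x_t = x_t - x_{t-1}."""
    for _ in range(d):
        x = [x[t] - x[t - 1] for t in range(1, len(x))]
    return x

# a linear trend becomes constant after one difference
trend = [2.0 * t + 3.0 for t in range(10)]
diffed = difference(trend, 1)   # nine values, all equal to 2.0
```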

**Lecture 10.** Chapter 6. Nonstationary and seasonal time series models.

1. Forecasting ARIMA models.

2. Seasonal ARIMA models.

3. Regression with ARMA errors.

**Home assignment.** Problems: 6.1, 6.6, 6.11, 6.12 (a, b).

**Lecture 11.** Chapter 7. Multivariate time series.

1. Cross-correlation functions.

2. Testing for independence of two stationary time series.

3. Multivariate ARMA models.
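
The independence test in item 2 is based on the sample cross-correlation function; under independence of two white-noise series, the sample cross-correlations should stay within ±1.96/√n (the helper name and simulated series below are illustrative):

```python
import math
import random

def sample_ccf(x, y, h):
    """Sample cross-correlation rho_hat_12(h) of two equal-length series, h >= 0."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    g12 = sum((x[t] - xbar) * (y[t + h] - ybar) for t in range(n - h)) / n
    g1 = sum((v - xbar) ** 2 for v in x) / n
    g2 = sum((v - ybar) ** 2 for v in y) / n
    return g12 / math.sqrt(g1 * g2)

random.seed(3)
n = 1000
x = [random.gauss(0.0, 1.0) for _ in range(n)]
y = [random.gauss(0.0, 1.0) for _ in range(n)]
bound = 1.96 / math.sqrt(n)     # independence is not rejected while |rho_hat| < bound
r = sample_ccf(x, y, 0)
```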

**Lecture 12.** Chapter 7. Multivariate time series.

1. Modelling and forecasting with multivariate AR processes.

2. Discussion of the home assignment problems 6.6, 6.11, 6.12 (a, b).

**Home assignment.** Problems: 7.1, 7.3, 7.5.

**Lecture 13.** Chapter 8. State-space models.

1. The observation and state equations.

2. The basic structural model.

3. State-space representation of ARIMA models.
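
As a sketch of item 3 in the simplest case, one common state-space form of a pure AR(2) process takes the state (X_t, X_{t-1})' with transition matrix F = [[phi1, phi2], [1, 0]]; the helper below (illustrative, not the book's general ARIMA construction) checks that iterating the state equation reproduces the AR(2) recursion:

```python
def ar2_state_step(state, phi1, phi2, z):
    """One step of the state equation state_t = F @ state_{t-1} + (z_t, 0)',
    with F = [[phi1, phi2], [1, 0]]; the observation is X_t = (1, 0) @ state_t."""
    x_prev, x_prev2 = state
    x_new = phi1 * x_prev + phi2 * x_prev2 + z
    return (x_new, x_prev)

# iterate the state equation and record the observed component
phi1, phi2 = 0.5, -0.3
noise = [0.4, -1.1, 0.2, 0.9, -0.5]
state = (0.0, 0.0)
xs = []
for z in noise:
    state = ar2_state_step(state, phi1, phi2, z)
    xs.append(state[0])
```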

**Lecture 14.** Chapter 8. State-space models.

1. The Kalman recursions.

2. Chapter 10.3. ARCH and GARCH models.
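
The Kalman recursions of item 1, specialized to the one-dimensional local level model, alternate a prediction and an update step; the function below is an illustrative sketch with made-up observations:

```python
def kalman_local_level(ys, q, r, m0=0.0, p0=1.0):
    """Kalman recursions for the local level model
    mu_t = mu_{t-1} + w_t (Var q),  y_t = mu_t + v_t (Var r).
    Returns the filtered mean and variance (m_t, p_t) after each observation."""
    m, p = m0, p0
    out = []
    for y in ys:
        p = p + q                 # prediction step: propagate the state variance
        k = p / (p + r)           # Kalman gain
        m = m + k * (y - m)       # update step: correct toward the observation
        p = (1.0 - k) * p
        out.append((m, p))
    return out

ys = [1.0, 1.2, 0.9, 1.1, 1.0, 1.05]
est = kalman_local_level(ys, q=0.01, r=0.1)
```

The filtered variance p_t decreases toward a steady state while the mean m_t tracks the observations.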

**Home assignment.** Problems: 8.3, 8.4, 8.9.

**Lecture 15.** Course summary. Discussion of selected home assignment problems.

Last modified: Tue Feb 20 03:47:23 MET 2001