- [20 August] 20 August exam and solutions
- [26 April] 25 April exam and solutions
- [18 January] exam and solutions
- [19 December] a formula sheet that will be provided at the exam;
- [19 December] a list of topics (and hence potential exam questions)
- [13 December] final project has been posted. See the table below in week 50. Deadline for hand-in is 16.00 on 14 January 2019
- [12 December] exam info: you may use your own Chalmers-approved calculator at the exam.
- [4 December] exam info: consultation of notes, books and old exams will NOT be allowed during the exam.
- [2 December] read the minutes of the mid-course meeting with student representatives
- there will be no mini presentations in week 49; please disregard the 15.00-17.00 session on Thursday 6 December. Lectures will, however, follow the usual calendar.
- A syllabus is available (version 26 October)
- The current timetable is at https://cloud.timeedit.net/chalmers/web/public/ri1X50gQ4560YvQQ05Z6870Y0Zy6007345Y51Q685.html
Minor modifications to the above might occur and will be promptly communicated.
Course coordinator: Umberto Picchini
Office hours: on Tuesdays at 15.00-16.00 you can visit either Umberto Picchini (in H3030) or Juan Inda Diaz (in L3062). If you see a queue outside one of the offices, try the other one!
Teaching assistants: Juan Inda Diaz
Student representatives 2018-19 (emails below should be followed by @student.chalmers.se):
Anton Björk, email: anbjork
Stefan Kittler, email: kittler
Linus Lagergren, email: linlag
a monograph for linear models: J.O. Rawlings, S.G. Pantula, D.A. Dickey, "Applied Regression Analysis". This is available as an ebook. Note that we will not follow this book; it is only a suggested resource if you need further support.
a monograph for generalised linear models: Alan Agresti, "Categorical Data Analysis", Wiley. This is available as an ebook. Note that we will not follow this book; it is only a suggested resource if you need further support.
Lecture notes by Prof. Rebecka Jörnsten will be posted on this page.
Software: We will use the statistical package R to analyze data, through the RStudio interface. You will need to install both on your computer; see the instructions.
NO previous knowledge of R is required, but you are encouraged to attend the lab on Thursday 8 November; no further supervised computer lab will be given. Attending the lab on Thursday 8 November will also give you a useful opportunity to form groups for the MiniAnalyses.
Here follows a rough list of topics covered in the course: linear models and underlying assumptions; the bias/variance tradeoff; properties of least squares estimators. Outliers and diagnostics. Residuals and leverage. Variance decomposition and the R-squared. Hypothesis testing, global F test; confidence and prediction intervals. Multivariate regression. Multicollinearity. The hat matrix. Model selection via backward/forward/stepwise procedures. Prediction error, cross-validation and Mallows' Cp. Categorical covariates. Interactions. Regression via regularization. Bayesian regression.
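As a taste of the workflow, here is a minimal sketch of a simple linear regression in R. It uses R's built-in cars dataset (speed vs stopping distance) purely as a hypothetical stand-in for the course data:

```r
# Minimal sketch: least-squares fit of a simple linear model.
# cars is a built-in R dataset, used here only for illustration.
fit <- lm(dist ~ speed, data = cars)   # least-squares fit
coef(fit)                              # intercept and slope estimates
summary(fit)$r.squared                 # R-squared: variance explained
```

The same pattern (formula, data frame, `lm`, `summary`) carries through most of the analyses in the course.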
Lecture notes, codes and MiniAnalyses
lecture 1: Notes1.pdf, Lecture1.R,
lecture 2: Demo1.R, sleeptab.dat, Notes2.pdf, Lecture2.R, slides_2.pdf
There will be a computer lab 8 Nov at 15.15! Lab material: lab0.pdf, lab1.pdf, repair1.txt
Mini1.pdf due 16 November (see the notes and the slides). Data: kc_house_data.csv, TV.dat
lecture 3: Notes3.pdf,
lecture 4: Notes4.pdf, slides_4.pdf, Demo3.R, Lecture4.R
lecture 3: we completed Notes2.pdf; that is, besides what's in slides_3.pdf, we covered diagnostics and leverage. We then started Notes3.pdf with the expectation/variance of the predictions, the expectation/variance of the residuals, and standardised residuals.
lecture 4: we completed Notes3.pdf and started Notes4.pdf, skipping the F-test, up to and including section 3.2
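Leverage, covered above, can be computed directly from the hat matrix H = X(X'X)⁻¹X'. A minimal sketch with made-up data (all values hypothetical), including one deliberately extreme x-value:

```r
# Leverage via the hat matrix; the point x = 10 is far from the rest
# and should show high leverage.
x <- c(1, 2, 3, 4, 10)
y <- c(1.1, 1.9, 3.2, 3.9, 10.5)
X <- cbind(1, x)                          # design matrix with intercept
H <- X %*% solve(t(X) %*% X) %*% t(X)     # hat (projection) matrix
lev <- diag(H)                            # leverages; they sum to p = 2
# hatvalues() on an lm fit returns the same quantities:
all.equal(lev, unname(hatvalues(lm(y ~ x))))
```

In practice you would use `hatvalues()` directly; the explicit matrix computation is only to connect with the notes.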
Mini2.pdf due 23 November.
lecture 5: Notes5.pdf, slides_5.pdf, Demo4.R, multicollinear.R,
lecture 6: Notes6.pdf, Notes7.pdf, slides_6.pdf, global+partial_Ftests.R, Lecture6.R, Lecture7.R SA.dat, cars.dat (previously missing)
lecture 5: confidence and prediction intervals in simple linear regression; multivariate regression, matrix notation and model construction; least squares estimation and properties of the parameter estimates; sampling distributions; confidence and prediction intervals; t-test; global F test.
lecture 6: partial F test; greedy search via backward selection; back to the bias/variance tradeoff
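The distinction between a confidence interval (for the mean response) and a prediction interval (for a new observation) can be seen directly in R. A sketch using the built-in cars data (a hypothetical stand-in, not a course dataset):

```r
# Confidence vs prediction intervals at a new covariate value.
fit <- lm(dist ~ speed, data = cars)
new <- data.frame(speed = 15)
ci  <- predict(fit, new, interval = "confidence")  # for the mean response
pri <- predict(fit, new, interval = "prediction")  # for a new observation
ci
pri  # same point estimate, but always wider than the confidence interval
```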
Mini3.pdf (updated) due 30 November.
Lecture 7: Notes10.pdf, Notes11.pdf, slides_7.pdf, auction.R, auction.dat
Lecture 8: slides_8.pdf, Lecture7.R (new version), rsquaredAICstep.R, AkaikeEasyIntro.pdf
Lecture 7: training and testing data; all-subsets regression; estimation of pMSE; began discussing categorical covariates and their parametrization.
Lecture 8: more on categorical variables; models with categorical covariates and numerical ones; interactions; Kullback-Leibler discrepancy, likelihood function and information criteria (AIC, BIC); back to R-squared and adjusted R-squared
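A minimal sketch of categorical covariates, interactions, and model comparison by AIC, using R's built-in mtcars data (a hypothetical stand-in for the course datasets):

```r
# Categorical covariate (factor), interaction, and AIC comparison.
mtcars$cyl_f <- factor(mtcars$cyl)         # cylinders as a categorical covariate
m1 <- lm(mpg ~ wt, data = mtcars)
m2 <- lm(mpg ~ wt + cyl_f, data = mtcars)  # dummy coding: group-specific intercepts
m3 <- lm(mpg ~ wt * cyl_f, data = mtcars)  # interaction: slope differs per group
AIC(m1, m2, m3)                            # lower AIC = better fit/complexity trade-off
```

Note how the factor expands into dummy variables automatically: `m2` has one intercept shift per non-baseline group, and `m3` additionally lets the slope of `wt` vary across groups.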
No mini released for next week.
Lecture 9: Notes8.pdf, Lecture8.R,
Lecture 10: Notes17.pdf, Lecture17.R; slides_10.pdf, (optional) see the book by Agresti, chapters 4 and 14.4; poissregr-awards.R, poisson_sim.csv, poisson_nb.R, f10.txt, f10b.txt
Lecture 9: K-fold cross-validation; leave-one-out cross-validation; choosing K and its consequences for prediction variance; brushing up diagnostics (again leverage and residuals) and introducing new tools: studentised residuals, Cook's distance, DFBETAs.
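K-fold cross-validation can be sketched in a few lines of R; the data below are simulated and all names are hypothetical:

```r
# K-fold cross-validation estimate of the prediction MSE.
set.seed(1)
n <- 100
x <- runif(n)
y <- 2 + 3 * x + rnorm(n)                     # true noise variance is 1
K <- 5
fold <- sample(rep(1:K, length.out = n))      # random fold assignment
cv_err <- numeric(K)
for (k in 1:K) {
  train <- fold != k
  fit   <- lm(y ~ x, subset = train)          # fit on K-1 folds
  pred  <- predict(fit, newdata = data.frame(x = x[!train]))
  cv_err[k] <- mean((y[!train] - pred)^2)     # squared error on the held-out fold
}
mean(cv_err)                                  # estimated prediction MSE, near 1 here
```

Setting K = n gives leave-one-out cross-validation as a special case.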
Lecture 10: introduced generalised linear models (GLMs) and the exponential family; maximum likelihood estimation; asymptotic properties of the MLE and hence asymptotic properties of estimators for GLMs; Poisson regression;
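A minimal Poisson-regression sketch with simulated counts (hypothetical data, not from the course); with enough data the maximum likelihood estimates land close to the true coefficients:

```r
# Poisson regression (log link) on simulated count data.
set.seed(2)
n <- 200
x <- rnorm(n)
y <- rpois(n, lambda = exp(0.5 + 0.8 * x))   # true coefficients: 0.5 and 0.8
fit <- glm(y ~ x, family = poisson)
coef(fit)        # maximum likelihood estimates, close to (0.5, 0.8)
summary(fit)     # standard errors from the asymptotic MLE theory
```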
mini4.pdf due Friday 14 December.
Lecture 11: slides_11.pdf, poissregr-awards.R (updated), poisson_nb.R (updated)
Lecture 12: slides_12.pdf, poisson_nb.R (updated), shipdamage.R (updated)
Lecture 11: testing over/under-dispersion for Poisson regression; standard errors and confidence intervals for GLMs; Wald test; deviance; likelihood-ratio test; Poisson regression with an offset term.
Lecture 12: negative binomial distribution and regression; using an offset term with negative binomial regression; diagnostics for GLMs and their limitations (leverage, several types of residuals, Cook's distance).
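The offset term and a Pearson-based dispersion check can be sketched as follows; the data are simulated and all names hypothetical:

```r
# Poisson regression with an offset for exposure, plus a quick
# Pearson-based dispersion estimate.
set.seed(3)
n <- 300
expo <- runif(n, 1, 10)                     # exposure, e.g. time at risk
x <- rnorm(n)
y <- rpois(n, lambda = expo * exp(0.2 + 0.5 * x))
fit <- glm(y ~ x + offset(log(expo)), family = poisson)
coef(fit)                                   # close to the true (0.2, 0.5)
disp <- sum(residuals(fit, type = "pearson")^2) / fit$df.residual
disp                                        # near 1 when there is no overdispersion
```

A dispersion estimate well above 1 would suggest switching to a negative binomial model, as discussed in Lecture 12.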
project18.pdf, data18.txt; deadline 14 January at 16.00
Lecture 13: slides_13.pdf, USpresidentsHeights_Stan.R, USheights_model.stan, carsdata_withpriors.R, carsdata_withpriors.stan
Lecture 13: this is the last lecture and will consider Bayesian methods. The material covered is NOT part of the examination.
The learning goals of the course can be found in the course plan.
MiniAnalyses are computer exercises: you will have a week to perform an analysis task and prepare slides to present in class on FRIDAYS. It is mandatory to present a few times (how many will depend on this year's number of students). You must work in teams; teams will be selected at random at the Friday Mini sessions. We discuss your findings in class.
Slides should be emailed to Umberto Picchini ahead of class or brought on a USB stick or be made available online through
If you do not attend the Friday sessions (attendance will be taken) you will have to write a report, which is a lot more work!
The Minis contribute 10% to your final grade, and the minis will also be part of your final project examination. The project counts for 40% of your grade and the written final exam for 50%.
In the Chalmers Student Portal you can read about when exams are given and what rules apply to exams at Chalmers. In addition, there is a schedule of when exams are given for courses at the University of Gothenburg.
Before the exam, it is important that you sign up for the examination. If you study at Chalmers, you do this via the Chalmers Student Portal; if you study at the University of Gothenburg, you sign up via GU's Student Portal, where you can also read about the rules that apply to examination at the University of Gothenburg.
At the exam, you should be able to show valid identification.
After the exam has been graded, you can see your results in Ladok by logging on to your Student portal.
At the annual (regular) examination:
When it is practical, a separate review is arranged. The date of the review will be announced here on the course homepage. Anyone who cannot participate in the review may afterwards retrieve and review their exam at the Mathematical Sciences Student office. Check that you have the right grades and score. Any complaints about the marking must be submitted in writing at the office, where there is a form to fill out.
At re-exams: Exams are reviewed and retrieved at the Mathematical Sciences Student office. Check that you have the right grades and score. Any complaints about the marking must be submitted in writing at the office, where there is a form to fill out.
Old exams: April 2019, January 2019, Final Jan 2018 (includes solutions in italic), Final Jan 2017, Final Aug 2016, Final Jan 2016
Solutions: April 2019, January 2019, Final Jan 2017 (merged exam and solutions), Final Aug 2016, Final Jan 2016