Schedule for reading week 2 (week 37): Tuesday 13:15-15 (MVF23), Wednesday 10-11:45 (MVL22), Thursday 13:15-15 (MVL22)
Schedule for reading week 3 (week 38): Wednesday 10-11:45 (discussion of the network design problem), Thursday 13:15-15
Schedule for reading week 4 (week 39): Tuesday 13:15-15, Thursday 13:15-15 guest lecture by Michael Patriksson.
Schedule for reading week 5 (week 40): Tuesday 13:15-15 guest lecture by Ann-Brith Strömberg, FCC, Wednesday 10-11:45 (presentation and discussion of project 1).
Schedule for reading week 6 (week 41): Tuesday 13:15-15, Wednesday 10-11:45
Schedule for reading week 7 (week 42): Wednesday 13:15-15
Schedule for reading week 8 (week 43): Wednesday morning: announcement of the competition problem; 15:00: discussion and presentation of project 2.
Location: MVL22.
Important dates and deadlines:
Deadlines for project 1: program: 23 September; report: 30 September; discussion session of project 1: Wednesday 3 October.
Deadlines for project 2: program and report: 21 October; competition and discussion: 23 October.
For more difficult problems, we review classic relaxation methods. We investigate how to prove that a problem is difficult by relating it to a known difficult one by means of "polynomial transformations". We take a closer look at the discrete network design problem, and discuss its properties. Literature: Notes.
For a basic problem in the field of facility location, we take a closer look at the workings of a Lagrangian relaxation method from the literature. Literature: Barcelo.
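As an illustration of what such a relaxation looks like, here is one standard formulation of the uncapacitated facility location problem and its Lagrangian relaxation; this is a sketch of the general idea, not necessarily the exact model treated in Barcelo.

```latex
% Uncapacitated facility location: open facilities i (fixed cost f_i)
% and assign each customer j to an open facility (cost c_{ij}).
\begin{align*}
  \min_{x,y}\ & \sum_i f_i y_i + \sum_i \sum_j c_{ij} x_{ij} \\
  \text{s.t.}\ & \sum_i x_{ij} = 1 \quad \forall j, \\
               & x_{ij} \le y_i, \qquad x_{ij},\, y_i \in \{0,1\}.
\end{align*}
% Relaxing the assignment constraints with multipliers v_j yields a
% dual function whose evaluation decomposes over the facilities:
\[
  L(v) \;=\; \sum_j v_j
       \;+\; \sum_i \min\Bigl\{\, 0,\; f_i + \sum_j \min\{0,\, c_{ij} - v_j\} \Bigr\}.
\]
```

Facility i is opened in the subproblem exactly when its fixed cost plus the sum of its negative reduced assignment costs is negative, so each evaluation of L(v) is solved by inspection.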
Downloads for Lecture 1: PS or PDF-format.
On the network design problem: PS or PDF-format.
Downloads for Lecture 2: PS or PDF-format.
We present methodologies for solving Lagrangian dual problems, in particular the classes of subgradient optimization and bundle algorithms. Literature: The Book.
Downloads for Lecture 2-3: PS or PDF-format.
For applications to integer programming, we specialize Lagrangian duality. In particular, we discuss the strength of different relaxations and relate Lagrangian relaxation to continuous relaxations and to the convexification of the original problem. Literature: Notes.
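The key relations can be summarized as follows (this is the classical result due to Geoffrion). For the problem of minimizing c'x subject to Ax >= b and x in a finite integer set X, with the constraints Ax >= b Lagrangian relaxed:

```latex
% Lagrangian dual:
%   z_{LD} = \max_{u \ge 0} \min_{x \in X} \{ c^\top x + u^\top (b - Ax) \}.
% Its value equals that of minimizing over the convexified set:
\[
  z_{LD} \;=\; \min \{\, c^\top x : Ax \ge b,\ x \in \operatorname{conv} X \,\},
\]
\[
  z_{LP} \;\le\; z_{LD} \;\le\; z_{IP},
\]
% with z_{LP} = z_{LD} whenever X has the integrality property,
% i.e. conv X coincides with the continuous relaxation of X.
```

So the Lagrangian bound is never weaker than the LP bound, and it is strictly stronger exactly when convexifying X cuts off part of the continuous relaxation.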
Downloads for Lecture 3-4: PS or PDF-format.
We discuss the topic of primal recovery, that is, how to obtain an optimal solution to the primal problem. We discuss the use of Lagrangian heuristics as well as the use of ergodic sequences and master problems, thereby already touching on the subjects of price decomposition, Dantzig-Wolfe decomposition, and column generation. Literature: Notes, Larsson, Patriksson, and Strömberg (1999).
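For reference, the ergodic sequences referred to are, roughly, step-length-weighted averages of the subproblem solutions x^t produced along the subgradient iterations (see Larsson, Patriksson, and Strömberg (1999) for the precise conditions):

```latex
\[
  \bar{x}^{\,k} \;=\; \frac{\sum_{t=0}^{k-1} \alpha_t\, x^t}{\sum_{t=0}^{k-1} \alpha_t},
\]
% where \alpha_t is the step length used in iteration t; under suitable
% step-length rules \bar{x}^k converges to an optimal solution of the
% convexified primal problem, even though no individual x^t need be
% primal feasible.
```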
Downloads for Lecture 4-5: The paper by Larsson, Patriksson, and Strömberg is found here in PDF format.
Downloads for Lecture 5: PS or PDF-format. Warning! The PS file is big!
Downloads for Lecture 5: The paper by Larsson and Patriksson is found here in PDF format.
Downloads for Lecture 6-8: PS or PDF-format.
Downloads for the lecture by Ann-Brith Strömberg (2 parts): PDF1 and PDF2.
This lecture will probably be replaced by a lecture about "Cost Propagation for Optimization problems", the research project/Licentiate thesis of Birgit Grohe.
The exam will be arranged as follows: apart from questions on the projects, and any side questions that may be relevant, the main part of the exam will consist of questions from the above study material. A selection will be made at random among these, ensuring that the three (3) questions selected cover the course material well enough and also include both grade 4 and grade 5 questions. (A "question" refers to a chapter in the study material.)
The project description is found here: Swedish PS, Swedish PDF-format, or English PS, English PDF-format.
The problem files are here: gsp.c, sph.c, visagrid.m, p6.m, p10.m, p11.m. (The file gsp.c should be compiled in Matlab ("mex gsp.c") in order to create the gsp command.)
Important dates: Deadline for the Matlab program and its description: date will be published later. Deadline for the complete LaTeX-based report: date will be published later. The report must include Matlab-supported graphics of the best feasible solution for each problem, convergence charts for the subgradient algorithm (together with a discussion of your experience using it!), and a discussion of your feasibility heuristic, its complexity, and your experience with it (for example, did it always manage to find a feasible solution?).
Based on the codes available, you are to implement your own method, in which you may tune the parameters on a set of test problems. The recommended approach is to use the C code to generate good Lagrangian dual solutions on file, import them into Matlab together with the problem file, and then construct your heuristic entirely within Matlab. (Your algorithm does not, of course, have to be based on the Lagrangian dual solution!) If you are familiar with Cplex and C, you can instead add the corresponding lines of code to the C program given.
The competition is then as follows: Once all groups have finished coding and tuning their algorithms, a small set of new test problems will be solved with the current parameter settings fixed. Two winners will be announced: the group that on average reaches the best solutions within a fixed CPU time, and the group that on average reaches the final solution the quickest, under a solution quality constraint.
The master's thesis describing the C code is found here: PDF-format.
The source code and data files used in the thesis are found in the following four tar-files: Bundle, Examples, Hidden, and Subgradient.
IMPORTANT NOTE! The above code works with so-called "dense" matrix algebra, which means that the matrix A is stored in dense form. This is fine when the problem is small, but for huge problems the matrix-vector multiplications involved in the algorithms would take forever. There is a "sparse" form of storage and computation that exploits the fact that, for huge problems, the matrix A typically contains perhaps 5% or fewer nonzero elements. In order to work with the larger files, you should also download the following tar-file and extract the corresponding programs into a special sub-directory for sparse computations: Sparse Bundle.
For a file bla.tar, do 'tar xvf bla.tar' and you should get a subdirectory called Bla. Use the instructions in the thesis to find everything you need; look especially at Section 5 (5.4.1, 5.5 in particular), and the numerical results found in Appendix D. Finally, use the following link to find further test problems: Beasley's OR-library on the set covering problem. Please also browse the web for further problems that you may test.
Note that unless you have set the path for the programs 'bundle' and 'sg', you must run them from their respective locations using the commands './bundle' and './sg'; otherwise Unix will not find them.
The best team will be awarded a nice prize!
The TRULY final problem to be solved is the second largest (in the size n) of the problems you have encountered AND which the bundle code together with CPLEX can handle! You must all agree on which problem this is! :-)
BONUS problem! You are also requested to run your solver, with the same settings, on the problem scpcyc10.txt, whose input format is given here. Why? Because it is a problem specifically constructed to give primal heuristics a hard time, it is truly large, and I would like to see how your solver copes. NOTE: If the file format proves too difficult to handle, you may skip this problem.
There are three wall-clock time limits for the first problem: (a) Counting from the initialization of the problem: 1 hour. [This favours heuristics that provide answers quickly.] (b) Counting from the completion of the bundle code: 1 hour. [This favours heuristics that utilize information from the bundle code and provide more accurate solutions, but perhaps less quickly.] (c) Counting from the completion of the (first) run of the bundle code: 15 minutes. [This favours heuristics that utilize information from the bundle code AND are fast.]
The final winner is decided based on all three results; the objective function(s) the examiner uses is (are) not disclosed.
On the report: Provide (1) a sufficiently good summary of the heuristics that you first tested but discarded, and the reasons why; (2) the same for the heuristic that you finally settled on, including the final parameter settings; (3) for each problem tested with the final method - not only the competition ones - the following information: primal objective values plotted against iterations and/or CPU times, tables of the best solutions found in the literature and by yourselves, INCLUDING lower and upper bounds, and - not least important - a summary explanation/interpretation of your results. [Why did your heuristic work well on some problems and not on others?]
Do not forget to hand over the final report to your opponents well before the seminar! Also, email the final report to the examiner AND bring him a paper copy at the seminar!
Good Luck!