I will try to keep a summary of the things we have done. This list is not guaranteed to be exhaustive, but maybe it will be helpful. The vast majority of what I have presented is in Durrett's book.

1. Review of parts of measure theory, integration theory, and various convergence definitions.
2. Statement and proof of the two Borel-Cantelli lemmas.
3. Statement and proof of the generalized Chebyshev inequality.
4. Statement (but no proof) of the pi-lambda theorem, with an application to the fact that a probability measure is determined by its values on a generating pi-system.
5. Statement and proof of Kolmogorov's 0-1 law for tail events. We proved this by first showing that any event involving countably many random variables can be approximated by events involving only finitely many of them (and outlined how the pi-lambda theorem gives this latter fact).
6. Statement and proof of the weak law of large numbers (WLLN) under a finite variance assumption.
7. A proof of the Weierstrass approximation theorem (the polynomials are uniformly dense in the space of continuous functions on [0,1]) which used Chebyshev's inequality.
8. Statement and proof of a form of the WLLN under the assumption that the tail of the distribution decays faster than 1/x (via a truncation argument).
9. Gave an example showing that this assumption is strictly weaker than assuming a finite first moment. Also stated (but did not prove) that the above tail condition is necessary for a certain type of WLLN.
10. Showed how the WLLN under just a finite first moment assumption follows from the previous result by applying Lebesgue's dominated convergence theorem twice.
11. Proved that for the St. Petersburg game, S_n/(n log_2 n) goes to 1 in probability.
12. Quickly mentioned some surprising facts about the possibility of having two fair machines in a casino such that your winnings nevertheless go to infinity a.s.
13.
Stated but did not prove that, for the coupon collector problem, T_n/(n log n) approaches 1 in probability, where T_n is the time to collect all n coupons. This illustrated why one should consider triangular arrays.
14. Statement and proof of the strong law of large numbers (SLLN) under a finite 4th moment condition.
15. Statement and proof of the SLLN under a finite 2nd moment condition.
16. Kolmogorov's inequality.
17. Random series: a sum of independent mean-0 random variables converges a.s. if the sum of the variances converges.
18. Statement of Kronecker's lemma. Proof not done; referred to Durrett.
19. Showed how the random series theorem and Kronecker's lemma easily give the SLLN under a finite second moment assumption.
20. Statement and proof of the SLLN under a finite 1st moment condition; 5 steps to the proof.
21. For mean-0, finite-variance i.i.d. random variables, we showed (again using the random series theorem and Kronecker's lemma) that S_n/sqrt(n (log n)^(1+eps)) converges to 0 a.s., a big strengthening of the SLLN. Then we stated the law of the iterated logarithm.
22. Introduced vague convergence of subprobability measures and convergence in distribution, and gave 4 or 5 equivalent definitions of these notions. Stated the central limit theorem. Stated Helly's theorem, which is the sequential compactness of the set of all subprobability measures under vague convergence. Stated and proved that tightness of a collection of probability measures is equivalent to every convergent subsequence converging to a probability measure.
23. Discussed some concrete examples of convergence in distribution: the birthday problem, the maximum of n exponentials, and the arcsine laws for coin tossing.
24. Introduced the characteristic function of a random variable (a.k.a. the Fourier transform) and explained some of its basic properties.
25. The inversion theorem for characteristic functions, which implies that a distribution is uniquely determined by its c.f.
26. If the c.f. is integrable (i.e., in L^1), then the distribution function is continuously differentiable, which means the distribution has a continuous density.
27. If U is uniform on [-1,1], its c.f. is sin(t)/t, which is not in L^1. The above theorem already implies it cannot be in L^1, since the density is discontinuous.
28. Related the structure of the atoms of a distribution to its c.f.
29. Stated and proved that if we have convergence in distribution, then the characteristic functions converge pointwise.
30. Stated and proved the main theorem characterizing convergence in distribution in terms of convergence of c.f.'s (the converse of 29): if f_n is the c.f. of X_n and f_infty is the c.f. of X_infty, then pointwise convergence of f_n to f_infty implies that X_n converges to X_infty in distribution.
31. In fact, we proved something stronger than 30: if a sequence of c.f.'s converges pointwise to a function which is continuous at 0, then the limit must be the c.f. of some random variable, and one then has convergence in distribution.
32. A key step for 31 is relating the tail behavior of a distribution to the behavior of its c.f. near 0. This ensures tightness.
33. Stated that if X has a finite kth moment (k an integer), then the c.f. is k times continuously differentiable. The proof (which we did not do) exchanges the order of integration and differentiation, which is justified by the Lebesgue dominated convergence theorem (|x|^k is the dominating function).
34. Stated a particular version of Taylor's theorem with remainder term.
35. Reproved the weak law of large numbers using characteristic functions.
36. Computed the c.f. of the standard normal.
37. Stated and proved the central limit theorem using characteristic functions and 30 above.
38. Proved that there are three random variables X, Y, Z such that Y and Z have different distributions but X+Y and X+Z have the same distribution (in each sum, the two random variables are taken to be independent). This was proved via c.f.'s; the first step was to show that the tent function is a c.f.

THINGS TO COME

39.
Statement of the Lindeberg-Feller theorem without proof. Show that it implies the CLT, and give an example.
40. Definition of Brownian motion. Nondifferentiability of paths and some other interesting properties.
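To make items 3 and 6 concrete, here is a minimal simulation sketch (my own, not from the lectures; the Bernoulli(1/2) setup and the names `sample_mean` and `deviations` are invented for illustration) comparing the deviation of the sample mean from 1/2 with the Chebyshev bound sigma^2/(n eps^2):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def sample_mean(n):
    # mean of n i.i.d. Bernoulli(1/2) draws; here sigma^2 = 1/4
    return sum(random.random() < 0.5 for _ in range(n)) / n

# Chebyshev (item 3): P(|S_n/n - 1/2| >= eps) <= sigma^2 / (n * eps^2),
# so the deviation probability goes to 0 as n grows -- the WLLN of item 6
eps = 0.01
deviations = {n: abs(sample_mean(n) - 0.5) for n in [100, 10_000, 1_000_000]}
for n, dev in deviations.items():
    print(n, dev, min(1.0, 0.25 / (n * eps**2)))
```

At n = 10^6 the Chebyshev bound is already 0.0025, and the observed deviation is typically smaller still; Chebyshev is crude but sufficient for the WLLN.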
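The coupon collector limit of item 13 is also easy to see numerically. A rough simulation sketch (my own; the function name `collect_time` is invented for illustration) of the ratio T_n/(n log n):

```python
import math
import random

random.seed(1)  # reproducible run

def collect_time(n):
    # number of uniform draws needed to see all n coupon types (T_n in item 13)
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(random.randrange(n))
        draws += 1
    return draws

# T_n / (n log n) should be close to 1 for large n (convergence in probability)
ratios = {n: collect_time(n) / (n * math.log(n)) for n in [100, 1000, 10_000]}
for n, r in ratios.items():
    print(n, round(r, 3))
```

Since E[T_n] = n(1 + 1/2 + ... + 1/n) ~ n(log n + 0.577...), the ratio approaches 1 from above rather slowly.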
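Items 27, 30, and 37 fit together in a one-line computation: for i.i.d. Uniform[-1,1] summands, the c.f. of S_n/sqrt(n) is (sin(t/sqrt(n))/(t/sqrt(n)))^n, and by the continuity theorem of item 30 it should approach the Gaussian c.f. exp(-sigma^2 t^2/2) with sigma^2 = 1/3. A quick numerical sketch (my own, under these assumptions; `phi_uniform` and `cf_values` are invented names):

```python
import math

def phi_uniform(t):
    # c.f. of U ~ Uniform[-1, 1]: sin(t)/t (item 27), with phi(0) = 1
    return math.sin(t) / t if t != 0 else 1.0

# c.f. of S_n/sqrt(n) is phi(t/sqrt(n))^n; it should approach exp(-t^2/6),
# the c.f. of N(0, 1/3) -- the characteristic-function route to the CLT (item 37)
t = 2.0
cf_values = {n: phi_uniform(t / math.sqrt(n)) ** n for n in [10, 100, 10_000]}
gaussian_limit = math.exp(-t * t / 6)
for n, v in cf_values.items():
    print(n, round(v, 6), round(gaussian_limit, 6))
```

Note that sin(t)/t is real here because the uniform distribution is symmetric; in general the c.f. is complex-valued.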