Lecture 1. Introduced Boolean functions and gave six classical examples. Defined pivotality and influence. More or less Sections 1-4 (and some of Section 5) of Chapter 1 in GS, without proofs. Stated/showed that the influences of the majority function are of order const/sqrt(n). Stated that const/n is a lower bound on the maximum influence. Stated that the tribes function has maximum influence of order const log n/n. Stated the big KKL result, which says that const log n/n is also a lower bound on the maximum influence. Defined noise sensitivity and deterministic complexity, but that was just an overview and I will repeat it when we get there.

Lecture 2.
1. Proof of the const/n lower bound on the maximum influence for "nondegenerate" Boolean functions. This is Theorem I.13 in GS, which is called a discrete Poincare inequality since it is similar to the Poincare inequality in analysis.
2. Showed that the tribes function gives an example whose maximum influence is of order log n/n. (A brute-force influence computation for small n is sketched in the code after Lecture 6 below.)
3. Showed that any Boolean function is expressible as a multilinear polynomial, which is essentially its Fourier (or Fourier-Walsh) expansion. See Section 1 of Chapter IV in GS and Sections 1.2-1.4 in OD. This was first shown on purely algebraic grounds, where the characters were shown to be a basis; then again, once we put the natural probability measure on the domain, the characters became an orthonormal basis.
4. Derived various formulas involving the Fourier coefficients of a Boolean function, including Parseval's formula: the expectation of the square of any real-valued function is the sum of the squares of its Fourier coefficients.
5. Began to relate the notion of influence to the behavior of the Fourier expansion. Section 4 of Chapter 4 in GS.

Lecture 3. Section 4 of Chapter 4 in GS: representing the influence of a variable, and the total influence, in terms of the Fourier coefficients. (These formulas are checked numerically in the second sketch after Lecture 6.) The statement of, and some explanation for, the hypercontractivity theorem, Theorem V.2 in Chapter 5, Section 2 of GS; its proof is in Chapter 5, Section 6, which I left to you. Proof of the KKL theorem: it is stated in Chapter 1, Section 4 of GS, and the proof is in Chapter 5, Section 3.1 of GS.

Lecture 4. Statement, without proof, that there are sets of n/log n variables whose influence is close to 1. (This is not described in either book.) We proved that for monotone functions into {1,-1}, the ith influence equals the Fourier coefficient of the singleton set {i}. As a consequence, we showed that the total influence of a monotone function is at most sqrt(n). This is Chapter 4, Sections 4-5 in GS. We stated and proved the Margulis-Russo theorem. We stated the sharp threshold result of Friedgut and Kalai (Sections 3.1-3.3 in GS) and suggested why it might be true.

Lecture 5. We proved the "threshold theorem" (all of Chapter 3 except Section 4 in GS). Moved into noise sensitivity, recalling the definitions and defining the "opposite" of noise sensitivity, namely noise stability. We proved, with a recursive argument involving one-dimensional dynamical systems, that iterated 3-majority is noise sensitive. (The recursion is run in the third sketch after Lecture 6.)

Lecture 6. We will prove that (ordinary) majority is noise stable. The only tools from probability you will need are Chebyshev's inequality and the central limit theorem (in the simplest possible setting of coin flips). We then describe noise sensitivity in terms of the spectral measure (Fourier weight): a sequence of functions is noise sensitive iff its spectral measure "goes to infinity". This is Chapter 4, Section 3 in GS. Here it is natural to introduce the "spectral sample" more formally.

Tuesday's lecture was cancelled.
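As a small illustration of the influence computations in Lectures 1-2, here is a brute-force sketch (my own code, not from the course; function names are mine) that computes I_i(f) = P(bit i is pivotal) exactly for small n, for majority and for a toy tribes function.

```python
from itertools import product

def influence(f, n, i):
    # I_i(f) = P(f changes when bit i is flipped), under the
    # uniform measure on {-1,1}^n
    pivotal = 0
    for x in product([-1, 1], repeat=n):
        y = list(x)
        y[i] = -y[i]
        pivotal += f(x) != f(tuple(y))
    return pivotal / 2 ** n

def majority(x):
    return 1 if sum(x) > 0 else -1

def tribes(x, w):
    # +1 iff some block ("tribe") of w consecutive bits is all ones
    return 1 if any(all(b == 1 for b in x[j:j + w])
                    for j in range(0, len(x), w)) else -1

n = 9
print(influence(majority, n, 0))                 # C(8,4)/2^8, of order 1/sqrt(n)
print(influence(lambda x: tribes(x, 3), n, 0))   # tribe size w roughly log_2(n)
```

For majority, a bit is pivotal exactly when the other n-1 bits split evenly, which is where the const/sqrt(n) behavior comes from.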
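The Fourier-Walsh expansion from Lecture 2 and the influence formulas from Lecture 3 can be checked numerically on small examples. A minimal sketch (again my own code), using f_hat(S) = E[f * chi_S] with chi_S(x) = prod over i in S of x_i:

```python
from itertools import product, combinations
from math import prod

def fourier(f, n):
    # f_hat(S) = E[f(x) * chi_S(x)]; brute force over all 2^n points
    pts = list(product([-1, 1], repeat=n))
    return {S: sum(f(x) * prod(x[i] for i in S) for x in pts) / 2 ** n
            for k in range(n + 1) for S in combinations(range(n), k)}

majority = lambda x: 1 if sum(x) > 0 else -1
n = 5
fhat = fourier(majority, n)

# Parseval (Lecture 2): E[f^2] = sum_S f_hat(S)^2, which is 1 for f into {-1,1}
print(sum(c ** 2 for c in fhat.values()))                # 1.0
# Influence via the spectrum (Lecture 3): I_i(f) = sum over S containing i of f_hat(S)^2
print(sum(c ** 2 for S, c in fhat.items() if 0 in S))    # = I_0(majority)
# Monotone case (Lecture 4): I_i(f) = f_hat({i})
print(fhat[(0,)])                                        # same value again
```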
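The recursive argument for iterated 3-majority in Lecture 5 can literally be run as a one-dimensional dynamical system. The parametrization below is my choice of presentation: using the expansion Maj3(x) = (x1+x2+x3)/2 - x1*x2*x3/2, if the three independent subtree outputs each agree with their noisy copies with correlation rho, the output correlation is (3/4)rho + (1/4)rho^3, which is strictly less than rho for 0 < rho < 1.

```python
def step(rho):
    # output correlation of one 3-majority gate fed three independent
    # rho-correlated input pairs: (3/4)rho + (1/4)rho^3
    return 0.75 * rho + 0.25 * rho ** 3

rho = 0.999          # correlation just below 1, i.e. a tiny amount of noise
for level in range(100):
    rho = step(rho)
print(rho)           # tends to 0: deep trees decorrelate, so the sequence
                     # of functions is noise sensitive
```

Since rho = 1 is a repelling fixed point and rho = 0 is attracting, any fixed noise level is eventually washed out as the tree deepens.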
Lecture 7. We will do Chapter 5, Section 5 in GS, which is considered the main theorem of noise sensitivity: if the sum of the squares of the influences (equivalently, the square of the ell_2 norm of the influence vector) goes to 0, then we have noise sensitivity. We prove this under a certain extra condition. It applies immediately to show that iterated 3-majority and tribes are noise sensitive. The converse holds for monotone functions, but not in general: parity is a counterexample (illustrated in a sketch at the end of these notes). Introduce the concept of property testing. We will do just one result in this area, the BLR test for linearity, which is Section 1.6 in O'Donnell.

Lecture 8. Finishing the proof of the BLR test; the only thing left is the description of P(accept) in terms of the Fourier coefficients. Page 16 in OD; OD Theorem 2.33. (A Monte Carlo sketch of the test appears at the end of these notes.) For majority, we will describe the influences, the total influence, and how much Fourier weight is on the first level, keeping track of constants; Prop 2.58 in OD (the constants are computed in a sketch at the end of these notes). Some discussion concerning the exact "limiting Fourier picture" for majority; see (5.10) on page 109 (122?) in OD. We did a little bit of Section 2.4 in OD.

Lecture 9. All of Section 2.5 in OD, which is Arrow's celebrated impossibility theorem.

Lecture 10. We now move into the last part of the course, which concerns various things about randomized algorithms and their connections with noise sensitivity. This will be, for the most part, 12.10 in GS (with perhaps some supplementary material, and perhaps not doing everything).

Lecture 11. We proved the first part of Theorem VIII.8 in GS, the O'Donnell-Servedio result. We then proved that every transitive Boolean function for which f(1,...,1) differs from f(-1,...,-1), and for which the number of bits d is a prime power, is evasive, meaning that its deterministic complexity is d.

Lecture 12. Will continue with Section 12.10 in GS together with various things from Chapter 8 in GS.

Coming up:

Lecture 13. Will continue with Section 12.10 in GS together with various things from Chapter 8 in GS, and will finish this up.

Lecture 14 (last lecture). I will give an overview of what is known about noise sensitivity and influences in the important context of percolation.
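For the BLR test of Lectures 7-8, the key identity is P(accept) = 1/2 + (1/2) * sum_S f_hat(S)^3: characters always pass, and acceptance probability near 1 forces a single Fourier coefficient to be near 1. A Monte Carlo sketch (my own code, illustrative names):

```python
import random

def blr_accept_rate(f, n, trials=100_000):
    # BLR: pick x, y uniformly at random, accept iff f(x) * f(y) == f(x*y),
    # where x*y is the coordinatewise product
    accepted = 0
    for _ in range(trials):
        x = [random.choice([-1, 1]) for _ in range(n)]
        y = [random.choice([-1, 1]) for _ in range(n)]
        xy = [a * b for a, b in zip(x, y)]
        accepted += f(x) * f(y) == f(xy)
    return accepted / trials

chi = lambda x: x[0] * x[2]               # a character, i.e. "linear": always passes
maj = lambda x: 1 if sum(x) > 0 else -1   # far from linear
print(blr_accept_rate(chi, 5))            # 1.0
print(blr_accept_rate(maj, 5))            # strictly below 1
```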
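The constants for majority from Lecture 8 can be watched converging. Since Maj_n (n odd) is monotone, f_hat({i}) = I_i = C(n-1,(n-1)/2)/2^(n-1), so the total influence grows like sqrt(2n/pi) and the level-1 Fourier weight n * f_hat({i})^2 tends to 2/pi, as in Prop 2.58 of OD. A sketch of the computation (mine):

```python
from math import comb, pi, sqrt

for n in [5, 51, 501, 5001]:
    fhat_i = comb(n - 1, (n - 1) // 2) / 2 ** (n - 1)  # = I_i (majority is monotone)
    print(n,
          n * fhat_i / sqrt(n),   # total influence / sqrt(n): tends to sqrt(2/pi)
          n * fhat_i ** 2)        # Fourier weight on level 1: tends to 2/pi
print(2 / pi)                     # 0.6366..., for comparison
```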
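Finally, the parity counterexample from Lecture 7 (the converse of the main noise sensitivity theorem fails for non-monotone functions) is easy to see numerically: every influence of parity is 1, so the sum of squared influences is n, yet E[f(x)f(y)] = rho^n tends to 0 for rho-correlated inputs, so parity is noise sensitive. A Monte Carlo check (my sketch):

```python
import random
from math import prod

def correlated_pair(n, rho):
    x = [random.choice([-1, 1]) for _ in range(n)]
    # each y_i equals x_i with probability (1+rho)/2, else -x_i
    y = [xi if random.random() < (1 + rho) / 2 else -xi for xi in x]
    return x, y

parity = lambda x: prod(x)

n, rho, trials = 20, 0.9, 200_000
agree = sum(parity(x) == parity(y)
            for x, y in (correlated_pair(n, rho) for _ in range(trials)))
print(agree / trials)   # close to (1 + rho^n)/2 = 0.56...: nearly decorrelated
```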