## Discrete Fourier Analysis

2022 May 17

 O'Donnell, Ryan. Analysis of boolean functions. Cambridge University Press, 2014.

 Filmus, Yuval. "Analysis of Boolean functions." (2016).

 De Wolf, Ronald. "A brief introduction to Fourier analysis on the Boolean cube." Theory of Computing (2008): 1-20.

 De Wolf's survey is short, very concise, and neat, and contains pointers to other materials.

The Fourier basis, the characters \chi_S(x) = (-1)^{\sum_{i \in S} x_i}, is orthonormal under the inner product <f, g> = E_x[f(x)g(x)].

The Fourier transform is a linear map.

The Fourier characters are exactly the linear functions on the Boolean cube. Linearity tests (e.g., the BLR test) essentially test how close a function is to a character. The closeness or distance can be measured by statistical distance, inner product, 2-norm, etc.

For a Boolean function, the set of heavy Fourier coefficients is small: by Parseval, at most 1/\epsilon^2 coefficients can have magnitude at least \epsilon.

Convolution is bilinear, and the Fourier transform turns convolution into pointwise multiplication of coefficients: \widehat{f*g}(S) = \hat{f}(S)\hat{g}(S).

XOR lemma: the bias of the XOR of independent distributions over {0,1} is the product of the individual biases, as seen by testing against the nontrivial character S = {1}. (Testing against the trivial character S = \emptyset always gives 1.)

A min-entropy source has small collision probability: if H_\infty(X) >= k, then \sum_x Pr[X=x]^2 <= 2^{-k}.

Min-entropy extractor: given a min-entropy source X (or, in essence, any source with small collision probability) and an independent small-bias distribution Y, the statistical distance between X \xor Y and uniform is small.

Leftover hash lemma (LHL): first, note that a single deterministic hash function cannot extract random bits from all high-min-entropy sources. A simple thought experiment: fix a hash function; we can always construct a high-min-entropy source on which this function outputs a constant. So instead we use a family of hash functions, or more precisely a universal hash family. The LHL states that a universal hash family can extract randomness from every min-entropy source.

The noise operator T_\rho is a linear map; in the Fourier basis it multiplies the coefficient \hat{f}(S) by \rho^{|S|}.

Norms are monotone: for 1 <= p <= q, the p-norm is no greater than the q-norm (norms taken with respect to the uniform probability measure on the cube).

The noise operator is contractive: for p >= 1, the p-norm of the smoothened function (run through the noise operator) is no greater than the p-norm of the original function.

Hypercontractivity: for 1 <= p < q, the q-norm of the smoothened function is no greater than the p-norm of the original function, provided the noise parameter satisfies \rho <= \sqrt{(p-1)/(q-1)}: that is, \|T_\rho f\|_q <= \|f\|_p. [Theorem 4.1 in ]

The KKL theorem uses a special case of hypercontractivity, with p = 1+\delta and q = 2.

Friedgut: if a Boolean function has total influence t, then it is \epsilon-close to a 2^{O(t/\epsilon)}-junta (a Boolean function that depends on only that many of the coordinates/variables). The proof constructs the junta function explicitly. The bound is tight: for the address/tribes functions, this many variables are needed.

Bourgain: for a balanced Boolean function (equally many inputs mapping to 0 and to 1), the Fourier weight at higher degrees cannot decay too fast unless the function is close to a junta.

Goldreich-Levin hardcore predicate: it lets us construct a PRG from a OWF. Proof by contradiction: if the construction is not secure, then the Goldreich-Levin algorithm lets us invert the OWF.

Overall, Fourier analysis tries to capture the properties of the original function via properties of its Fourier coefficients. The applications, as also mentioned in the references above, include property testing, PCPs, explaining threshold phenomena via the analysis of influences and juntas, an alternative proof of Arrow's theorem, verification of ML algorithms (t-sparse Boolean functions), etc.