Very often you have some norms and you want to compare them.

Definition. Let \(f:\left\{ -1,1\right\}^{n}\to \mathbb{R}\). For \(p\ge 1\), define the \(p\)-norm of \(f\) (with respect to a uniformly random \(x\in\left\{ -1,1\right\}^{n}\)) as \[ ||f||_p = \left(\mathop{\mathrm{\mathbb{E}}}_x |f(x)|^{p}\right)^{1/p}. \]

Example. \[ ||f||_1 = \mathop{\mathrm{\mathbb{E}}}_x |f(x)|, \] \[ ||f||_\infty = \max_x |f(x)|. \]

It is easy to see (e.g., by Jensen's inequality applied to the convex function \(t\mapsto t^{q/p}\)) that for all \(1\le p\le q\) we have \[ ||f||_p \le ||f||_q. \]
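A quick brute-force check of this monotonicity; the particular \(f\) below is an arbitrary illustrative choice:

```python
import itertools

# ||f||_p = (E_x |f(x)|^p)^(1/p), with x uniform on {-1,1}^n,
# computed by exhaustive enumeration of the hypercube.
def p_norm(f, n, p):
    pts = list(itertools.product([-1, 1], repeat=n))
    return (sum(abs(f(x)) ** p for x in pts) / len(pts)) ** (1 / p)

# An arbitrary function on 3 coordinates, just for illustration.
f = lambda x: x[0] + 2 * x[1] * x[2] - 0.5

# The p-norm is nondecreasing in p under a probability measure.
norms = [p_norm(f, 3, p) for p in (1, 2, 4, 8)]
assert all(a <= b + 1e-12 for a, b in zip(norms, norms[1:]))
```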

But what if you wanted the inequality to point the other way? In general a reverse inequality is impossible, but it turns out that for nice enough functions — low-degree polynomials — you can get something close.

Intuition: If \(f = \sum_i a_i x_i\), i.e., a centered degree-1 function, then \(f(x)\) for uniformly random \(x\) in the hypercube should be tightly concentrated around zero: it behaves roughly like a Gaussian with variance \(\sum_i a_i^2 = ||f||_2^2\), so no single large value dominates any of its norms.
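This concentration is easy to see empirically; the sketch below samples such an \(f\) (with the arbitrary choice \(a_i = 1\)) and checks that almost all of the mass lies within a few standard deviations of zero:

```python
import random

random.seed(0)
n = 100
a = [1.0] * n                            # illustrative choice of coefficients
sigma = sum(ai * ai for ai in a) ** 0.5  # = ||f||_2, here 10

# Sample f(x) = sum_i a_i x_i at uniformly random points of {-1,1}^n.
samples = [sum(ai * random.choice((-1, 1)) for ai in a)
           for _ in range(10_000)]

# Hoeffding's inequality predicts almost all samples fall in [-3*sigma, 3*sigma].
frac = sum(abs(s) <= 3 * sigma for s in samples) / len(samples)
assert frac > 0.99
```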

Theorem. If \(f\) has degree at most \(d\) and \(q\ge 2\), then

\[ ||f||_q \le \sqrt{q-1}^{\,d}\, ||f||_2. \]

Proof sketch. Induction on the number of coordinates (splitting off one coordinate at a time), together with the Cauchy–Schwarz inequality.
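A numerical sanity check of the theorem on one (arbitrarily chosen) degree-2 function:

```python
import itertools

# ||f||_p over uniform x in {-1,1}^n, by exhaustive enumeration.
def p_norm(f, n, p):
    pts = list(itertools.product([-1, 1], repeat=n))
    return (sum(abs(f(x)) ** p for x in pts) / len(pts)) ** (1 / p)

# A degree-2 function on 3 bits with arbitrary illustrative coefficients.
f = lambda x: x[0] * x[1] - 0.7 * x[1] * x[2] + 0.3 * x[0]
d, n = 2, 3

# Check ||f||_q <= sqrt(q-1)^d * ||f||_2 for a few values of q >= 2.
for q in (2, 4, 6):
    assert p_norm(f, n, q) <= (q - 1) ** (d / 2) * p_norm(f, n, 2) + 1e-12
```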

Remark. There is another interpretation of this in terms of the noisy hypercube.

The \(\rho\)-noisy hypercube is a Markov chain on \(\left\{ -1,1\right\}^{n}\). Starting from a vertex \(x\), for each coordinate independently you either keep \(x_i\) (with probability \(\rho\)) or resample its value uniformly at random (with probability \(1-\rho\)).

You can think of this as a weighted graph on the hypercube too, with edge weights given by the transition probabilities.

Or as a linear operator \(T_\rho\) that smooths \(f\) by averaging over one step of the chain: \[ (T_\rho f)(x) = \mathop{\mathrm{\mathbb{E}}}_{y}\, f(y), \] where \(y\) is obtained from \(x\) by the noise process above.

One can prove that this smoothing operator is hypercontractive: \[ ||T_\rho f||_q \le ||f||_p \] for all \(1\le p\le q\le \infty\) and \(0\le \rho \le \sqrt{\frac{p-1}{q-1}}\).
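The sketch below computes \(T_\rho f\) exactly by summing over the hypercube (note each \(y_i\) agrees with \(x_i\) with probability \(\rho + \frac{1-\rho}{2} = \frac{1+\rho}{2}\)) and checks the inequality at the extremal \(\rho=\sqrt{(p-1)/(q-1)}\) for \(p=2\), \(q=4\); the test function is an arbitrary choice:

```python
import itertools

# ||f||_p over the uniform distribution, given f's truth table.
def p_norm(vals, p):
    return (sum(abs(v) ** p for v in vals) / len(vals)) ** (1 / p)

# (T_rho f)(x) = E_y f(y), where each coordinate is kept with probability
# rho and resampled uniformly otherwise, so y_i = x_i w.p. (1 + rho) / 2.
def noise_op(f_table, pts, rho):
    keep = (1 + rho) / 2
    out = []
    for x in pts:
        acc = 0.0
        for y, fy in zip(pts, f_table):
            p = 1.0
            for xi, yi in zip(x, y):
                p *= keep if xi == yi else 1 - keep
            acc += p * fy
        out.append(acc)
    return out

n = 3
pts = list(itertools.product([-1, 1], repeat=n))
f_table = [x[0] + x[1] * x[2] - 0.4 for x in pts]  # arbitrary test function

p_, q_ = 2, 4
rho = ((p_ - 1) / (q_ - 1)) ** 0.5                 # largest admissible rho
Tf = noise_op(f_table, pts, rho)
assert p_norm(Tf, q_) <= p_norm(f_table, p_) + 1e-12
```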

Remark. There is an interpretation of this in terms of small-set expanders.

Concretely, if you view \(T_\rho\) as a weighted graph, hypercontractivity implies it is a small-set expander: every sufficiently small set of vertices has most of its edge weight leaving the set.

Remark. The theorem above also yields a Chernoff-like tail bound for low-degree functions: if \(f\) has degree at most \(d\), then \(\Pr_x\left[|f(x)| \ge t\,||f||_2\right]\) decays like \(\exp(-\Omega_d(t^{2/d}))\) for large \(t\).