Gaussian random variables are wonderful, and there are lots of clever tricks for doing computations with them. One particularly nice tool is the Gaussian integration by parts formula, which I learned from my PhD advisor Joel Tropp. Here it is:
Gaussian integration by parts. Let $z$ be a standard Gaussian random variable, and let $f : \mathbb{R} \to \mathbb{R}$ be a differentiable function for which the expectations below are finite. Then

\[ \mathbb{E}[z f(z)] = \mathbb{E}[f'(z)]. \]
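As a quick sanity check, the identity is easy to verify by Monte Carlo. This is just an illustrative sketch; the test function $f(z) = \sin(z)$ is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(1_000_000)

# Check E[z f(z)] = E[f'(z)] for the illustrative choice f(z) = sin(z),
# whose derivative is f'(z) = cos(z).
lhs = np.mean(z * np.sin(z))
rhs = np.mean(np.cos(z))
print(lhs, rhs)  # the two estimates agree up to Monte Carlo error
```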
This formula makes many basic computations effortless. For instance, to compute the second moment of a standard Gaussian random variable $z$, we apply the formula with $f(z) = z$ to obtain

\[ \mathbb{E}[z^2] = \mathbb{E}[z \cdot z] = \mathbb{E}\Bigl[\frac{\mathrm{d}}{\mathrm{d}z} z\Bigr] = \mathbb{E}[1] = 1. \]
The fourth moment is no harder to compute. Using $f(z) = z^3$, we compute

\[ \mathbb{E}[z^4] = \mathbb{E}[z \cdot z^3] = \mathbb{E}[3z^2] = 3. \]
Iterating this trick, we can compute all the even moments of a standard Gaussian random variable. Indeed,

\[ \mathbb{E}[z^{2k}] = \mathbb{E}[z \cdot z^{2k-1}] = (2k-1)\,\mathbb{E}[z^{2k-2}] = (2k-1)(2k-3)\,\mathbb{E}[z^{2k-4}] = \cdots = (2k-1)(2k-3) \cdots 3 \cdot 1 = (2k-1)!!. \]
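The double-factorial formula for the even moments can be checked numerically (a sketch; `math.prod` over the odd integers computes $(2k-1)!!$):

```python
import numpy as np
from math import prod

rng = np.random.default_rng(0)
z = rng.standard_normal(2_000_000)

# Compare Monte Carlo estimates of E[z^(2k)] against (2k-1)!!.
for k in range(1, 5):
    estimate = np.mean(z ** (2 * k))
    exact = prod(range(1, 2 * k, 2))  # (2k-1)!! = 1 * 3 * ... * (2k-1)
    print(f"E[z^{2 * k}]: estimate {estimate:.2f}, exact {exact}")
```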
As a spicier application, let us now compute $\mathbb{E}[|z|]$. To do so, we choose $f$ to be the sign function $f(z) = \operatorname{sgn}(z)$, whose (distributional) derivative is $f'(z) = 2\delta(z)$, where $\delta$ denotes the Dirac delta function. Since $|z| = z \operatorname{sgn}(z)$, Gaussian integration by parts gives

\[ \mathbb{E}[|z|] = \mathbb{E}[z \operatorname{sgn}(z)] = \mathbb{E}[2\delta(z)] = 2 \int_{-\infty}^{\infty} \delta(z) \cdot \frac{1}{\sqrt{2\pi}} e^{-z^2/2} \, \mathrm{d}z = \frac{2}{\sqrt{2\pi}} = \sqrt{\frac{2}{\pi}}. \]
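Again, a one-line Monte Carlo check (illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(1_000_000)

# E[|z|] should match sqrt(2/pi) ~ 0.7979.
print(np.mean(np.abs(z)), np.sqrt(2 / np.pi))
```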
Application: Power Method from a Random Start
As an application of the Gaussian integration by parts formula, we can analyze the famous power method for eigenvalue computations with a (Gaussian) random initialization. This discussion is adapted from the tutorial of Kireeva and Tropp (2024).
Setup
Before we can get to the cool application of the Gaussian integration by parts formula, we need to set up the problem and do a bit of algebra. Let $A \in \mathbb{R}^{n \times n}$ be a matrix, which we'll assume for simplicity to be symmetric and positive semidefinite. Let

\[ \lambda_1 > \lambda_2 \ge \cdots \ge \lambda_n \ge 0 \]

denote the eigenvalues of $A$. We assume the largest eigenvalue $\lambda_1$ is strictly larger than the next eigenvalue $\lambda_2$.
The power method computes the largest eigenvalue of $A$ by repeating the iteration $x \mapsto Ax / \|Ax\|$. After many iterations, $x$ approaches an eigenvector of $A$ and the Rayleigh quotient $x^\top A x / x^\top x$ approaches an eigenvalue. Letting $\omega$ denote the initial vector, the $t$th power iterate is

\[ x_t = \frac{A^t \omega}{\|A^t \omega\|}, \]

and the $t$th eigenvalue estimate is

\[ \mu_t = x_t^\top A x_t = \frac{\omega^\top A^{2t+1} \omega}{\omega^\top A^{2t} \omega}. \]
It is common to initialize the power method with a vector $\omega$ with (independent) standard Gaussian random coordinates. In this case, the components $g_1, \ldots, g_n$ of $\omega$ in an eigenvector basis of $A$ are also independent standard Gaussians, owing to the rotational invariance of the (standard multivariate) Gaussian distribution. Then the $t$th eigenvalue estimate is

\[ \mu_t = \frac{\sum_{i=1}^n \lambda_i^{2t+1} g_i^2}{\sum_{i=1}^n \lambda_i^{2t} g_i^2}. \]
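The setup above translates directly into code. Here is a minimal sketch of the power method with a Gaussian random start (an illustrative implementation, not taken from the tutorial):

```python
import numpy as np

def power_method(A, t, rng):
    """Run t power iterations from a Gaussian start and return the
    Rayleigh-quotient eigenvalue estimate mu_t."""
    x = rng.standard_normal(A.shape[0])  # omega: Gaussian initial vector
    for _ in range(t):
        x = A @ x
        x /= np.linalg.norm(x)  # normalize each step to avoid overflow
    return x @ A @ x / (x @ x)

# Example: a diagonal matrix with eigenvalues 1.0 > 0.9 > ... > 0.1.
rng = np.random.default_rng(0)
A = np.diag(np.linspace(1.0, 0.1, 10))
print(power_method(A, t=50, rng=rng))  # approaches lambda_1 = 1
```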
Analysis
Having set everything up, we can now use the Gaussian integration by parts formula to make quick work of the analysis. To begin, observe that the error satisfies

\[ \lambda_1 - \mu_t = \frac{\sum_{i=2}^n (\lambda_1 - \lambda_i)\, \lambda_i^{2t} g_i^2}{\sum_{i=1}^n \lambda_i^{2t} g_i^2} \le \lambda_1 \cdot \frac{c^2}{\lambda_1^{2t} g_1^2 + c^2}, \quad \text{where } c := \Bigl( \sum_{i=2}^n \lambda_i^{2t} g_i^2 \Bigr)^{1/2}. \]

Here, we used the crude bound $\lambda_1 - \lambda_i \le \lambda_1$ in the numerator.
Now, let us bound the expected value of the error. First, we take an expectation with respect to only the randomness in the first Gaussian variable $g_1$, treating $c$ as fixed. Here, we use Gaussian integration by parts in the reverse direction. Introduce the function

\[ f(x) = \frac{c}{\lambda_1^t} \arctan\Bigl( \frac{\lambda_1^t x}{c} \Bigr), \quad \text{for which} \quad f'(x) = \frac{c^2}{\lambda_1^{2t} x^2 + c^2}. \]

Applying integration by parts in reverse, then using the bound $|\arctan(\cdot)| \le \pi/2$ and the formula $\mathbb{E}[|z|] = \sqrt{2/\pi}$ from above, we obtain

\[ \mathbb{E}_{g_1}\Bigl[ \frac{c^2}{\lambda_1^{2t} g_1^2 + c^2} \Bigr] = \mathbb{E}_{g_1}[f'(g_1)] = \mathbb{E}_{g_1}[g_1 f(g_1)] \le \frac{\pi}{2} \cdot \frac{c}{\lambda_1^t} \cdot \mathbb{E}[|g_1|] = \sqrt{\frac{\pi}{2}} \cdot \frac{c}{\lambda_1^t}. \]
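To see the reverse integration-by-parts step in action numerically, we can check $\mathbb{E}[f'(g)] = \mathbb{E}[g f(g)]$ for a function of this arctan form (the constants below are illustrative stand-ins for $c$ and $\lambda_1^t$):

```python
import numpy as np

rng = np.random.default_rng(0)
g = rng.standard_normal(1_000_000)
a, c = 3.0, 0.5  # illustrative stand-ins for lambda_1^t and c

f = (c / a) * np.arctan(a * g / c)     # f(x) = (c/a) * arctan(a x / c)
f_prime = c**2 / (a**2 * g**2 + c**2)  # f'(x) = c^2 / (a^2 x^2 + c^2)
print(np.mean(f_prime), np.mean(g * f))  # the two estimates agree
```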
We're in the home stretch! We can bound $\mathbb{E}[c]$ as

\[ \mathbb{E}[c] = \mathbb{E}\Bigl[ \Bigl( \sum_{i=2}^n \lambda_i^{2t} g_i^2 \Bigr)^{1/2} \Bigr] \le \Bigl( \sum_{i=2}^n \lambda_i^{2t}\, \mathbb{E}[g_i^2] \Bigr)^{1/2} = \Bigl( \sum_{i=2}^n \lambda_i^{2t} \Bigr)^{1/2} \le \lambda_2^t \sqrt{n-1}, \]

using Jensen's inequality and the concavity of the square root. Combining the previous two displays, we conclude

\[ \mathbb{E}[\lambda_1 - \mu_t] \le \sqrt{\frac{\pi}{2}} \cdot \frac{\lambda_1}{\lambda_1^t} \cdot \mathbb{E}[c] \le \sqrt{\frac{\pi(n-1)}{2}} \cdot \lambda_1 \Bigl( \frac{\lambda_2}{\lambda_1} \Bigr)^t. \]

The error decays geometrically at a rate set by the eigenvalue ratio $\lambda_2/\lambda_1$.
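Putting the pieces together gives a bound of the form $\mathbb{E}[\lambda_1 - \mu_t] \le \sqrt{\pi(n-1)/2} \cdot \lambda_1 (\lambda_2/\lambda_1)^t$. A quick simulation (with an illustrative spectrum and parameters) compares the average error against this expression, working directly with the eigenbasis formula for $\mu_t$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, t, trials = 10, 20, 2000
lam = np.linspace(1.0, 0.1, n)  # eigenvalues lambda_1 = 1.0 > lambda_2 = 0.9 > ...

# mu_t in the eigenbasis: sum(lam^(2t+1) g^2) / sum(lam^(2t) g^2).
g = rng.standard_normal((trials, n))
num = (lam ** (2 * t + 1) * g**2).sum(axis=1)
den = (lam ** (2 * t) * g**2).sum(axis=1)
mean_error = np.mean(lam[0] - num / den)

bound = np.sqrt(np.pi * (n - 1) / 2) * lam[0] * (lam[1] / lam[0]) ** t
print(mean_error, bound)  # the observed mean error sits below the bound
```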
The first analyses of the power method from a random start were done by Kuczyński and Woźniakowski (1992) and require pages of detailed computations involving integrals. This simplified analysis, due to Tropp (2020), is effortless by comparison.