Theorem of the Sum of Normals


Let \(X_1,\,X_2,\dots,X_k\) be independent random variables with \(X_i \sim \mathcal N\!\bigl(\mu_i,\sigma_i^{2}\bigr)\). Then the sum \(S = \sum_{i=1}^{k} X_i\) is again normally distributed:

\[ S \;\sim\; \mathcal N\!\Bigl( \underbrace{\textstyle\sum_{i=1}^{k}\mu_i}_{\mu_S}, \; \underbrace{\textstyle\sum_{i=1}^{k}\sigma_i^{2}}_{\sigma_S^{2}} \Bigr). \]
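As a quick numerical check, here is a minimal sketch (the parameters and sample size are arbitrary choices, not part of the theorem) that draws independent normals with NumPy and compares the sample mean and variance of \(S\) against \(\mu_S\) and \(\sigma_S^{2}\):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary example parameters: k = 3 independent normals.
mu = np.array([1.0, -2.0, 0.5])    # means mu_i
sigma = np.array([0.5, 1.0, 2.0])  # standard deviations sigma_i
n = 1_000_000                      # Monte Carlo sample size

# Draw n samples of each X_i (columns) and sum across the k variables.
X = rng.normal(loc=mu, scale=sigma, size=(n, len(mu)))
S = X.sum(axis=1)

print("sample mean:", S.mean(), " theory:", mu.sum())          # ~ -0.5
print("sample var: ", S.var(),  " theory:", (sigma**2).sum())  # ~ 5.25
```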

The characteristic function of each \(X_i\) is \(\varphi_{X_i}(t)= \exp\!\bigl(i\mu_i t-\tfrac12\sigma_i^{2}t^{2}\bigr)\). Because the variables are independent, \(\varphi_{S}(t) = \prod_{i=1}^{k} \varphi_{X_i}(t)\); that is,

\[ \varphi_{S}(t)= \exp\!\bigl( i\bigl(\textstyle\sum_{i}\mu_i\bigr)t \;-\; \tfrac12\bigl(\textstyle\sum_{i}\sigma_i^{2}\bigr)t^{2} \bigr), \]

which is exactly the characteristic function of a normal distribution with mean \(\mu_S\) and variance \(\sigma_S^{2}\). Since a characteristic function determines its distribution uniquely, it follows that \(S \sim \mathcal N(\mu_S,\sigma_S^{2})\).
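The exponent bookkeeping behind the product \(\prod_i \varphi_{X_i}(t)\) can also be verified symbolically; a minimal sketch with SymPy for \(k = 2\) (the symbol names are ours, chosen for illustration):

```python
import sympy as sp

t = sp.symbols("t", real=True)
mu1, mu2 = sp.symbols("mu1 mu2", real=True)
s1, s2 = sp.symbols("sigma1 sigma2", positive=True)

def phi(mu, sigma):
    # Characteristic function of N(mu, sigma^2).
    return sp.exp(sp.I * mu * t - sp.Rational(1, 2) * sigma**2 * t**2)

# Product of the two characteristic functions ...
product = phi(mu1, s1) * phi(mu2, s2)
# ... equals the characteristic function of N(mu1 + mu2, sigma1^2 + sigma2^2).
target = phi(mu1 + mu2, sp.sqrt(s1**2 + s2**2))

assert sp.simplify(product - target) == 0
```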

Generalization with correlation

If we drop the independence assumption but keep joint normality, collecting the variables in a vector \(X \sim \mathcal N(\boldsymbol\mu,\,\boldsymbol\Sigma)\), the sum \(S = \sum_{i=1}^{k} X_i\) is still normal, but its variance becomes

\[ \sigma_S^{2} \;=\; \mathbf1^{\mathsf T}\boldsymbol\Sigma\,\mathbf1 \;=\; \sum_{i=1}^{k}\sigma_i^{2} \;+\; 2\!\!\sum_{1\le i<j\le k} \rho_{ij}\,\sigma_i\sigma_j, \]

where \(\rho_{ij}\) are the pairwise correlation coefficients. Relative to the independent case, positive correlations increase the variance of the sum; negative ones decrease it.
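A numerical sketch of the correlated case follows the same pattern (the covariance matrix below is an arbitrary, positive-definite example): draw jointly normal samples with NumPy and compare the sample variance of the sum against \(\mathbf 1^{\mathsf T}\boldsymbol\Sigma\,\mathbf 1\):

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary example: k = 3, with one positive and one negative correlation.
sigma = np.array([0.5, 1.0, 2.0])
rho = np.array([[ 1.0, 0.6, -0.3],
                [ 0.6, 1.0,  0.0],
                [-0.3, 0.0,  1.0]])  # correlation matrix
Sigma = rho * np.outer(sigma, sigma)  # Sigma_ij = rho_ij * sigma_i * sigma_j

mu = np.zeros(3)
X = rng.multivariate_normal(mu, Sigma, size=1_000_000)
S = X.sum(axis=1)

ones = np.ones(3)
print("sample var:", S.var(), " theory:", ones @ Sigma @ ones)  # ~ 5.25
```

Here the positive term \(2\rho_{12}\sigma_1\sigma_2 = 0.6\) and the negative term \(2\rho_{13}\sigma_1\sigma_3 = -0.6\) happen to cancel, so the total matches the independent case; changing either \(\rho_{ij}\) shifts the variance accordingly.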
