Now I’m going to introduce something you see a lot of in probability and statistics, as well as other subjects: the normal distribution. Sometimes this is also called a Gaussian distribution. Was Gauss normal? About as normal as any dude can be expected to be who invented or made seminal contributions to potential theory, hypergeometric functions, celestial mechanics, differential equations, conic sections, and magnetism. And that was before lunch. Anyway, one of the things he’s best known for is the Gaussian distribution:

$$p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x-\overline{x})^2}{2\sigma^2}}$$ | (1.56) |

The "standard normal distribution" (does someone have a normality complex here or what?) is the same thing but with $\sigma = 1$ and $\overline{x} = 0$. But what about this odd notation $\sigma^2$ and $\overline{x}$? They're just constants in the above equation, but if you calculate $\langle x \rangle$ using this distribution and eqn. 1.17, you get exactly $\overline{x}$. It's easy to show this by symmetry. How about $\sigma^2$? Well, if you calculate the variance of this distribution using eqn. 1.40, you get $\sigma^2$. That can be done with another badass differentiation trick that I'm not going to do here, but you can play with it if you like.
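If you'd rather not fight the differentiation trick, you can sanity-check both claims numerically. This is a quick sketch (not the analytic derivation): it approximates the integrals for $\langle x \rangle$ and the variance with a Riemann sum, using $\overline{x} = 6$ and $\sigma = 1$ as example values.

```python
import math

def gaussian(x, mean, sigma):
    """The normal density p(x) from eqn. 1.56."""
    return math.exp(-(x - mean) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

mean, sigma = 6.0, 1.0
dx = 0.001
# Integrate over +/- 8 standard deviations; the tails beyond that are negligible.
xs = [mean - 8.0 + i * dx for i in range(int(16.0 / dx))]

# <x> = integral of x p(x) dx, as a Riemann sum.
avg = sum(x * gaussian(x, mean, sigma) * dx for x in xs)
# variance = integral of (x - <x>)^2 p(x) dx.
var = sum((x - avg) ** 2 * gaussian(x, mean, sigma) * dx for x in xs)

print(avg, var)  # both come out extremely close to 6 and 1
```

The sums land on $\overline{x}$ and $\sigma^2$ to several decimal places, which is as much confirmation as a lazy afternoon needs.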

Another important fact: this definition is normalized correctly. We know from section 1.12 that the area under $p(x)$ must be 1, and if you do the integral, that's what you get. That's the reason for the prefactor of $1/\sqrt{2\pi\sigma^2}$ in the definition.
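You can check the normalization numerically too. This sketch sums the standard normal ($\overline{x} = 0$, $\sigma = 1$) over $\pm 8$ standard deviations, where essentially all of the area lives:

```python
import math

def gaussian(x, mean, sigma):
    """The normal density p(x) from eqn. 1.56."""
    return math.exp(-(x - mean) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

dx = 0.001
# Riemann sum of p(x) dx for the standard normal over [-8, 8].
area = sum(gaussian(-8.0 + i * dx, 0.0, 1.0) * dx for i in range(int(16.0 / dx)))
print(area)  # extremely close to 1
```

Drop the $1/\sqrt{2\pi\sigma^2}$ prefactor and the area comes out to $\sqrt{2\pi\sigma^2}$ instead of 1, which is exactly why it's there.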

How does this distribution look? Like a brontosaurus.

In fact we've plotted it before in fig. 1.4. In this case the average is $6$ and the variance is $1$. If you change the average, the distribution will shift. If you increase the variance, the distribution will get fatter, but it always has this shape.