
Chi-squared distribution

David MacKay motivates (in passing) the chi-squared distribution through exercise 2.5.

The solution is on the reverse side.



Recognize that \( z \) is a function of the random variable \( n_B \), which is in turn a sum of independent Bernoulli random variables. \( z \) can be thought of as measuring how many standard deviations \( n_B \) lies from its mean.
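The exercise itself is not restated on this card. As a sketch of the kind of statistic being described (an assumed form, not necessarily the exact one MacKay uses), suppose \( n_B \sim \operatorname{Binomial}(N, f) \), i.e. the number of successes in \( N \) independent Bernoulli(\( f \)) trials; then

\[ z = \frac{(n_B - Nf)^2}{Nf(1-f)}, \]

which is the squared deviation of \( n_B \) from its mean \( Nf \), measured in units of its standard deviation \( \sqrt{Nf(1-f)} \).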

The chi-squared distribution proper applies to the case where \( n_B \) is normally distributed with some mean and variance. Since \( n_B \) here is a sum of many independent Bernoulli variables, it is only approximately normal (by the central limit theorem), so \( z \) is only approximately chi-squared distributed.
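As a hedged, self-contained sketch (not from the card or the exercise; the setup \( n_B \sim \operatorname{Binomial}(N, f) \) and the parameter values are assumptions), the following simulation compares the statistic built from a binomial \( n_B \) against the exact case where \( n_B \) is replaced by a normal variable with the same mean and variance, whose squared standardisation is \( \chi^2_1 \)-distributed by definition.

```python
import numpy as np
from scipy import stats

# Assumed setup (not from the card): n_B ~ Binomial(N, f), the number of
# successes in N independent Bernoulli(f) trials.
rng = np.random.default_rng(0)
N, f = 1000, 0.3
trials = 100_000

# z is the squared number of standard deviations n_B lies from its mean N*f.
n_B = rng.binomial(N, f, size=trials)
z_binomial = (n_B - N * f) ** 2 / (N * f * (1 - f))

# Exact case: a normal variable with the same mean and variance; its squared
# standardisation is chi-squared distributed with 1 degree of freedom.
x = rng.normal(N * f, np.sqrt(N * f * (1 - f)), size=trials)
z_normal = (x - N * f) ** 2 / (N * f * (1 - f))

# Compare a few quantiles against the chi-squared(1) distribution.
qs = [0.5, 0.9, 0.99]
print("chi2(1) quantiles:", np.round(stats.chi2.ppf(qs, df=1), 3))
print("binomial z       :", np.round(np.quantile(z_binomial, qs), 3))
print("normal z         :", np.round(np.quantile(z_normal, qs), 3))
```

For large \( N \) the binomial and normal quantiles agree closely, which is the central-limit-theorem approximation mentioned above.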