Theory - [T6]
Statistical independence
Statistical independence is a concept in probability theory. Two events A and B are statistically independent if and only if their joint probability factorizes into the product of their marginal probabilities: $$ P(A \cap B) = P(A)P(B) $$ Equivalently, if two events A and B are statistically independent, then each conditional probability equals the corresponding marginal probability: $$ P(A \mid B) = P(A) \quad \text{and} \quad P(B \mid A) = P(B) $$ In simple words, this means that the occurrence of one event doesn't affect the probability of the occurrence of the other.
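As a quick numerical illustration (a minimal sketch with made-up probabilities, not part of the original notes), the following Python snippet draws two events independently and checks that the empirical joint frequency is close to the product of the empirical marginals:

```python
# Sketch: empirically checking P(A and B) ~= P(A) * P(B)
# for two events that are independent by construction.
import random

random.seed(0)
n = 100_000

count_a = count_b = count_ab = 0
for _ in range(n):
    a = random.random() < 0.3   # event A occurs with probability 0.3 (assumed value)
    b = random.random() < 0.6   # event B occurs with probability 0.6, drawn independently
    count_a += a
    count_b += b
    count_ab += a and b

p_a, p_b, p_ab = count_a / n, count_b / n, count_ab / n
print(f"P(A) = {p_a:.3f}, P(B) = {p_b:.3f}")
print(f"P(A and B) = {p_ab:.3f}  vs  P(A)*P(B) = {p_a * p_b:.3f}")  # the two values are close
```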
Equivalently, in terms of observed frequencies: the relative joint frequencies are equal to the products of the corresponding relative marginal frequencies.
Consider the following contingency table of absolute frequencies:
X\Y | $Y_1$ | … | $Y_j$ | … | $Y_c$ |
---|---|---|---|---|---|
$X_1$ | $n_{1,1}$ | … | $n_{1,j}$ | … | $n_{1,c}$ |
… | … | … | … | … | … |
$X_i$ | $n_{i,1}$ | … | $n_{i,j}$ | … | $n_{i,c}$ |
… | … | … | … | … | … |
$X_r$ | $n_{r,1}$ | … | $n_{r,j}$ | … | $n_{r,c}$ |
Here $n_{i,\cdot} = \sum_j n_{i,j}$ and $n_{\cdot,j} = \sum_i n_{i,j}$ denote the marginal totals of row $i$ and column $j$, and $n$ is the grand total. As stated previously, given statistical independence the conditional probability equals the marginal probability; in terms of frequencies this can be written as: $$ \frac{n_{i,j}}{n_{i,\cdot}} = \frac{n_{\cdot,j}}{n} $$ The left-hand side is the conditional relative frequency of $Y_j$ given $X_i$, and the right-hand side is the marginal relative frequency of $Y_j$. This simple identity can be developed further: $$ n_{i,j} \cdot n = n_{\cdot,j} \cdot n_{i,\cdot} $$ Dividing now every member by $n^2$: $$ \frac {n_{i,j} \cdot \cancel{n}} {n \cdot \cancel{n}} = \frac {n_{\cdot,j}}{n} \cdot \frac {n_{i,\cdot}}{n} $$ The left-hand side $\frac {n_{i,j}} {n}$ is indeed the relative joint frequency of $X_i$ and $Y_j$, and the right-hand side is the product of the corresponding relative marginal frequencies, as expected.
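To make the identity concrete, here is a small sketch (with marginal frequencies chosen only for illustration, not taken from the notes) that builds a contingency table whose counts satisfy independence by construction and verifies, cell by cell, that $\frac{n_{i,j}}{n} = \frac{n_{i,\cdot}}{n} \cdot \frac{n_{\cdot,j}}{n}$:

```python
# Sketch: a contingency table built so that rows and columns are exactly
# independent, used to verify n_ij / n == (n_i. / n) * (n_.j / n) in every cell.
row_marginal = [0.2, 0.3, 0.5]          # assumed relative marginal frequencies of X
col_marginal = [0.1, 0.4, 0.5]          # assumed relative marginal frequencies of Y
n = 1000                                # grand total

# joint counts under independence: n_ij = n * f(X_i) * f(Y_j)
counts = [[n * fx * fy for fy in col_marginal] for fx in row_marginal]

row_totals = [sum(row) for row in counts]           # n_i. (row marginal totals)
col_totals = [sum(col) for col in zip(*counts)]     # n_.j (column marginal totals)

for i, row in enumerate(counts):
    for j, n_ij in enumerate(row):
        joint = n_ij / n                                     # relative joint frequency
        product = (row_totals[i] / n) * (col_totals[j] / n)  # product of relative marginals
        assert abs(joint - product) < 1e-12                  # equal up to rounding error

print("joint relative frequencies match the products of marginals in every cell")
```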