There are two important properties of the standard error of the mean (SE). The first is that even if the distribution of the data is not normal, the distribution of the sample means about the population mean tends to be; the larger the number of samples, the more closely their distribution approximates a normal distribution. Second, the more subjects within each sample, the better the chance that the observed sample mean will be close to the true population mean, so the distribution of sample means becomes narrower (i.e. the SE is smaller).
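Both properties can be illustrated with a short simulation sketch (the skewed population, the sample sizes and the seed below are arbitrary choices for illustration, not taken from the primer):

```python
import random
import statistics

random.seed(42)

def sample_means(pop_draw, n_subjects, n_samples):
    """Draw n_samples samples of n_subjects each and return their means."""
    return [statistics.mean(pop_draw() for _ in range(n_subjects))
            for _ in range(n_samples)]

# A clearly non-normal (skewed) population: exponential with mean 1 and SD 1.
draw = lambda: random.expovariate(1.0)

means_small = sample_means(draw, 5, 2000)    # few subjects per sample
means_large = sample_means(draw, 50, 2000)   # many subjects per sample

# The spread of the sample means shrinks as each sample grows:
# roughly 1/√5 ≈ 0.45 versus 1/√50 ≈ 0.14.
print(statistics.stdev(means_small))
print(statistics.stdev(means_large))
```

Plotting histograms of `means_small` and `means_large` would also show both looking approximately bell-shaped despite the skewed population.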

The actual relationship between the SE, the SD of the data sample and the number of subjects in the sample (n) is:

SE = SD/√n
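As a purely illustrative sketch, the formula can be applied directly to a small sample (the data values below are invented for the example, not taken from any study):

```python
import statistics

# Hypothetical sample of n = 8 measurements (illustrative numbers only).
data = [4.2, 5.1, 3.8, 4.9, 5.3, 4.4, 4.7, 5.0]

n = len(data)
sd = statistics.stdev(data)  # sample standard deviation
se = sd / n ** 0.5           # standard error of the mean: SD/√n

print(round(sd, 3), round(se, 3))  # → 0.506 0.179
```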

Exactly how we derive this equation is beyond the scope of this primer, but in broad terms we return to the more fundamental measure of variance. We described that the variance is a summation of squares, and as such it has some peculiar mathematical properties. Those relevant to us are:

- The variance (V) of the sum of a number of variables (a, b, c) is equal to the sum of the variances of the individual variables:

V(a+b+c) = Va + Vb + Vc

- The variance of the *difference* between two variables is also equal to the sum of the variances of the individual variables. One can readily conceptualise that combining two variables will increase the overall variability, whether they are added or subtracted:

V(a-b) = Va + Vb

- The variance of a linear equation operates so that the variance of a variable, a, multiplied by a constant, m, is equal to the constant *squared* multiplied by the variance of the variable:

V(ma) = m²Va
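These three properties can be checked numerically. The sketch below uses arbitrary distributions, variances and a fixed seed chosen for illustration; note that the first two properties, as stated, rely on the variables being independent:

```python
import random
import statistics

random.seed(1)
N = 200_000

# Three independent variables with known variances.
a = [random.gauss(0, 2) for _ in range(N)]  # Va = 4
b = [random.gauss(0, 3) for _ in range(N)]  # Vb = 9
c = [random.gauss(0, 1) for _ in range(N)]  # Vc = 1

var = statistics.pvariance

# Property 1: V(a+b+c) = Va + Vb + Vc ≈ 14
v_sum = var([x + y + z for x, y, z in zip(a, b, c)])

# Property 2: V(a-b) = Va + Vb ≈ 13 (not 4 - 9!)
v_diff = var([x - y for x, y in zip(a, b)])

# Property 3: V(ma) = m²Va; here m = 5, so ≈ 25 × 4 = 100
v_scaled = var([5 * x for x in a])

print(v_sum, v_diff, v_scaled)
```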

So to determine the variance (V) of a group of sample means, where the parameters in each sample share a common variance, we first consider that a mean is the sum of the parameters divided by the number of parameters in each sample (n); the variance of the means will therefore be the variance of the sums of the parameters divided by n:

V = variance of (sums/n)

From the third property above, the constant 1/n can be taken out of the variance term, provided it is squared:

V = 1/n² · (variance of sums)

The first property of variance tells us that the variance of the sums is equal to the sum of the variances of the parameters in each sample. If we assume that the parameter variances in all the samples are equal, then the variance of the sums equals n times the variance of the parameter:

V = 1/n² · n · (variance of parameter)

V = 1/n · (variance of parameter)

Taking the square root to get the standard error:

SE = SD/√n
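The result can be checked numerically with a short Monte Carlo sketch: draw many samples, record each sample's mean, and compare the observed spread of those means with SD/√n. The population mean, SD, sample sizes and seed below are arbitrary assumptions for illustration:

```python
import random
import statistics

random.seed(7)

POP_SD = 2.0     # population standard deviation
N_SUBJECTS = 16  # subjects per sample (n)
N_SAMPLES = 5000

# Draw many samples and record each sample's mean.
means = [statistics.fmean(random.gauss(10, POP_SD) for _ in range(N_SUBJECTS))
         for _ in range(N_SAMPLES)]

observed_se = statistics.stdev(means)       # empirical SD of the sample means
predicted_se = POP_SD / N_SUBJECTS ** 0.5   # SD/√n = 2/4 = 0.5

print(observed_se, predicted_se)
```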