The mean square error (MSE) is the average of the squared differences between the actual and estimated values in a data sample. Squaring each difference keeps negative and positive errors from cancelling each other out, and it gives larger errors more weight than smaller ones in the result. Mean square error is widely used in signal processing applications, such as assessing signal quality, comparing competing signal processing methods and optimising signal processing algorithms.
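In symbols, with y denoting the actual values, ŷ the estimates and n the number of data points, the definition above can be written as:

```latex
\mathrm{MSE} = \frac{1}{n} \sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2
```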

Find the difference between the actual and estimated data points in a sample. For example, if you have developed an algorithm for predicting stock prices, the difference between the predicted price and the actual price is the error. If your algorithm predicts £7, £9, £13, £14 and £15 as prices for five stocks on a particular day, and the actual prices are £8, £11, £11, £13 and £15, respectively, then the errors are £1 (£8 - £7), £2 (£11 - £9), -£2 (£11 - £13), -£1 (£13 - £14) and zero (£15 - £15), respectively.
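This step can be sketched in a few lines of Python, using the example prices above (amounts in pounds):

```python
# Predicted and actual prices (in pounds) for the five stocks in the example.
predicted = [7, 9, 13, 14, 15]
actual = [8, 11, 11, 13, 15]

# The error for each stock is the actual price minus the predicted price.
errors = [a - p for a, p in zip(actual, predicted)]
print(errors)  # [1, 2, -2, -1, 0]
```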

Compute the sum of the squared errors. First, square each difference, and then add them up. Continuing with the example, the sum of the squared errors is 10 (1 + 4 + 4 + 1 + 0).
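Continuing the sketch, squaring and summing the example errors looks like this:

```python
# Errors from the worked example (actual minus predicted, in pounds).
errors = [1, 2, -2, -1, 0]

# Square each error, then add the squares together.
sum_of_squares = sum(e ** 2 for e in errors)
print(sum_of_squares)  # 10
```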

Divide the sum of the squared errors by the number of data points to calculate the mean square error. To conclude the example, the mean square error is equal to 2 (10 / 5).
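The three steps combine into one small function; `mean_square_error` is an illustrative name, not a standard library routine:

```python
def mean_square_error(actual, predicted):
    """Average of the squared differences between actual and predicted values."""
    errors = [a - p for a, p in zip(actual, predicted)]
    return sum(e ** 2 for e in errors) / len(errors)

# The stock-price example from the text (prices in pounds).
print(mean_square_error([8, 11, 11, 13, 15], [7, 9, 13, 14, 15]))  # 2.0
```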