Grubbs’ Test

Grubbs’ test calculates, for each data point, the ratio of its absolute deviation from the mean of the data set to the standard deviation of the data set; the test statistic is the maximum of these ratios. The basic formula is as follows:

\[G = \frac{\max_{i=1,\dots,N} |Y_i-\mu|}{\sigma}\]

where \(G\) is the Grubbs’ test statistic, \(Y_i\) is the \(i\)th component of \(Y\), \(\mu\) is the mean value, and \(\sigma\) is the standard deviation of the data set. If

\[G > G^{\text{table}} \, \text{,}\]

then the data point with the largest deviation can be considered outside the region of interest, i.e. an outlier. \(G^{\text{table}}\) is the critical value taken from the literature for a given significance level and sample size \(N\).
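As an illustration only, the test can be sketched in a few lines of Python. The sketch below is not part of the reference implementation: the function name `grubbs_test` and the significance level `alpha` are assumptions, and the critical value \(G^{\text{table}}\) is computed from the \(t\) distribution instead of being looked up from a table.

```python
import numpy as np
from scipy import stats

def grubbs_test(y, alpha=0.05):
    """Two-sided Grubbs' test for a single outlier (illustrative sketch).

    Returns the index of the suspected outlier if G exceeds the
    critical value, otherwise None.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    if n < 6:
        raise ValueError("Grubbs' test is not recommended for fewer than 6 values")

    mu = y.mean()
    sigma = y.std(ddof=1)            # sample standard deviation
    deviations = np.abs(y - mu)
    idx = int(np.argmax(deviations))
    g = deviations[idx] / sigma      # Grubbs' test statistic G

    # Critical value G_table computed from the t distribution
    # (replaces a table lookup; assumption, not mandated by this document)
    t_crit = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_table = (n - 1) / np.sqrt(n) * np.sqrt(t_crit**2 / (n - 2 + t_crit**2))

    return idx if g > g_table else None
```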

Input Parameters

| Parameter | Type | Constraint | Description | Remarks |
| --- | --- | --- | --- | --- |
| \(Y\) | \(Y \in \mathbb{R}^N\) | \(N \in \mathbb{N}\) | Input data sequence of length \(N\) | Not recommended for data sets with fewer than \(6\) elements. |
| \(\mu\) | \(\mu \in \mathbb{R}\) | None | Mean of \(Y\) | None |
| \(\sigma\) | \(\sigma \in \mathbb{R}\) | None | Standard deviation of \(Y\) | None |
| \(G^{\text{table}}\) | None | None | Critical value for Grubbs’ test, taken from the literature | None |

Output Parameters

| Parameter | Type | Constraint | Description | Remarks |
| --- | --- | --- | --- | --- |
| \(\hat{Y}\) | \(\hat{Y} \in \mathbb{R}^N\) | None | Copy of \(Y\) in which values outside the region of interest (outliers) are marked | None |
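One possible way to realize the marked output \(\hat{Y}\) is a NumPy masked array in which flagged values are masked; this is only one convention and is not prescribed by the algorithm. The snippet below reuses the hypothetical `grubbs_test` sketch from above with made-up example data.

```python
import numpy as np
import numpy.ma as ma

Y = np.array([2.1, 2.3, 1.9, 2.0, 2.2, 2.1, 8.7])   # illustrative data only
outlier_idx = grubbs_test(Y)                         # sketch defined above

mask = np.zeros(Y.shape, dtype=bool)
if outlier_idx is not None:
    mask[outlier_idx] = True
Y_hat = ma.masked_array(Y, mask=mask)                # outliers are "marked" by masking
print(Y_hat)
```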

Single Steps using the Algorithm

References

    1. Grubbs, F. E., "Procedures for Detecting Outlying Observations in Samples," Technometrics, vol. 11, no. 1, pp. 1-21, 1969.