Grubbs' Test Formula:
Grubbs' test (also called the maximum normed residual test) is a statistical test used to detect outliers in a univariate data set assumed to come from a normally distributed population. It detects one outlier at a time.
The calculator uses Grubbs' test statistic:

G = max |Yᵢ − Ȳ| / s

Where: Ȳ is the sample mean, s is the sample standard deviation, and the maximum of |Yᵢ − Ȳ| is taken over all N observations Yᵢ.
Explanation: The test compares the calculated G value against a critical value derived from the t-distribution with N − 2 degrees of freedom, evaluated at significance level α/(2N) for the two-sided test. If G exceeds the critical value, the most extreme data point is considered an outlier.
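To make the computation concrete, here is a minimal Python sketch of the two-sided test, assuming NumPy and SciPy are available; the function name grubbs_test is illustrative, not the calculator's actual code:

```python
import numpy as np
from scipy import stats

def grubbs_test(values, alpha=0.05):
    """Two-sided Grubbs' test for a single outlier.

    Returns (is_outlier, suspect, G, G_crit).
    """
    y = np.asarray(values, dtype=float)
    n = len(y)
    if n < 3:
        raise ValueError("Grubbs' test needs at least 3 observations")
    mean, sd = y.mean(), y.std(ddof=1)   # sample std, n-1 denominator
    deviations = np.abs(y - mean)
    suspect = y[deviations.argmax()]     # most extreme data point
    G = deviations.max() / sd            # Grubbs' statistic

    # Critical value from the t-distribution with n-2 degrees of
    # freedom at significance alpha/(2n) (two-sided version).
    t = stats.t.isf(alpha / (2 * n), n - 2)
    G_crit = ((n - 1) / np.sqrt(n)) * np.sqrt(t**2 / (n - 2 + t**2))
    return G > G_crit, suspect, G, G_crit
```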
Details: Use Grubbs' test when you have a small dataset (at least 3 points, typically n < 25) that is approximately normally distributed and you want to check for a single outlier.
Tips: Enter your numerical data separated by commas. The calculator will identify the most extreme value and test whether it's an outlier at your chosen significance level (default α=0.05).
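For instance, the snippet below parses a comma-separated string and runs the grubbs_test sketch from above at the default α = 0.05; the data values are invented for illustration:

```python
# Assumes the grubbs_test sketch defined above is in scope.
raw = "9.1, 9.3, 8.9, 9.0, 9.2, 10.9"   # comma-separated input
data = [float(x) for x in raw.split(",")]
is_outlier, suspect, G, G_crit = grubbs_test(data, alpha=0.05)
print(f"suspect={suspect}, G={G:.3f}, "
      f"critical={G_crit:.3f}, outlier={is_outlier}")
```

With these made-up values, G ≈ 2.00 exceeds the critical value of about 1.89 for n = 6, so 10.9 would be flagged as an outlier.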
Q1: What's a good significance level to use?
A: α=0.05 is common, but you might use α=0.01 for more stringent outlier detection.
Q2: Can Grubbs' test detect multiple outliers?
A: The basic test detects one outlier at a time. For multiple outliers, you can run the test iteratively, removing each detected outlier before re-testing; the generalized extreme studentized deviate (ESD) test is designed specifically for this case.
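As an illustration of that iterative approach, this sketch reuses the grubbs_test function from earlier; max_outliers is just an illustrative safety cap, and note that sequential testing inflates the overall false-positive rate:

```python
def grubbs_iterative(values, alpha=0.05, max_outliers=2):
    """Run Grubbs' test repeatedly, dropping each detected outlier.

    Caution: sequential testing inflates the overall error rate;
    the generalized ESD test is the principled multi-outlier tool.
    """
    data = list(values)
    outliers = []
    while len(outliers) < max_outliers and len(data) >= 3:
        found, suspect, _, _ = grubbs_test(data, alpha)
        if not found:
            break
        outliers.append(suspect)
        data.remove(suspect)   # drop the flagged point and re-test
    return outliers, data      # (flagged points, cleaned data)
```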
Q3: What are the assumptions of Grubbs' test?
A: The data should be approximately normally distributed (aside from the potential outlier), and the test assumes at most one outlier is present.
Q4: What if my data isn't normally distributed?
A: Grubbs' test assumes normality, so its results can be misleading on skewed or heavy-tailed data. Consider transforming the data toward normality first, or use a nonparametric rule such as Tukey's IQR fences instead.
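Here is a minimal sketch of that nonparametric alternative; the k = 1.5 multiplier is the conventional default, not something taken from this calculator:

```python
import numpy as np

def tukey_fences(values, k=1.5):
    """Flag points outside Tukey's fences [Q1 - k*IQR, Q3 + k*IQR]."""
    y = np.asarray(values, dtype=float)
    q1, q3 = np.percentile(y, [25, 75])
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    return sorted(y[(y < lower) | (y > upper)].tolist())
```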
Q5: How accurate is this calculator?
A: For the most precise results, especially with very small sample sizes, cross-check the calculator's output against published statistical tables or specialized statistical software.