ICC Formula:
The Intraclass Correlation Coefficient (ICC) measures the reliability of ratings or measurements for clustered data. It assesses how strongly units in the same group resemble each other.
The calculator uses the one-way ICC formula:

ICC = (MSB - MSE) / (MSB + (k - 1) * MSE)

Where:
MSB = mean square between groups
MSE = mean square error (within groups)
k = number of raters
Explanation: The equation compares the between-group variance to the total variance, with adjustment for the number of raters.
Details: ICC is crucial for assessing inter-rater reliability in research studies, clinical assessments, and any situation requiring consistency between multiple raters or measurements.
Tips: Enter the mean square between groups, mean square error, and number of raters. All values must be positive numbers with at least 2 raters.
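The calculation described above can be sketched in a few lines of Python. This is a minimal illustration of the one-way formula with the same three inputs the calculator asks for; the function name and validation rules are illustrative, not taken from the calculator itself.

```python
# Sketch of the one-way ICC formula: ICC = (MSB - MSE) / (MSB + (k - 1) * MSE)
# (function name and error messages are illustrative assumptions)
def icc_one_way(msb: float, mse: float, k: int) -> float:
    if msb <= 0 or mse <= 0:
        raise ValueError("MSB and MSE must be positive numbers")
    if k < 2:
        raise ValueError("at least 2 raters are required")
    return (msb - mse) / (msb + (k - 1) * mse)

# Example: MSB = 10, MSE = 2, 3 raters -> (10 - 2) / (10 + 2 * 2) = 8/14
print(round(icc_one_way(10, 2, 3), 3))  # -> 0.571
```

Note that the function only requires the inputs to be positive, not that MSB exceed MSE, so a negative ICC is still possible (see Q4 below).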
Q1: What is a good ICC value?
A: By commonly cited guidelines, an ICC of 0.75 or higher indicates excellent reliability, 0.60-0.74 good, 0.40-0.59 fair, and below 0.40 poor reliability.
Q2: What's the difference between ICC and Pearson correlation?
A: ICC assesses absolute agreement, while Pearson assesses only the linear relationship: two raters who differ by a constant offset can have a perfect Pearson correlation yet a low ICC. This makes ICC more appropriate for reliability studies.
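The difference can be demonstrated numerically. In this sketch (hypothetical data, with ICC(2,1) for absolute agreement computed from a standard two-way ANOVA decomposition), rater B always scores 10 points higher than rater A, so Pearson is perfect while the ICC is low:

```python
import numpy as np

# Hypothetical data: rater B always scores 10 points above rater A.
a = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
b = a + 10.0
data = np.column_stack([a, b])  # rows = subjects, columns = raters
n, k = data.shape

# Pearson correlation: perfect, because the relationship is exactly linear.
pearson = np.corrcoef(a, b)[0, 1]

# ICC(2,1), absolute agreement, via two-way ANOVA mean squares.
grand = data.mean()
msr = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)  # subjects
msc = n * np.sum((data.mean(axis=0) - grand) ** 2) / (k - 1)  # raters
sse = np.sum((data - grand) ** 2) - msr * (n - 1) - msc * (k - 1)
mse = sse / ((n - 1) * (k - 1))
icc = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

print(f"Pearson: {pearson:.3f}  ICC(2,1): {icc:.3f}")  # Pearson: 1.000  ICC(2,1): 0.167
```

The systematic 10-point offset is invisible to Pearson but heavily penalized by the agreement-based ICC.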
Q3: Which ICC formula should I use?
A: This calculator uses ICC(1,1) or ICC(2,1) depending on your study design. Consult a statistician for complex designs.
Q4: Can ICC be negative?
A: Yes, but it indicates more variation within subjects than between them, suggesting very poor reliability.
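A quick check with the one-way formula (assumed from the calculator's inputs) shows how this happens: when the mean square error exceeds the mean square between groups, the numerator goes negative.

```python
# When within-group variation (MSE) exceeds between-group variation (MSB),
# the one-way ICC formula yields a negative value.
msb, mse, k = 2.0, 5.0, 2
icc = (msb - mse) / (msb + (k - 1) * mse)
print(round(icc, 3))  # -> -0.429, i.e. -3/7
```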
Q5: How many raters do I need?
A: Typically 2-5 raters are used. More raters increase reliability but also study complexity and cost.