Error Sum of Squares Formula:
The Error Sum of Squares (SSE) measures the discrepancy between observed data and the values predicted by a regression model. It is the sum of the squared differences between the observed values (y_i) and the predicted values (ŷ_i).
The calculator uses the SSE formula:

SSE = Σ (y_i − ŷ_i)², with the sum running from i = 1 to n

Where:
y_i = the i-th observed value
ŷ_i = the i-th predicted value
n = the number of observations
Explanation: The equation sums the squared differences between each observed value and its corresponding predicted value.
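To make the sum concrete, here is a minimal Python sketch of the formula above; the function name compute_sse and the sample values are assumptions for illustration, not part of the calculator itself.

```python
def compute_sse(observed, predicted):
    """Sum of squared differences between observed and predicted values."""
    if len(observed) != len(predicted):
        raise ValueError("Both lists must have the same number of values.")
    return sum((y - y_hat) ** 2 for y, y_hat in zip(observed, predicted))

# Three observations and their model predictions (hypothetical sample data)
observed = [2.0, 4.0, 6.0]
predicted = [2.5, 3.5, 6.5]
print(compute_sse(observed, predicted))  # 0.25 + 0.25 + 0.25 = 0.75
```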
Details: SSE is a key metric in regression analysis. Lower SSE values indicate better model fit. It's used to calculate R-squared and other goodness-of-fit statistics.
Tips: Enter matching sets of observed and predicted values. Both lists must have the same number of values. Values can be separated by commas, spaces, or line breaks.
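One way such flexible input could be handled (a sketch, not the calculator's actual parsing logic) is to split on any run of commas or whitespace:

```python
import re

def parse_values(text):
    # Split on any run of commas, spaces, or line breaks; drop empty tokens
    tokens = [t for t in re.split(r"[,\s]+", text.strip()) if t]
    return [float(t) for t in tokens]

print(parse_values("2.0, 4.0\n6.0 8.0"))  # [2.0, 4.0, 6.0, 8.0]
```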
Q1: What's the difference between SSE and MSE?
A: MSE (Mean Squared Error) is SSE divided by the number of observations. MSE gives the average squared error.
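For example, the SSE of 0.75 from the three-observation sketch above gives MSE = 0.75 / 3 = 0.25.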
Q2: Can SSE be negative?
A: No, since it's a sum of squared terms, SSE is always ≥ 0.
Q3: What's a "good" SSE value?
A: There's no absolute threshold - SSE depends on the scale of the data and the number of observations, so lower is better only in relative terms. Compare SSE values between models fit to the same dataset.
Q4: How does SSE relate to R-squared?
A: R² = 1 - (SSE/SST), where SST is the total sum of squares. R² measures the proportion of variance explained by the model.
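As a sketch of that identity (the helper name r_squared and the reuse of the earlier sample data are assumptions for illustration):

```python
def r_squared(observed, predicted):
    """R² = 1 - SSE/SST for paired observed and predicted values."""
    mean_y = sum(observed) / len(observed)
    sst = sum((y - mean_y) ** 2 for y in observed)                # total sum of squares
    sse = sum((y - p) ** 2 for y, p in zip(observed, predicted))  # error sum of squares
    return 1 - sse / sst

print(r_squared([2.0, 4.0, 6.0], [2.5, 3.5, 6.5]))  # 1 - 0.75/8 = 0.90625
```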
Q5: When would I use SSE vs other error metrics?
A: SSE is most common in least squares regression. In other contexts, consider MAE (mean absolute error), which is less sensitive to outliers, or RMSE (root mean squared error), which is expressed in the same units as the data.
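Hedged sketches of those two alternatives, using the same hypothetical sample data as above:

```python
import math

def mae(observed, predicted):
    # Average absolute error; less sensitive to outliers than squared-error metrics
    return sum(abs(y - p) for y, p in zip(observed, predicted)) / len(observed)

def rmse(observed, predicted):
    # Square root of the mean squared error; same units as the data
    mse = sum((y - p) ** 2 for y, p in zip(observed, predicted)) / len(observed)
    return math.sqrt(mse)

print(mae([2.0, 4.0, 6.0], [2.5, 3.5, 6.5]))   # 0.5
print(rmse([2.0, 4.0, 6.0], [2.5, 3.5, 6.5]))  # sqrt(0.25) = 0.5
```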