In statistics, the explained sum of squares (ESS), alternatively known as the model sum of squares or sum of squares due to regression ("SSR" – not to be confused with the residual sum of squares RSS, also called the sum of squared errors), is a quantity used in describing how well a model, often a regression model, represents the data being modelled. The sum of squares gets its name because it is calculated by summing squared differences, and it measures the dispersion of the data points. As a general rule, a smaller sum of squares indicates a better-fitting model, since less variation is left unexplained.

In statistical data analysis the total sum of squares (TSS or SST) is a quantity that appears as part of a standard way of presenting results of such analyses. For a set of observations $${\displaystyle y_{i},i\leq n}$$, it is defined as the sum over all squared differences between the observations and their overall mean $${\displaystyle {\bar {y}}}$$:

$$\mathrm{TSS} = \sum_{i=1}^{n} (y_{i} - {\bar {y}})^{2}.$$

In particular, the explained sum of squares measures how much variation there is in the modelled values, and this is compared to the total sum of squares (TSS), which measures how much variation there is in the observed data. In a regression analysis, the goal is to explain as much of this total variation as possible with the model.

As a worked example, consider the data set 2, 4, 6, 8 and the shortcut formula for the sum of squares,

$$SS = \sum_{i=1}^{n} x_{i}^{2} - \frac{\left(\sum_{i=1}^{n} x_{i}\right)^{2}}{n}.$$

We first square each data point and add them together: $2^{2} + 4^{2} + 6^{2} + 8^{2} = 4 + 16 + 36 + 64 = 120$. The next step is to add together all of the data and square this sum: $(2 + 4 + 6 + 8)^{2} = 400$. Dividing this by the number of observations gives $400 / 4 = 100$, so the sum of squares is $120 - 100 = 20$. Finally, if you take the sum of squares \(SS\) and divide by the sample size minus 1 (\(n-1\)), you have the sample variance.
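The shortcut computation above can be sketched in a few lines of Python; the function name `sum_of_squares` is our own choice for illustration, not from any particular library.

```python
def sum_of_squares(data):
    """Sum of squared deviations via the shortcut formula
    SS = sum(x_i^2) - (sum(x_i))^2 / n."""
    n = len(data)
    return sum(x * x for x in data) - sum(data) ** 2 / n

data = [2, 4, 6, 8]
ss = sum_of_squares(data)        # 120 - 400/4 = 20.0
variance = ss / (len(data) - 1)  # sample variance: divide SS by n - 1

print(ss, variance)
```

Dividing the result by $n - 1$ rather than $n$ gives the usual unbiased sample variance, matching the rule stated above.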
The sum of squares, or sum of squared deviation scores, is a key measure of the variability of a set of data. The mean of the sum of squares (SS) is the variance of a set of scores, and the square root of the variance is its standard deviation. The idea of the sum of squares also extends to linear regression, where the regression sum of squares and the residual sum of squares together determine the percentage of variation that is explained by the model.
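To make the regression decomposition concrete, here is a minimal sketch that fits a least-squares line from scratch and splits the total variation into explained and residual parts. The data values and the helper `fit_line` are illustrative assumptions, not taken from the text.

```python
def fit_line(xs, ys):
    """Ordinary least squares slope and intercept for y = a + b*x."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
        sum((x - x_bar) ** 2 for x in xs)
    a = y_bar - b * x_bar
    return a, b

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
a, b = fit_line(xs, ys)
y_bar = sum(ys) / len(ys)
fitted = [a + b * x for x in xs]

tss = sum((y - y_bar) ** 2 for y in ys)              # total sum of squares
ess = sum((f - y_bar) ** 2 for f in fitted)          # explained (regression) SS
rss = sum((y - f) ** 2 for y, f in zip(ys, fitted))  # residual SS

r_squared = ess / tss  # fraction of variation explained by the model
```

For ordinary least squares with an intercept, TSS = ESS + RSS (up to floating-point rounding), which is why the ratio ESS/TSS can be read as the percentage of variation explained.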