R Squared Formula


Coefficient of Determination, R-squared


The coefficient of determination, or R2, is a measure that provides information about the goodness of fit of a model. In the context of regression, it is a statistical measure of how well the regression line approximates the actual data. It is therefore important when a statistical model is used either to predict future outcomes or to test hypotheses. There are a number of variants (see comment below); the one presented here is widely used.

R2 is calculated as one minus the ratio of the residual sum of squares to the total sum of squares. The residual sum of squares is the sum of the squared residuals, and the total sum of squares is the sum of the squared distances of the data points from the mean. As it is a proportion, R2 takes values between 0 and 1.

Interpretation of the R2


Here are a few examples of interpreting the R2 value:


Worked Example

Below is a graph showing how the number of lectures per day affects the number of hours spent at university per day. The regression line is drawn on the graph, and it has equation


To calculate R2

you need to find the sum of the residuals squared and the total sum of squares.

Start off by finding the residuals, which are the distances from the regression line to each data point. Work out the predicted y value by plugging the corresponding x value into the regression line equation.

As you can see from the graph the actual point is below the regression line, so it makes sense that the residual is negative.

As you can see from the graph the actual point is above the regression line, so it makes sense that the residual is positive.


This means that the number of lectures per day accounts for 89.5% of the variation in the hours people spend at university per day.

An odd property of R2 is that it never decreases as variables are added. Thus, in the example above, if we added another variable measuring the mean height of lecturers, R2 would be no lower and may well, by chance, be greater – even though this is unlikely to be an improvement in the model. To account for this, an adjusted version of the coefficient of determination is sometimes used.

R-Squared Formula, Regression, and Interpretations

What Is R-Squared?

R-squared (R2) is a statistical measure that represents the proportion of the variance for a dependent variable that’s explained by an independent variable or variables in a regression model. Whereas correlation explains the strength of the relationship between an independent and dependent variable, R-squared explains to what extent the variance of one variable explains the variance of the second variable. So, if the R2 of a model is 0.50, then approximately half of the observed variation can be explained by the model’s inputs.

Figure 1. Regression output in MS Excel

R-squared can take any value between 0 and 1. Although the statistical measure provides some useful insights regarding the regression model, the user should not rely only on the measure in the assessment of a statistical model. The figure does not disclose information about the causal relationship between the independent and dependent variables.

In addition, it does not indicate the correctness of the regression model. Therefore, the user should always draw conclusions about the model by analyzing r-squared together with the other variables in a statistical model.

Formula for R-Squared

The actual calculation of R-squared requires several steps. This includes taking the data points (observations) of the dependent and independent variables and finding the line of best fit, often from a regression model. From there, you would calculate predicted values, subtract the actual values, and square the results. This yields a list of squared errors, which is then summed to give the unexplained variance.

To calculate the total variance, you would subtract the average actual value from each of the actual values, square the results, and sum them. From there, divide the first sum of errors (unexplained variance) by the second sum (total variance), subtract the result from one, and you have the R-squared.
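The steps above can be sketched in Python. The dataset and the fitted line here are made up for illustration; the slope and intercept come from ordinary least squares.

```python
# A minimal sketch of the R-squared calculation described above,
# using a small hypothetical dataset.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Line of best fit (ordinary least squares): y = a + b*x
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
a = mean_y - b * mean_x

# Unexplained variance: sum of squared prediction errors
ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
# Total variance: squared deviations of y from its mean
ss_tot = sum((y - mean_y) ** 2 for y in ys)

r_squared = 1 - ss_res / ss_tot
```

Because the sample data is nearly linear, the resulting R-squared is close to 1.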

What R-Squared Can Tell You

In investing, R-squared is generally interpreted as the percentage of a fund or security’s movements that can be explained by movements in a benchmark index. For example, an R-squared for a fixed-income security versus a bond index identifies the security’s proportion of price movement that is predictable based on a price movement of the index.

The same can be applied to a stock versus the S&P 500 index, or any other relevant index. It may also be known as the coefficient of determination.

R-squared values range from 0 to 1 and are commonly stated as percentages from 0% to 100%. An R-squared of 100% means that all movements of a security (or another dependent variable) are completely explained by movements in the index (or the independent variable(s) you are interested in).

In investing, a high R-squared, between 85% and 100%, indicates the stock or fund’s performance moves relatively in line with the index. A fund with a low R-squared, at 70% or less, indicates the security does not generally follow the movements of the index. A higher R-squared value will indicate a more useful beta figure. For example, if a stock or fund has an R-squared value of close to 100%, but has a beta below 1, it is most likely offering higher risk-adjusted returns.

R-Squared vs. Adjusted R-Squared

R-Squared only works as intended in a simple linear regression model with one explanatory variable. With a multiple regression made up of several independent variables, the R-Squared must be adjusted.

The adjusted R-squared compares the descriptive power of regression models that include different numbers of predictors. Every predictor added to a model increases R-squared and never decreases it, so a model with more terms may seem to have a better fit simply because it has more terms. The adjusted R-squared compensates for the addition of variables: it only increases if the new term enhances the model above what would be obtained by chance, and it decreases when a predictor improves the model less than chance would predict.

In an overfitting condition, an incorrectly high value of R-squared is obtained, even when the model actually has a decreased ability to predict. This is not the case with the adjusted R-squared.
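The adjustment described above is the standard formula 1 − (1 − R²)(n − 1)/(n − k − 1), where n is the number of observations and k the number of predictors. A minimal sketch, with made-up sample numbers:

```python
# Sketch of adjusted R-squared; the R^2, n, and k values below are
# hypothetical, chosen to illustrate the penalty for extra predictors.
def adjusted_r_squared(r_squared: float, n: int, k: int) -> float:
    """Penalize R-squared for the number of predictors k."""
    return 1 - (1 - r_squared) * (n - 1) / (n - k - 1)

# Adding a second predictor that barely helps raises plain R-squared
# but lowers the adjusted value.
adj_one = adjusted_r_squared(0.800, n=30, k=1)
adj_two = adjusted_r_squared(0.805, n=30, k=2)
```

Here `adj_two` comes out below `adj_one` even though the raw R-squared went up, which is exactly the compensation the adjusted measure provides.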

R-Squared vs. Beta

Beta and R-squared are two related, but different, measures of a fund's relationship to a benchmark: R-squared captures correlation, while beta captures relative riskiness. A mutual fund with a high R-squared correlates highly with its benchmark. If the beta is also high, it may produce higher returns than the benchmark, particularly in bull markets. R-squared measures how closely each change in the price of an asset is correlated to a benchmark.

Beta measures how large those price changes are relative to a benchmark. Used together, R-squared and beta give investors a thorough picture of the performance of asset managers. A beta of exactly 1.0 means that the risk (volatility) of the asset is identical to that of its benchmark. Essentially, R-squared is a statistical analysis technique for the practical use and trustworthiness of betas of securities.

Limitations of R-Squared

R-squared will give you an estimate of the relationship between the movements of a dependent variable and the movements of an independent variable. It doesn't tell you whether your chosen model is good or bad, nor will it tell you whether the data and predictions are biased. A high or low R-squared isn't necessarily good or bad, as it doesn't convey the reliability of the model, nor whether you've chosen the right regression. You can get a low R-squared for a good model, or a high R-squared for a poorly fitted model.

What Is a Good R-Squared Value?

What qualifies as a “good” R-Squared value will depend on the context. In some fields, such as the social sciences, even a relatively low R-Squared such as 0.5 could be considered relatively strong. In other fields, the standards for a good R-Squared reading can be much higher, such as 0.9 or above. In finance, an R-Squared above 0.7 would generally be seen as showing a high level of correlation, whereas a measure below 0.4 would show a low correlation. This is not a hard rule, however, and will depend on the specific analysis.

What Does an R-Squared Value of 0.9 Mean?

Essentially, an R-Squared value of 0.9 would indicate that 90% of the variance of the dependent variable being studied is explained by the variance of the independent variable. For instance, if a mutual fund has an R-Squared value of 0.9 relative to its benchmark, that would indicate that 90% of the variance of the fund is explained by the variance of its benchmark index.

Is a Higher R-Squared Better?

Here again, it depends on the context. Suppose you are searching for an index fund that will track a specific index as closely as possible. In that scenario, you would want the fund’s R-Squared to be as high as possible since its goal is to match—rather than exceed—the index. If on the other hand, you are looking for actively managed funds, a high R-Squared might be seen as a bad sign, indicating that the funds’ managers are not adding sufficient value relative to their benchmarks.

Finding R Squared / The Coefficient of Determination

Step 1: Find the correlation coefficient, r (it may be given to you in the question). For example, r = 0.543.

Step 2: Square the correlation coefficient.
0.543² = 0.295

Step 3: Convert the result to a percentage.
0.295 = 29.5%
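The three steps can be written out directly (using the r = 0.543 from the example):

```python
# The three steps above, assuming r is already known.
r = 0.543                               # Step 1: correlation coefficient
r_squared = r ** 2                      # Step 2: square it -> 0.294849
percentage = round(r_squared * 100, 1)  # Step 3: as a percentage -> 29.5
```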
That’s it!

Usefulness of R2

The usefulness of R2 lies in its ability to indicate the likelihood of future events falling within the predicted outcomes. The idea is that if more samples are added, the coefficient would show the probability of a new point falling on the line.
Even if there is a strong connection between the two variables, determination does not prove causality. For example, a study on birthdays may show a large number of birthdays happen within a time frame of one or two months. This does not mean that the passage of time or the change of seasons causes pregnancy.

Interpretation of R-Squared

The most common interpretation of r-squared is how well the regression model explains observed data. For example, an r-squared of 60% reveals that 60% of the variability observed in the target variable is explained by the regression model. Generally, a higher r-squared indicates more variability is explained by the model.

However, it is not always the case that a high r-squared is good for the regression model. The quality of the statistical measure depends on many factors, such as the nature of the variables employed in the model, the units of measure of the variables, and the applied data transformation. Thus, a high r-squared can sometimes indicate problems with the regression model.

A low r-squared figure is generally a bad sign for predictive models. However, in some cases, a good model may show a small value.

There is no universal rule on how to incorporate the statistical measure in assessing a model. The context of the experiment or forecast is extremely important, and, in different scenarios, the insights from the metric can vary.

How to Calculate R-Squared

The formula for calculating R-squared is:

R² = SSregression / SStotal

where:

  • SSregression is the sum of squares due to regression (explained sum of squares)
  • SStotal is the total sum of squares

Although the names “sum of squares due to regression” and “total sum of squares” may seem confusing, the meanings of the variables are straightforward.

The sum of squares due to regression measures how well the regression model represents the data used for modeling. The total sum of squares measures the variation in the observed data (data used in regression modeling).

How to Calculate R-Squared by Hand

In statistics, R-squared (R2) measures the proportion of the variance in the response variable that can be explained by the predictor variable in a regression model.

We use the following formula to calculate R-squared:

R² = [n(Σxy) − (Σx)(Σy)]² / ([nΣx² − (Σx)²] × [nΣy² − (Σy)²])

The following step-by-step example shows how to calculate R-squared by hand for a given regression model.

Step 1: Create a Dataset

First, let’s create a dataset:

Step 2: Calculate Necessary Metrics

Next, let’s calculate each metric that we need to use in the R2 formula:

Step 3: Calculate R-Squared

Lastly, we’ll plug in each metric into the formula for R2:

Note: The n in the formula represents the number of observations in the dataset and turns out to be n = 8 observations in this example.

Assuming x is the predictor variable and y is the response variable in this regression model, the R-squared for the model is 0.6686.

This tells us that 66.86% of the variation in the variable y can be explained by variable x.
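The by-hand calculation can also be checked in code. The 8-observation dataset below is hypothetical (so the result will not match the 0.6686 above), but it demonstrates that the correlation-squared route and the 1 − SS_res/SS_tot route agree for simple linear regression:

```python
import math

# Hypothetical dataset with n = 8 observations (as in the text).
x = [3, 5, 2, 8, 9, 4, 7, 6]
y = [12, 15, 9, 20, 22, 13, 18, 16]

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# Least-squares slope and intercept
b = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
    sum((xi - mean_x) ** 2 for xi in x)
a = mean_y - b * mean_x

# Route 1: R^2 = 1 - SS_res / SS_tot
ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
ss_tot = sum((yi - mean_y) ** 2 for yi in y)
r2_residual = 1 - ss_res / ss_tot

# Route 2: square the correlation coefficient (summation form)
num = n * sum(xi * yi for xi, yi in zip(x, y)) - sum(x) * sum(y)
den = math.sqrt((n * sum(xi ** 2 for xi in x) - sum(x) ** 2)
                * (n * sum(yi ** 2 for yi in y) - sum(y) ** 2))
r2_correlation = (num / den) ** 2
```

For a simple (one-predictor) linear regression, the two routes are mathematically identical, so the values coincide up to floating-point error.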

What is R Squared (R2) in Regression?

R-squared (R2) is an important statistical measure used in regression models; it represents the proportion of the variance of a dependent variable that can be explained by an independent variable or variables. In short, it determines how well the data fit the regression model.

R Squared Formula

For the calculation of R squared, you need to determine the correlation coefficient and then square the result.

R Squared Formula = r²

where r, the correlation coefficient, can be calculated as below:

r = [n(Σxy) − (Σx)(Σy)] / √([nΣx² − (Σx)²] × [nΣy² − (Σy)²])


  • r = the correlation coefficient
  • n = the number of observations in the dataset
  • x = the first variable in the context
  • y = the second variable


If there is any relationship or correlation, linear or non-linear, between the two variables, then a change in the value of the independent variable will likely be accompanied by a change in the value of the dependent variable.

The numerator of the formula tests whether the two variables move together, removing their individual movements and capturing the strength of their joint movement. The denominator scales the numerator, taking the square root of the product of each variable's squared deviations. Squaring this result gives R squared, which is nothing but the coefficient of determination.
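That summation form of r can be sketched as a small function. The sample data below is made up and perfectly linear, so r comes out as exactly 1:

```python
import math

# Pearson correlation coefficient in the summation form described above.
def pearson_r(x, y):
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(a * b for a, b in zip(x, y))
    sum_x2 = sum(a * a for a in x)
    sum_y2 = sum(b * b for b in y)
    num = n * sum_xy - sum_x * sum_y
    den = math.sqrt((n * sum_x2 - sum_x ** 2) * (n * sum_y2 - sum_y ** 2))
    return num / den

# Hypothetical, perfectly linear data: y = 2x
r = pearson_r([1, 2, 3, 4], [2, 4, 6, 8])
r_squared = r ** 2
```

Squaring the returned r gives the coefficient of determination, as the formula above states.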

Example #1

Consider the following two variables x and y, you are required to calculate the R Squared in Regression.


Using the above-mentioned formula, we need to first calculate the correlation coefficient.

We have all the values in the above table with n = 4.

Let’s now input the values in the formula to arrive at the figure.

r = [(4 × 26,046.25) − (265.18 × 326.89)] / √{[(4 × 21,274.94) − (265.18)²] × [(4 × 31,901.89) − (326.89)²]}

r = 17,501.06 / 17,512.88

The Correlation Coefficient will be-

r = 0.99932480

So, the calculation will be as follows,

r2 = (0.99932480)2


r2 = 0.998650052
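The figures in Example #1 can be re-derived in a quick sketch from the summations quoted above (n = 4; the sums are taken from the example's table):

```python
import math

# Recomputing Example #1 from its quoted summations.
n = 4
sum_xy = 26046.25
sum_x, sum_y = 265.18, 326.89
sum_x2, sum_y2 = 21274.94, 31901.89

num = n * sum_xy - sum_x * sum_y
den = math.sqrt((n * sum_x2 - sum_x ** 2) * (n * sum_y2 - sum_y ** 2))
r = num / den          # correlation coefficient, ~0.9993
r_squared = r ** 2     # coefficient of determination, ~0.9986
```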

Example #2

India, a developing country, wants to conduct an independent analysis of whether changes in crude oil prices have affected the value of its rupee. Following is the history of the average Brent crude oil price and the rupee's valuation against the dollar for those years.

RBI, the central bank of India, has approached you to give a presentation on this at its next meeting. Determine whether movements in crude oil prices affect movements in the rupee per dollar.


Using the formula for the correlation above, we can calculate the correlation coefficient first, treating the average crude oil price as one variable, say x, and the rupee per dollar as the other variable, y.

We have all the values in the above table with n = 6.

Let’s now input the values in the formula to arrive at the figure.

r = [(6 × 23,592.83) − (356.70 × 398.59)] / √{[(6 × 22,829.36) − (356.70)²] × [(6 × 26,529.38) − (398.59)²]}

r = −620.06 / 1,715.95

The Correlation Coefficient will be-

r = -0.3614

So, the calculation will be as follows,

r2 = (-0.3614)2


r2 = 0.1306

Analysis: There appears to be only a minor relationship between changes in crude oil prices and changes in the value of the Indian rupee. As the crude oil price increases, the rupee is affected as well. But since R squared is only about 13%, changes in the crude oil price explain very little about changes in the rupee, and the rupee is subject to changes in other variables as well, which need to be accounted for.

Example #3

XYZ laboratory is conducting research on height and weight and is interested in knowing whether there is any kind of relationship between these variables. It gathered a sample of 5,000 people for every category and came up with an average weight and average height for each particular group.

Below are the details that they have gathered.

You are required to calculate R squared and conclude whether this model explains how the variance in height affects the variance in weight.


Using the formula for the correlation above, we can calculate the correlation coefficient first, treating height as one variable, say x, and weight as the other variable, y.

We have all the values in the above table with n = 7.

Let’s now input the values in the formula to arrive at the figure.

r = [(7 × 74,058.67) − (1,031 × 496.44)] / √{[(7 × 153,595) − (1,031)²] × [(7 × 35,793.59) − (496.44)²]}

r = 6,581.05 / 7,075.77

The Correlation Coefficient will be-

Correlation Coefficient (r) = 0.9301

So, the calculation will be as follows,

r2 = 0.8651

Analysis: The correlation is positive, and there appears to be some relationship between height and weight. As height increases, the weight of the person also appears to increase. R2 suggests that 86.5% of the variation in weight is explained by height, while 13.5% remains unexplained.

Solved Examples

Question 1: Find the coefficient of determination for the following set of data:

 X Y
 2 2
 5 5
 6 4
 7 3



Create the table out of given scores

 X Y XY X² Y²
 2 2 4 4 4
 5 5 25 25 25
 6 4 24 36 16
 7 3 21 49 9
 ∑X=20 ∑Y=14 ∑XY=74 ∑X²=114 ∑Y²=54


N = 4
The summation of each column has been given at the end of the column.
Now, according to the formula:

r = [N∑XY − (∑X)(∑Y)] / √{[N∑X² − (∑X)²] × [N∑Y² − (∑Y)²]}

r = [(4)(74) − (20)(14)] / √{[(4)(114) − (20)²] × [(4)(54) − (14)²]}

r = 16 / √(56 × 20) = 16 / √1120 ≈ 0.478

The coefficient of determination is therefore r² ≈ (0.478)² ≈ 0.229.
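As a quick check, the same coefficient can be computed directly from the X and Y values in the table:

```python
import math

# Verifying the solved example from its raw data.
X = [2, 5, 6, 7]
Y = [2, 5, 4, 3]

n = len(X)
num = n * sum(x * y for x, y in zip(X, Y)) - sum(X) * sum(Y)
den = math.sqrt((n * sum(x * x for x in X) - sum(X) ** 2)
                * (n * sum(y * y for y in Y) - sum(Y) ** 2))
r = num / den          # correlation coefficient, ~0.478
r_squared = r ** 2     # coefficient of determination, ~0.229
```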
