
Correlation and linear regression are the most commonly used techniques for investigating the relationship between two quantitative variables.

The goal of a correlation analysis is to see whether two measurement variables co-vary and to quantify the strength of the relationship between them, whereas regression expresses the relationship in the form of an equation.

For example, using data from students who took tests in both Maths and English, we could use correlation to determine whether students who are good at Maths tend to be good at English as well, and regression to predict a student's English mark from their Maths mark.

The starting point is to draw a scatter plot, with one variable on the X-axis and the other on the Y-axis, to get a feel for the relationship (if any) between the variables as suggested by the data. The closer the points lie to a straight line, the stronger the linear relationship between the two variables.

To quantify the strength of a linear relationship between two variables, we calculate the correlation coefficient (r), most commonly the Pearson product-moment correlation coefficient. Its value ranges from -1.0 to +1.0: r > 0 indicates a positive linear relationship, r < 0 indicates a negative linear relationship, and r = 0 indicates no linear relationship.
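As a minimal sketch, the Pearson coefficient can be computed directly from its definition (covariance divided by the product of the standard deviations). The student marks below are invented for illustration only:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient for paired samples."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    # Sum of cross-products of deviations from the means
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    # Square roots of the sums of squared deviations
    sx = math.sqrt(sum((xi - mean_x) ** 2 for xi in x))
    sy = math.sqrt(sum((yi - mean_y) ** 2 for yi in y))
    return sxy / (sx * sy)

# Hypothetical marks for five students (illustrative data only)
maths = [55, 62, 70, 78, 90]
english = [50, 60, 65, 72, 85]
print(round(pearson_r(maths, english), 3))
```

A value close to +1 here would support the idea that students who do well in Maths also tend to do well in English; it says nothing about why.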

It must, however, be considered that there may be a third variable related to both of the variables being investigated, which is responsible for the apparent correlation. Correlation does not imply causation. Also, a nonlinear relationship may exist between two variables that would be inadequately described, or possibly even undetected, by the correlation coefficient.

In regression analysis, the problem of interest is the nature of the relationship itself between the dependent (response) variable and the independent (explanatory) variable.

The analysis consists of choosing and fitting an appropriate model, usually by the method of least squares, with a view to exploiting the relationship between the variables to estimate the expected response for a given value of the independent variable. For example, if we are interested in the effect of age on height, fitting a regression line lets us predict height for a given age.
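The least-squares fit for a straight line has a closed form: the slope is the sum of cross-products of deviations divided by the sum of squared deviations in x, and the intercept follows from the means. A small sketch, with made-up age/height data purely for illustration:

```python
def fit_line(x, y):
    """Least-squares estimates of intercept a and slope b for y = a + b*x."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    b = sxy / sxx            # slope
    a = mean_y - b * mean_x  # intercept (line passes through the means)
    return a, b

# Hypothetical ages (years) and heights (cm), for illustration only
ages = [5, 7, 9, 11, 13]
heights = [110, 121, 133, 144, 156]
a, b = fit_line(ages, heights)
print(f"predicted height at age 10: {a + b * 10:.1f} cm")
```

The fitted equation can then be used to estimate the expected height for any age within the range of the data; extrapolating beyond that range is unreliable.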

Some underlying assumptions governing the uses of correlation and regression are as follows.

The observations are assumed to be independent. For correlation, both variables should be random variables, but for regression only the dependent variable Y must be random. For hypothesis tests, the response variable should follow a Normal distribution, and the variability of Y should be the same for each value of the predictor variable. A scatter plot of the data provides an initial check of the assumptions for regression.

There are three main uses for correlation and regression.

- One is to test hypotheses about cause-and-effect relationships. In this case, the experimenter determines the values of the X-variable and sees whether variation in X causes variation in Y. For example, an experimenter might give people different doses of a drug and measure their blood pressure.
- The second main use for correlation and regression is to see whether two variables are associated, without necessarily inferring a cause-and-effect relationship. In this case, neither variable is determined by the experimenter; both are naturally variable. If an association is found, the inference is that variation in X may cause variation in Y, or variation in Y may cause variation in X, or variation in some other factor may affect both X and Y.
- The third common use of linear regression is estimating the value of one variable corresponding to a particular value of the other variable.

Full reference:

Explorable.com (Jan 18, 2010). Correlation and Regression. Retrieved May 25, 2022 from Explorable.com: https://verify.explorable.com/correlation-and-regression

The text in this article is licensed under the Creative Commons-License Attribution 4.0 International (CC BY 4.0).

This means you're free to copy, share and adapt any parts (or all) of the text in the article, as long as you give *appropriate credit* and provide a link/reference to this page.

That is it. You don't need our permission to copy the article; just include a link/reference back to this page. You can use it freely (with some kind of link), and we're also okay with people reprinting it in publications such as books, blogs, newsletters, course material, papers, Wikipedia and presentations (with clear attribution).

Thank you to...

This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 827736.

Explorable.com - 2008-2022
