Linear Regression Slope
b₁ = Σ(xᵢ - x̄)(yᵢ - ȳ) / Σ(xᵢ - x̄)²
The least-squares regression slope quantifies the average change in the response variable (y) for a one-unit increase in the explanatory variable (x). It is the slope of the best-fitting straight line that minimizes the sum of squared residuals.
Variables
b₁ — The change in y for each one-unit change in x
(xᵢ, yᵢ) — The individual paired observations
x̄, ȳ — The sample means of x and y
Example Calculation
Scenario
Using the same study hours (x) and exam scores (y) data: (2,65), (4,78), (5,82), (6,90), (8,95). Calculate the regression slope.
Given Data
x̄ = 5.0, ȳ = 82.0
Σ(xᵢ - x̄)(yᵢ - ȳ) = 102
Σ(xᵢ - x̄)² = 20
Calculation
b₁ = 102 / 20
Result
b₁ = 5.1
Interpretation
For each additional hour of studying, the predicted exam score increases by 5.1 points on average. The regression line is ŷ = 56.5 + 5.1x (where b₀ = 82.0 - 5.1(5.0) = 56.5).
When to Use This Formula
- ✓ Modeling the linear relationship between an explanatory variable and a response variable
- ✓ Predicting values of y for given values of x
- ✓ Estimating the rate of change between two quantitative variables
Common Mistakes
- ✗ Extrapolating far beyond the range of the observed x values
- ✗ Fitting a linear model when the relationship is clearly nonlinear
- ✗ Confusing the slope with the correlation coefficient
- ✗ Ignoring influential points or outliers that can disproportionately affect the slope
FAQs
How is the slope related to the correlation coefficient?
The slope and correlation are related by b₁ = r(s_y / s_x), where s_y and s_x are the standard deviations of y and x. They always have the same sign, but the slope depends on the scales of measurement while the correlation does not.
How do I find the y-intercept?
The y-intercept is b₀ = ȳ - b₁ × x̄. It represents the predicted value of y when x equals 0. In many applications, the intercept may not have a meaningful interpretation if x = 0 is outside the range of the data.