Solving Ordinary Least Squares (OLS) Regression Using Matrix Algebra

matrix algebra, R, statistics

Published: January 30, 2019

In psychology, we typically learn to compute OLS regression by working out each coefficient separately. However, I recently learned how to solve the whole model at once using matrix algebra. Here is a brief tutorial on how to do this in R.
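
In matrix form, the whole solution comes down to a single equation, which the rest of this post unpacks step by step:

\[ B = (X'X)^{-1}X'Y \]

where \(X\) is the design matrix, \(Y\) is the vector of outcome scores, and \(B\) is the vector of regression coefficients.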


1 R Packages

Code
packages <- c("tidyverse", "broom")
xfun::pkg_attach(packages, message = F)


2 Dataset

Code
dataset <- carData::Salaries %>%
  select(salary, yrs.since.phd) %>%
  mutate(yrs.since.phd = scale(yrs.since.phd, center = T, scale = F))
Code
summary(dataset)
     salary             yrs.since.phd.V1      
 Min.   : 57800   Min.   :-21.31486146100000  
 1st Qu.: 91000   1st Qu.:-10.31486146100000  
 Median :107300   Median : -1.31486146096000  
 Mean   :113706   Mean   : -0.00000000000001  
 3rd Qu.:134185   3rd Qu.:  9.68513853904000  
 Max.   :231545   Max.   : 33.68513853900000  

The Salaries dataset comes from the carData package and contains the salaries of professors at a US college during the 2008–09 academic year. Let's say we are interested in whether professors who have held their Ph.D. for longer tend to have higher salaries. (Note that yrs.since.phd was mean-centered in the code above.)
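
As a quick extra check (not required for anything that follows), we can look at the simple correlation between the two variables before working through the matrix algebra.

Code
# Bivariate correlation between salary and the centered predictor
cor(dataset$salary, as.numeric(dataset$yrs.since.phd))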


3 Solve Using Matrix Algebra

3.1 Design Matrix (\(X\))

The design matrix is just a matrix of all the predictors: a column of 1s for the intercept and a column for yrs.since.phd.

Code
x <- tibble(
  intercept = 1,
  yrs.since.phd = as.numeric(dataset$yrs.since.phd)
) %>%
  as.matrix()
head(x)
     intercept yrs.since.phd
[1,]         1     -3.314861
[2,]         1     -2.314861
[3,]         1    -18.314861
[4,]         1     22.685139
[5,]         1     17.685139
[6,]         1    -16.314861
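
As an aside, base R can build the same design matrix for us with model.matrix(); the main visible difference is that the intercept column is named (Intercept) instead of intercept.

Code
# Same design matrix constructed by base R
x_alt <- model.matrix(~ yrs.since.phd, data = dataset)
head(x_alt)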


3.2 Dependent Variable (\(Y\))

Code
y <- dataset$salary %>% as.matrix()
head(y)
       [,1]
[1,] 139750
[2,] 173200
[3,]  79750
[4,] 115000
[5,] 141500
[6,]  97000


3.3 \(X'X\)

First, we need to compute \(X'X\), which is the transposed design matrix (\(X'\)) multiplied by the design matrix (\(X\)).

Let’s take a look at what \(X'\) looks like.

Code
x_transposed <- t(x)
x_transposed[, 1:6]
                   [,1]      [,2]      [,3]     [,4]     [,5]      [,6]
intercept      1.000000  1.000000   1.00000  1.00000  1.00000   1.00000
yrs.since.phd -3.314861 -2.314861 -18.31486 22.68514 17.68514 -16.31486


After multiplication, the resulting matrix contains the total number of participants (\(n\) = 397; really, the sum of the intercept column), the sum of yrs.since.phd (\(\Sigma(yrs.since.phd)\) = 0), and the sum of squared yrs.since.phd (\(\Sigma(yrs.since.phd^2)\) = 65765.64). Because we centered yrs.since.phd first, \(\Sigma(yrs.since.phd)\) and \(\Sigma(yrs.since.phd^2)\) are, respectively, the sum of deviations from the mean (\(\Sigma(yrs.since.phd-M_{yrs.since.phd})\)) and the sum of squared deviations (\(\Sigma(yrs.since.phd-M_{yrs.since.phd})^2\)).

Code
x_prime_x <- (x_transposed %*% x)
x_prime_x %>% round(., 2)
              intercept yrs.since.phd
intercept           397          0.00
yrs.since.phd         0      65765.64


Let’s verify this.

Code
colSums(x) %>% round(., 2)
    intercept yrs.since.phd 
          397             0 
Code
colSums(x^2) %>% round(., 2)
    intercept yrs.since.phd 
       397.00      65765.64 
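
As an equivalent shortcut, base R's crossprod() computes \(X'X\) directly (and slightly more efficiently than t(x) %*% x).

Code
# crossprod(x) is equivalent to t(x) %*% x
crossprod(x) %>% round(., 2)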


3.4 \((X'X)^{-1}\)

\((X'X)^{-1}\) is the inverse matrix of \(X'X\).

Code
x_prime_x_inverse <- solve(x_prime_x)
x_prime_x_inverse
                 intercept yrs.since.phd
intercept     2.518892e-03  9.280150e-20
yrs.since.phd 9.280150e-20  1.520551e-05
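
To convince ourselves the inversion worked, we can check that multiplying the inverse by the original matrix gives back (approximately) the identity matrix.

Code
# Should be (approximately) the 2 x 2 identity matrix
round(x_prime_x_inverse %*% x_prime_x, 10)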


3.5 \(X'Y\)

\(X'Y\) contains the sum of \(Y\) (\(\Sigma Y\) = 45141464) and the sum of the products of the centered predictor and \(Y\) (\(\Sigma(yrs.since.phd \times Y)\) = 64801658).

Code
x_prime_y <- x_transposed %*% y
x_prime_y
                  [,1]
intercept     45141464
yrs.since.phd 64801658


Let’s verify this.

Code
sum(y)
[1] 45141464
Code
sum(x[, 2] * y)
[1] 64801658
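
As with \(X'X\), crossprod() can compute \(X'Y\) in a single call.

Code
# crossprod(x, y) is equivalent to t(x) %*% y
crossprod(x, y)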


3.6 Coefficients (\(B\))

To obtain the coefficients, we can multiply these last two matrices (\(B = (X'X)^{-1}X'Y\)).

Code
coef <- x_prime_x_inverse %*% x_prime_y
coef
                     [,1]
intercept     113706.4584
yrs.since.phd    985.3421
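
A small numerical aside: explicitly inverting \(X'X\) works fine here, but solving the linear system \((X'X)B = X'Y\) directly is the numerically preferred way to obtain the same coefficients.

Code
# Solves (X'X) B = X'Y without forming the inverse explicitly
solve(x_prime_x, x_prime_y)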


3.7 Standard Error

To calculate the standard errors, we multiply the inverse of \(X'X\) by the mean squared error (MSE) of the model and take the square root of the diagonal of the resulting matrix (\(\sqrt{diag((X'X)^{-1} \cdot MSE)}\)).


First, we need to calculate the \(MSE\) of the model. This is calculated the usual way: \(MSE = \frac{\Sigma(Y-\hat{Y})^{2}}{n-p} = \frac{\Sigma(e^2)}{df}\), where \(Y\) is the DV, \(\hat{Y}\) is the predicted DV, \(n\) is the total number of participants (or data points), and \(p\) is the number of columns in the design matrix (i.e., the predictors, including the intercept).


To obtain the predicted values (\(\hat{Y}\)), we can also use matrix algebra by multiplying the design matrix with the coefficients (\(\hat{Y} = XB\)).

Code
y_predicted <- x %*% coef
head(y_predicted)
          [,1]
[1,] 110440.19
[2,] 111425.53
[3,]  95660.05
[4,] 136059.08
[5,] 131132.37
[6,]  97630.74


Now that we have \(\hat{Y}\), we can then calculate the \(MSE\).

Code
e <- y - y_predicted
se <- sum(e^2)
n <- nrow(x)
p <- ncol(x)
df <- n - p
mse <- se / df
mse
[1] 758098328
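
As a check, this matches the squared residual standard error that lm() reports via sigma().

Code
# sigma() is the residual standard error; squaring it recovers the MSE
sigma(lm(salary ~ yrs.since.phd, dataset))^2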


Then, we multiply \((X'X)^{-1}\) by MSE.

Code
mse_coef <- x_prime_x_inverse * mse
mse_coef %>% round(., 2)
              intercept yrs.since.phd
intercept       1909568          0.00
yrs.since.phd         0      11527.27


Then, we take the square root of the diagonal of this matrix to obtain the standard errors of the coefficients.

Code
rmse_coef <- sqrt(diag(mse_coef))
rmse_coef %>% round(., 2)
    intercept yrs.since.phd 
      1381.87        107.37 
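
These standard errors can also be verified against lm(): the matrix we just built, \((X'X)^{-1} \cdot MSE\), is the coefficient variance-covariance matrix that vcov() returns.

Code
# Square roots of the diagonal of lm()'s variance-covariance matrix
sqrt(diag(vcov(lm(salary ~ yrs.since.phd, dataset)))) %>% round(., 2)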


3.8 t-Statistic

The t-statistic is just each coefficient divided by its standard error.

Code
t_statistic <- as.numeric(coef) / as.numeric(rmse_coef)
t_statistic
[1] 82.284421  9.177488


3.9 p-Value

We want the probability of obtaining a t-statistic at least as extreme as the one observed, not the other way around, so we set lower.tail to FALSE. We also multiply the result by 2 to obtain a two-tailed test.

Code
p_value <- 2 * pt(t_statistic, df, lower.tail = FALSE)
p_value
[1] 1.070665e-250  2.495042e-18
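
One caveat: both t-statistics happen to be positive here. If a coefficient were negative, its t-statistic would be too, and the code above would return a p-value greater than 1. Taking the absolute value first handles both cases.

Code
# Using abs() makes the two-tailed p-value correct for negative t-statistics as well
2 * pt(abs(t_statistic), df, lower.tail = FALSE)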


3.10 Summary

Code
tibble(
  term = colnames(x),
  estimate = as.numeric(coef),
  std.error = as.numeric(rmse_coef),
  statistic = as.numeric(t_statistic),
  p.value = as.numeric(p_value)
)
# A tibble: 2 × 5
  term          estimate std.error statistic   p.value
  <chr>            <dbl>     <dbl>     <dbl>     <dbl>
1 intercept      113706.     1382.     82.3  1.07e-250
2 yrs.since.phd     985.      107.      9.18 2.50e- 18


4 Solve Using lm Function

Code
lm(salary ~ yrs.since.phd, dataset) %>% tidy()
# A tibble: 2 × 5
  term          estimate std.error statistic   p.value
  <chr>            <dbl>     <dbl>     <dbl>     <dbl>
1 (Intercept)    113706.     1382.     82.3  1.07e-250
2 yrs.since.phd     985.      107.      9.18 2.50e- 18