Linear regression and the test of significance of regression coefficients

by Xiaoqi Zheng, 03/18/2020

In [8]:
## load the "rock" data set
dim(rock)
head(rock)
plot(rock)
[1] 48  4

A data.frame: 6 × 4
  area    peri     shape  perm
1 4990 2791.90 0.0903296   6.3
2 7002 3892.60 0.1486220   6.3
3 7558 3930.66 0.1833120   6.3
4 7352 3869.32 0.1170630   6.3
5 7943 3948.54 0.1224170  17.1
6 7979 4010.15 0.1670450  17.1
In [4]:
## linear regression by lm
lm.out = lm(perm ~ ., data = rock)
summary(lm.out)
Call:
lm(formula = perm ~ ., data = rock)

Residuals:
    Min      1Q  Median      3Q     Max 
-750.26  -59.57   10.66  100.25  620.91 

Coefficients:
             Estimate Std. Error t value Pr(>|t|)    
(Intercept) 485.61797  158.40826   3.066 0.003705 ** 
area          0.09133    0.02499   3.654 0.000684 ***
peri         -0.34402    0.05111  -6.731 2.84e-08 ***
shape       899.06926  506.95098   1.773 0.083070 .  
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 246 on 44 degrees of freedom
Multiple R-squared:  0.7044,	Adjusted R-squared:  0.6843 
F-statistic: 34.95 on 3 and 44 DF,  p-value: 1.033e-11
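The overall F test reported in the summary can be recomputed directly from the F distribution. A quick sketch (re-fitting the same model, so it assumes only that the built-in `rock` data set is available):

```r
## refit the model and recompute the overall F-test p-value by hand
fit <- lm(perm ~ ., data = rock)
f <- summary(fit)$fstatistic            # named vector: value, numdf, dendf
## p-value = P(F_{3,44} > observed F)
p_overall <- pf(f["value"], f["numdf"], f["dendf"], lower.tail = FALSE)
p_overall                               # agrees with the p-value in the summary
```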
In [6]:
## If we want to know whether the coefficient of "shape" is significant:
summary(lm.out)$coefficients["shape", "Pr(>|t|)"] ## p-value
0.0830700148141587
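The p-value extracted above comes from a two-sided t test with n - p = 48 - 4 = 44 residual degrees of freedom. It can be reproduced by hand from the estimate and its standard error; a sketch, again assuming the `rock` data set:

```r
## recompute the t statistic and p-value for "shape" manually
fit <- lm(perm ~ ., data = rock)
cf <- summary(fit)$coefficients
t_shape <- cf["shape", "Estimate"] / cf["shape", "Std. Error"]
## two-sided p-value under t with n - p = 44 degrees of freedom
p_shape <- 2 * pt(abs(t_shape), df = df.residual(fit), lower.tail = FALSE)
p_shape                                 # ~0.0831, matching the extracted value
```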
In [10]:
## Check that the fitted values agree with those from the normal equation:
X = as.matrix(cbind(1, rock[,-4])) ## don't forget the extra column of 1s for the intercept
Y = rock[,4]
Y_hat = X %*% (solve(t(X)%*%X) %*% t(X) %*% Y) # normal equation: beta_hat = (X'X)^{-1} X'Y
plot(lm.out$fitted.values, Y_hat) # points fall on the line y = x, so the two fits agree
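The normal equation also recovers the standard errors behind the significance tests: Var(beta_hat) = sigma^2 (X'X)^{-1}, with sigma^2 estimated from the residuals. A sketch under the same setup as above (X with a leading column of 1s, Y = perm):

```r
## recover coefficient estimates and standard errors from the normal equation
X <- as.matrix(cbind(Intercept = 1, rock[, -4]))
Y <- rock[, 4]
beta_hat <- solve(t(X) %*% X) %*% t(X) %*% Y      # normal equation
res <- Y - X %*% beta_hat                         # residuals
sigma2 <- sum(res^2) / (nrow(X) - ncol(X))        # residual variance, df = n - p
se <- sqrt(sigma2 * diag(solve(t(X) %*% X)))      # standard errors of beta_hat
cbind(beta_hat, se)   # matches the Estimate and Std. Error columns of summary(lm.out)
```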