Hi Paul,
here's an lm model to illustrate this:
> summary(lm(y ~ x.1 + x.2))

Call:
lm(formula = y ~ x.1 + x.2)

Residuals:
       Min         1Q     Median         3Q        Max
-0.0561359 -0.0054020  0.0004553  0.0056516  0.0515817

Coefficients:
              Estimate Std. Error t value Pr(>|t|)
(Intercept)  0.0007941  0.0002900   2.738 0.006278 **
x.1         -0.0446746  0.0303192  -1.473 0.140901
x.2          0.1014467  0.0285513   3.553 0.000396 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 0.009774 on 1134 degrees of freedom
  (64 observations deleted due to missingness)
Multiple R-squared: 0.01336,  Adjusted R-squared: 0.01162
F-statistic: 7.676 on 2 and 1134 DF,  p-value: 0.0004883
summary(lm(...)) computes a t-value and the corresponding p-value for each
regressor.

Here, the intercept is significant at the 0.6% level, and x.2 is likewise
significant at the 0.04% level. Only x.1 fails to reach the conventional 5%
level: its p-value is about 14%.

The overall significance of the model is given by the F-statistic (7.676,
with p < 0.05%).
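If you want to pull these quantities out programmatically rather than read
them off the printout, something along these lines should work (a sketch
with simulated data, since your actual y, x.1, and x.2 aren't shown):

```r
## Simulated stand-in data; substitute your own y, x.1, x.2.
set.seed(1)
x.1 <- rnorm(100)
x.2 <- rnorm(100)
y   <- 0.001 + 0.1 * x.2 + rnorm(100, sd = 0.01)

fit <- lm(y ~ x.1 + x.2)

## Coefficient table as a matrix: estimates, std. errors, t- and p-values
ctab <- coef(summary(fit))
pvals <- ctab[, "Pr(>|t|)"]

## p-value of the overall F-test, recomputed from summary()'s F statistic
fstat <- summary(fit)$fstatistic
p.overall <- pf(fstat[1], fstat[2], fstat[3], lower.tail = FALSE)
```

`coef(summary(fit))` is often handier than the printed summary when you need
to test many models or apply a significance cutoff automatically.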
Hope that helped.
Bernd