Hi all R-help listers:
Can anyone explain the theory (or the formula) behind computing the Sum Sq
column (highlighted as *Sum Sq* below) in the ANOVA table of a regression
model? The Wikipedia page (
http://en.wikipedia.org/wiki/Partition_of_sums_of_squares) gives an
introduction to computing the total, model (regression), and residual sums
of squares. Is that the same as the Sum Sq computation here? In particular,
is the regression sum of squares equal to
0.000437 + 0.002545 + 0.060984 + 0.062330 + 0.060480?
(I have sketched a small check after the ANOVA output below.)
Any suggestions will be greatly appreciated.
Thank you!
David
# Seven observations of two predictors and a response
TraingData <- data.frame(
  x1 = c(3.532, 2.868, 2.868, 3.532, 2.868, 2.536, 3.864),
  x2 = c(1.992, 1.992, 1.328, 1.328, 1.328, 1.66, 1.66),
  y  = c(9.040330254, 8.900894412, 8.701929163, 9.057944749,
         8.701929163, 8.74317832, 9.10859913)
)
# Full second-order (quadratic) model in x1 and x2
lm.sol <- lm(y ~ 1 + x1 + x2 + I(x1^2) + I(x2^2) + I(x1*x2), data = TraingData)
anova(lm.sol)
Analysis of Variance Table

Response: y
           Df *Sum Sq*  Mean Sq F value Pr(>F)
x1          1 0.000437 0.000437  0.1055 0.8001
x2          1 0.002545 0.002545  0.6141 0.5768
I(x1^2)     1 0.060984 0.060984 14.7162 0.1623
I(x2^2)     1 0.062330 0.062330 15.0409 0.1607
I(x1 * x2)  1 0.060480 0.060480 14.5945 0.1630
Residuals   1 0.004144 0.004144
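
For what it's worth, here is the check I tried. I am assuming that anova()
on an lm fit reports sequential (Type I) sums of squares, and that
deviance() on an lm object returns the residual sum of squares:

# Pull the Sum Sq column out of the ANOVA table
ss  <- anova(lm.sol)[["Sum Sq"]]
sse <- ss[length(ss)]            # residual SS (last row)
ssr <- sum(ss[-length(ss)])      # candidate regression (model) SS
sst <- sum((TraingData$y - mean(TraingData$y))^2)  # total SS about the mean
all.equal(ssr + sse, sst)        # do the term rows plus residual give the total?

# Check that each row's Sum Sq is the drop in residual SS when that
# term is added, in the order the terms appear in the formula:
fit0 <- lm(y ~ 1, data = TraingData)
fit1 <- update(fit0, . ~ . + x1)
deviance(fit0) - deviance(fit1)  # compare with the x1 row, 0.000437

If that reading is right, the regression sum of squares would indeed be the
sum of the five term rows, but I would appreciate confirmation from someone
who knows the theory.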