Hi all,
I need help very urgently.
As my professor asked, I did a stepwise logistic regression with 35 covariates plus one SNP at a time (out of 500,000 SNPs), to get the best model for each SNP, using this code:
outfiles <- paste(colnames(snps), ".txt", sep="")   # one output file per SNP, for the best models

for (i in 1:ncol(snps)) {
    model <- glm(Pheno ~ var1 + var2 + var3 + ... + snps[, i],   # "..." = the rest of the 35 covariates
                 data = covar, family = binomial("logit"))
    st <- step(model)
    out <- data.frame(FID = covar$FID, IID = covar$IID, st$model)   # formatting for plink
    write.table(out, outfiles[i], col.names = TRUE, row.names = FALSE,
                quote = FALSE, sep = "\t")
}
and I end up with 500,000 output files, one for each SNP.
1- Is this an appropriate way to assess the fit of the logistic model?
2- How can I now use all of these files without having to write the glm formula for each one?
3- Is there a better way to save the results so that I do not have to deal with 500,000 files? (I have sketched one idea below; please tell me if it makes sense.)
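One idea I had for questions 2 and 3 (only a rough sketch, please correct me if it is wrong): build the formula with reformulate() instead of typing the 35 covariates, and collect one row of results per SNP into a single table instead of one file per SNP. I am assuming here that covar has the columns FID, IID, Pheno plus the 35 covariates, and that snps is a numeric matrix with one column per SNP; this version only makes a summary table and does not yet write the per-SNP plink covariate files.

covars <- setdiff(colnames(covar), c("FID", "IID", "Pheno"))   # covariate names (assumed layout of covar)

results <- vector("list", ncol(snps))
for (i in seq_len(ncol(snps))) {
    dat <- cbind(covar, SNP = snps[, i])                        # attach the i-th SNP
    fml <- reformulate(c(covars, "SNP"), response = "Pheno")    # Pheno ~ covariates + SNP
    st  <- step(glm(fml, data = dat, family = binomial("logit")), trace = 0)
    cf  <- summary(st)$coefficients
    keep <- grepl("^SNP", rownames(cf))                         # SNP term, if step() kept it
    results[[i]] <- data.frame(SNP = colnames(snps)[i],
                               retained = any(keep),
                               beta = if (any(keep)) cf[keep, "Estimate"][1] else NA,
                               p    = if (any(keep)) cf[keep, "Pr(>|z|)"][1] else NA,
                               AIC  = AIC(st))
}

all_results <- do.call(rbind, results)                          # one row per SNP
write.table(all_results, "snp_stepwise_results.txt",
            row.names = FALSE, quote = FALSE, sep = "\t")

Would something like this be acceptable, or is there a standard way to do it?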
Please, I really need your help.
Alajmi