Hi there,

I have a rather large data set and fit the following Cox model:

test1 <- list(tstart, tstop, death1, chemo1, radio1, horm1)
out1 <- coxph(Surv(tstart, tstop, death1) ~ chemo1 + chemo1:log(tstop+1) +
              horm1 + horm1:log(tstop+1) + age1 + grade1 + grade1:log(tstop+1) +
              positive1 + positive1:log(tstop+1) + size1 + size1:log(tstop+1),
              test1)
out1

Up to here everything works fine (each covariate has length 289205).
Now I want to see a specific profile of the above model, so I ask for:

x11()
profilbig2 <- survfit(out1,
    newdata = data.frame(chemo1    = rep(0, length(chemo1)),
                         horm1     = rep(0, length(chemo1)),
                         age1      = rep(mean(age1), length(chemo1)),
                         grade1    = rep(0, length(chemo1)),
                         positive1 = rep(1, length(chemo1)),
                         size1     = rep(mean(size1), length(chemo1))))
plot(profilbig2, col = "blue")

and I get the following error:

Error: cannot allocate vector of size 1.5 Gb
In addition: Warning messages:
1: In vector("double", length) :
  Reached total allocation of 1535Mb: see help(memory.size)
2: In vector("double", length) :
  Reached total allocation of 1535Mb: see help(memory.size)
3: In vector("double", length) :
  Reached total allocation of 1535Mb: see help(memory.size)
4: In vector("double", length) :
  Reached total allocation of 1535Mb: see help(memory.size)

I am wondering why that happens, since I managed to fit the model. Shouldn't the memory problem have popped up earlier, while I was fitting the model? So I can fit the model but still cannot study a certain profile? Can anyone suggest something? I am not an advanced user of R; am I typing something wrong, or is there something cleverer I can do to see the profile of a hypothetical subject?

Thanks in advance for any answers.
(2 GB RAM.)

P.S. I noticed "--max-mem-size" in the help, but I am not quite sure how to use it.

Leo.
On Sep 21, 2009, at 7:27 PM, Leo wrote:

> Hi there,
>
> I have a rather large data set and fit the following Cox model:
>
> test1 <- list(tstart, tstop, death1, chemo1, radio1, horm1)
> out1 <- coxph(Surv(tstart, tstop, death1) ~ chemo1 + chemo1:log(tstop+1) +
>               horm1 + horm1:log(tstop+1) + age1 + grade1 + grade1:log(tstop+1) +
>               positive1 + positive1:log(tstop+1) + size1 + size1:log(tstop+1),
>               test1)
> out1
>
> Up to here everything works fine (each covariate has length 289205).
> Now I want to see a specific profile of the above model, so I ask for:
>
> x11()
> profilbig2 <- survfit(out1,
>     newdata = data.frame(chemo1    = rep(0, length(chemo1)),
>                          horm1     = rep(0, length(chemo1)),
>                          age1      = rep(mean(age1), length(chemo1)),
>                          grade1    = rep(0, length(chemo1)),
>                          positive1 = rep(1, length(chemo1)),
>                          size1     = rep(mean(size1), length(chemo1))))
> plot(profilbig2, col = "blue")

I am a bit puzzled here. I do not see much variation within the newdata object. If my wetware R interpreter is working, then I wonder if you couldn't just use:

newdata = data.frame(chemo1 = 0, horm1 = 0, age1 = mean(age1),
                     grade1 = 0, positive1 = 1, size1 = mean(size1))

> and I get the following error:
>
> Error: cannot allocate vector of size 1.5 Gb
> In addition: Warning messages:
> 1: In vector("double", length) :
>   Reached total allocation of 1535Mb: see help(memory.size)
> 2: In vector("double", length) :
>   Reached total allocation of 1535Mb: see help(memory.size)
> 3: In vector("double", length) :
>   Reached total allocation of 1535Mb: see help(memory.size)
> 4: In vector("double", length) :
>   Reached total allocation of 1535Mb: see help(memory.size)
>
> I am wondering why that happens, since I managed to fit the model. Shouldn't the memory problem have popped up earlier, while I was fitting the model? So I can fit the model but still cannot study a certain profile? Can anyone suggest something? I am not an advanced user of R; am I typing something wrong, or is there something cleverer I can do to see the profile of a hypothetical subject?

I don't claim to be an advanced user, but I have had the same question. My tentative answer is that the coxph-created object does not save the "baseline survival estimate" and that survfit needs to recreate it. I wonder whether you need to use a newdata object that is quite so long?

> Thanks in advance for any answers.
> (2 GB RAM.)
>
> P.S. I noticed "--max-mem-size" in the help, but I am not quite sure how to use it.

It's going to depend on your OS, which you have not mentioned. If it is Windoze, then see your OS-specific FAQ. If Linux or Mac, then the only answer (assuming you are not satisfied with a solution that uses, say, 100 or 1000 points) is to buy more memory.

--
David Winsemius, MD
Heritage Laboratories
West Hartford, CT
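[Editor's note: spelled out as a complete call, the one-row suggestion above would look something like the following. This is a minimal sketch; it assumes the fitted model out1 and the covariate vectors age1 and size1 are still in the workspace.]

nd <- data.frame(chemo1 = 0, horm1 = 0, age1 = mean(age1),
                 grade1 = 0, positive1 = 1, size1 = mean(size1))
profilbig2 <- survfit(out1, newdata = nd)   # one hypothetical subject
plot(profilbig2, col = "blue")

As the follow-up below shows, this particular model trips over the log(tstop + 1) terms in the formula: survfit() then goes looking for tstop, which is not a column of the one-row newdata.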
If I use the following:

newdata = data.frame(chemo1 = 0, horm1 = 0, age1 = mean(age1),
                     grade1 = 0, positive1 = 1, size1 = mean(size1))

then I get:

Error in model.frame.default(Terms, newdata, ...) :
  variable lengths differ (found for 'log(tstop + 1)')
In addition: Warning message:
'newdata' had 1 rows but variable(s) found have 289205 rows

and that is why I used "rep" to create the profile of the hypothetical subject I want to see. I've looked in the help and on the web, and the syntax for a hypothetical subject is indeed as you mentioned, but it is not working (or am I making a mistake?). Once you declare newdata as above, newdata is only one row, which conflicts with the dimensions of out1. Am I not right? But when I use the rep function to create 289205 zeros for chemo1, horm1, and so on, I get an out-of-memory error.

I use Windows XP Professional.

Thanks for your answer, David.
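[Editor's note: the "variable lengths differ" error comes from the log(tstop + 1) terms in the formula: survfit() tries to evaluate them for the new subject, so newdata must carry tstop itself, and with a counting-process Surv() and survfit's individual = TRUE argument, tstart and death1 as well. With individual = TRUE the rows of newdata describe one hypothetical subject across successive (tstart, tstop] intervals, so one row per event time should suffice instead of 289205. A hedged, untested sketch follows; the name etimes, the censored status, and the assumption that follow-up starts at 0 are illustrations, not from the thread.]

etimes <- sort(unique(tstop[death1 == 1]))    # distinct event times
nd <- data.frame(tstart    = c(0, head(etimes, -1)),  # assumes follow-up starts at 0
                 tstop     = etimes,
                 death1    = 0,               # hypothetical subject stays censored
                 chemo1    = 0,
                 horm1     = 0,
                 age1      = mean(age1),
                 grade1    = 0,
                 positive1 = 1,
                 size1     = mean(size1))
profilbig2 <- survfit(out1, newdata = nd, individual = TRUE)
plot(profilbig2, col = "blue")

Separately, on Windows XP the allocation cap can be raised by starting R as "Rgui.exe --max-mem-size=2047M", or from within R with memory.limit(size = 2047); but with 2 GB of physical RAM there is little headroom above the 1535 Mb default, so shrinking newdata is the more promising route.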