Hi all,

When I tried to estimate a VAR (package vars) on a rather large dataset with 5 lags:

> dim(trial.var)
[1] 20388     2

I ran into memory trouble:

> summary(VAR(trial.var, type="none", p=5))
Error: cannot allocate vector of size 3.1 Gb
In addition: Warning messages:
1: In diag(resids %*% solve(Sigma) %*% t(resids)) :
  Reached total allocation of 1535Mb: see help(memory.size)
2: In diag(resids %*% solve(Sigma) %*% t(resids)) :
  Reached total allocation of 1535Mb: see help(memory.size)
3: In diag(resids %*% solve(Sigma) %*% t(resids)) :
  Reached total allocation of 1535Mb: see help(memory.size)
4: In diag(resids %*% solve(Sigma) %*% t(resids)) :
  Reached total allocation of 1535Mb: see help(memory.size)

Luckily, I was able to slice and dice my dataset into individual days of ca. 3000 lines each and estimate each subset separately.

Nonetheless, I would now like to run the VAR over the whole set. Is there any way I can extend the memory available to R, perhaps by forcing it? I am running R on an XP box with 1 GB RAM.

Many thanks for any pointers.

Bernd
Dear Bernd,

Which version of the package vars are you using? Have you tried estimating the VAR first on its own, without calling summary()? Within the function VAR() the equations are estimated by lm(). Would you be so kind as to send the result of traceback()?

Best,
Bernhard
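PS: A minimal sketch of what I mean, assuming your data is still in trial.var. The column name "y1" below is only a placeholder for whatever your two series are actually called:

library(vars)
fit <- VAR(trial.var, type = "none", p = 5)  # estimation only; each equation is fit by lm()
fit$varresult$y1                             # the lm object for the first equation
coef(fit$varresult$y1)                       # its coefficients, without the full summary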
Dear Bernhard,

Please find the output of traceback() below for this rather large VAR. I am using vars 1.4-6 at the moment:

> dim(var.trial)
[1] 22367     3

> summary(VAR(var.trial, type="none", p=3))
Error: cannot allocate vector of size 3.7 Gb
In addition: Warning messages:
1: In diag(resids %*% solve(Sigma) %*% t(resids)) :
  Reached total allocation of 1535Mb: see help(memory.size)
2: In diag(resids %*% solve(Sigma) %*% t(resids)) :
  Reached total allocation of 1535Mb: see help(memory.size)
3: In diag(resids %*% solve(Sigma) %*% t(resids)) :
  Reached total allocation of 1535Mb: see help(memory.size)
4: In diag(resids %*% solve(Sigma) %*% t(resids)) :
  Reached total allocation of 1535Mb: see help(memory.size)

> traceback()
6: diag(resids %*% solve(Sigma) %*% t(resids))
5: logLik.varest(object)
4: logLik(object)
3: summary.varest(VAR(var.trial, type = "none", p = 3))
2: summary(VAR(var.trial, type = "none", p = 3))
1: summary(VAR(var.trial, type = "none", p = 3))

Oddly enough, VAR(...) itself returned the regressors without any trouble. Am I missing something rather trivial?

Many thanks in advance and best regards,

Bernd
Hello Bernd,

Many thanks for providing the details. As you can see from the traceback, the warning refers to the calculation of the value of the log-likelihood, which is done in vars:::logLik.varest. I will slice that calculation, and hopefully this will resolve the memory problem (the update will appear on R-Forge first and then on CRAN).

In the meantime, you can apply the summary method for lm objects to the individual equations:

lapply(foo$varresult, summary)

where foo is the object returned by VAR(). This gives you the summary results for each equation.

Best,
Bernhard
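PS: To make the memory arithmetic explicit: diag(resids %*% solve(Sigma) %*% t(resids)) first materialises a T x T matrix before taking its diagonal. With T = 20388 observations that is 20388^2 * 8 bytes, i.e. roughly the 3.1 Gb in your first error message (and 22367^2 * 8 bytes gives the 3.7 Gb in the second run). The diagonal can be computed from a T x K intermediate instead. What follows is only a sketch of the idea, not the exact patch that will go into the package:

Tn <- 20388; K <- 2                      # dimensions from the original post
resids <- matrix(rnorm(Tn * K), Tn, K)   # stand-in residual matrix
Sigma  <- crossprod(resids) / Tn         # K x K residual covariance

# Naive version -- allocates the T x T matrix and triggers the error:
# quad <- diag(resids %*% solve(Sigma) %*% t(resids))

# Sliced version: diag(A %*% B %*% t(A)) equals rowSums((A %*% B) * A),
# so only T x K objects are ever held in memory:
quad <- rowSums((resids %*% solve(Sigma)) * resids)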
On Sat, 15 Aug 2009 13:46:56 +0000 herrdittmann at yahoo.co.uk wrote:

HCU> I ran into memory troubles:
HCU>
HCU> > summary(VAR(trial.var, type="none", p=5))
HCU> Error: cannot allocate vector of size 3.1 Gb

No wonder.

HCU> Is there any way I can extend the memory used by R? Perhaps
HCU> forcing it? I am running R on an XP box with 1 GB RAM.

See:
http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021

So either decrease the size of your variables or try a different setup, for example a 64-bit Linux if your PC is able to run it. More RAM would also be preferable, since virtual memory is much slower. Another option might be the ff package, but it will be quite slow.

hth
Stefan
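PS: The FAQ entry boils down to this: on 32-bit Windows a single process gets at most about 2 GB of address space (3 GB with the /3GB boot switch), so a 3.1 Gb allocation cannot succeed there no matter how much RAM you add. You can still inspect and raise R's own cap, which is where the 1535Mb figure in your warnings comes from; a sketch of the Windows-only calls:

memory.size()              # MB currently allocated by R
memory.limit()             # current cap in MB (the 1535 in your warnings)
memory.limit(size = 2047)  # raise the cap towards the 32-bit ceiling

But for a matrix of this size, the only real fixes are the ones above: smaller objects or a 64-bit system.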