similar to: Memory problem

Displaying 8 results from an estimated 8 matches similar to: "Memory problem"

2016 Apr 06
0
Memory problem
As Jim has indicated, memory usage problems can require very specific diagnostics and code changes, so generic help is tough to give. However, in most cases I have found the dplyr package to be more memory efficient than plyr, so you could consider that. Also, you can be explicit about only saving the minimum results you want to keep rather than making a list of complete results and extracting
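A minimal sketch of the keep-only-summaries idea (run_sim() and the chosen summaries are hypothetical placeholders, not the poster's code):

run_sim <- function(i) rnorm(1e6)   # placeholder for one large simulation

n_runs <- 1000
keep <- data.frame(run = integer(n_runs), mean = numeric(n_runs),
                   sd = numeric(n_runs))

for (i in seq_len(n_runs)) {
  res <- run_sim(i)
  keep[i, ] <- list(i, mean(res), sd(res))  # store only the summaries
  rm(res)                                   # drop the large object
  gc()                                      # and release memory promptly
}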
2016 Apr 06
0
Memory problem
It is hard to tell from the information that you have provided. Do you have a list of the sizes of all the objects that you have in memory? Are you releasing large objects at the end of each simulation run? Are you using 'gc' to garbage collect any memory after deallocating objects? Collect some additional information with a simple function like below: f_mem_stats <-
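The helper is cut off in the snippet above; one plausible reconstruction of an object-size report (an assumption, not Jim's actual code) is:

f_mem_stats <- function(envir = globalenv()) {
  objs <- ls(envir = envir)
  sizes <- sapply(objs, function(x) object.size(get(x, envir = envir)))
  ## largest objects first, sizes in megabytes
  data.frame(object = objs, size_mb = round(sizes / 2^20, 2))[order(-sizes), ]
}

f_mem_stats()   # list objects in the global environment by size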
2016 Apr 06
1
Memory problem
Dear Sir, Thanks for the guidance. Will check. And yes, at the end of each simulation, a large result is getting stored. Regards Amelia On Wednesday, 6 April 2016 5:48 PM, jim holtman <jholtman at gmail.com> wrote: It is hard to tell from the information that you have provided. Do you have a list of the sizes of all the objects that you have in memory? Are you releasing large
2016 Apr 06
0
Memory problem
You say it is "getting stored"; is this in memory or on disk? How are you processing the results of the 1,000 simulations? Some more insight into the actual process would be useful: for example, how are the simulations being done, are the results stored in memory or out to a file, and what are you doing with the results at the end? Jim Holtman Data Munger Guru What is the
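One common pattern that resolves the memory-versus-disk question in favor of disk is to stream each run's result to its own file (a hedged sketch; run_sim() is a placeholder for the poster's simulation):

run_sim <- function(i) rnorm(1e6)   # placeholder for one large simulation

for (i in seq_len(1000)) {
  res <- run_sim(i)
  saveRDS(res, file = sprintf("sim_%04d.rds", i))  # one file per run
  rm(res); gc()                                    # keep memory flat
}

res1 <- readRDS("sim_0001.rds")   # later: reload one result at a time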
2011 Sep 13
1
SVD Memory Issue
I am trying to perform Singular Value Decomposition (SVD) on a Term Document Matrix I created using the 'tm' package. Eventually I want to do a Latent Semantic Analysis (LSA). There are 5677 documents with 771 terms (the TDM is 771 x 5677). When I try to do the SVD, it runs out of memory. I am using a 12 GB dual-core machine with Windows XP and don't think I can increase the memory
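tm stores a TermDocumentMatrix sparsely, and densifying it for base svd() is usually where memory runs out. A hedged sketch of a standard workaround, a truncated SVD on the sparse form (the irlba package is a suggestion, not something the poster mentioned; a random sparse matrix of the same shape stands in for the real TDM):

library(Matrix)
library(irlba)

## A real tm TermDocumentMatrix (a simple triplet matrix) converts with:
##   m <- sparseMatrix(i = tdm$i, j = tdm$j, x = tdm$v,
##                     dims = c(tdm$nrow, tdm$ncol))
set.seed(1)
m <- rsparsematrix(771, 5677, density = 0.01)  # stand-in, same shape

k <- 100                # keep only the leading k singular triplets for LSA
dec <- irlba(m, nv = k)
dim(dec$u); length(dec$d); dim(dec$v)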
2006 Jan 10
0
bug in either glmmPQL or lme/lmer
I know it's conventional to report bugs to the maintainer, but I'm not sure which package actually contains this bug (or bugs), so I apologize for sending this to the list at large. I see the bug under both R 2.1.1 and R 2.2.1. (I sent a related message a while ago, but this one has more detail.) library(MASS) library(nlme) fit.model <- function(il, model.family) { cs <-
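The poster's fit.model() is truncated above, so for orientation only, a minimal self-contained glmmPQL call on simulated data (not the poster's setup):

library(MASS)
library(nlme)

set.seed(1)
d <- data.frame(group = factor(rep(1:10, each = 20)), x = rnorm(200))
d$y <- rbinom(200, 1, plogis(0.5 * d$x))   # simulated binary response

fit <- glmmPQL(y ~ x, random = ~ 1 | group, family = binomial, data = d)
summary(fit)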
2009 Jul 02
1
Quantitative Risk Management by McNeil
Dear Specialists in R, Maybe somebody has experience using the package for the book Quantitative Risk Management by McNeil? This package is written in R. I have run this package to fit data to the Normal Inverse Gaussian distribution and faced the following problem. > Return <- read.csv("data.csv") > Transpose <- t(Return) > fit.NH(Transpose, case="NIG",
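The fit.NH call is truncated above. For orientation, a hedged sketch of a univariate NIG fit with QRM (simulated returns stand in for data.csv; note that fit.NH fits a single series, the multivariate analogue being fit.mNH, so check ?fit.NH in your installed version):

library(QRM)

set.seed(1)
returns <- rnorm(1000, sd = 0.01)     # placeholder for one return series

fit <- fit.NH(returns, case = "NIG")  # univariate NIG fit
str(fit)                              # inspect the estimated parameters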
2010 Oct 13
0
mtrr error
I need your help; DELL's tech support doesn't provide any help on this one. We have a lot of different types of DELL desktops, from old hyper-threading CPUs to dual-core and quad-core CPUs (most are Xeons). We run all versions of CentOS, but most are on the latest 5.5 (also up-to-date), and we are very happy with that. The primary software on those Linux systems is IDL, which uses OpenGL