similar to: growing process size in simulation

Displaying 20 results from an estimated 5000 matches similar to: "growing process size in simulation"

2008 Jul 20
2
Error: cannot allocate vector of size 216.0 Mb
Please, I have a 2GB computer and a huge time series to embed, and I tried increasing memory.limit() and memory.size(max=TRUE), but nothing helped. Just before the command:
> memory.size(max=TRUE)
[1] 13.4375
> memory.limit()
[1] 1535.875
> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 209552  5.6     407500 10.9   350000  9.4
Vcells 125966  1.0     786432  6.0   496686  3.8
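A minimal sketch of the usual checks in this situation, assuming a 32-bit Windows build of R where memory.size()/memory.limit() exist (they are Windows-only and were removed in R 4.2); the 28e6-element vector is only an illustration of roughly the 216 Mb allocation the error reports:

memory.size(max = TRUE)      # maximum Mb obtained from the OS so far
memory.limit()               # current allocation limit in Mb
memory.limit(size = 3000)    # request a higher limit; 32-bit Windows tops out near 3 Gb
gc()                         # collect and report Ncells/Vcells usage
x <- numeric(28e6)           # ~214 Mb of doubles, about the size in the error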
2008 Sep 24
2
cannot allocate memory
I am getting "Error: cannot allocate vector of size 197 MB". I know that similar problems were discussed a lot already, but I didn't find any satisfactory answers so far! Details: *** I have XP (32 bit) with 4GB RAM. At the time the problem appeared I had 1.5GB of available physical memory. *** I increased the R memory limit to 3GB via memory.limit(3000) *** I did gc() and got
2009 Apr 26
6
Memory issues in R
How do people deal with R and memory issues? I have tried using gc() to see how much memory is used at each step. I scanned Crawley's R Book and all the other R books I have available, as well as the FAQ on-line, but found no real help. Running WinXP Pro (32 bit) with 4 GB RAM. One SATA drive pair is in RAID 0 configuration with 10000 MB allocated as virtual memory. I do have another machine
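One workflow people use (sketched here with made-up objects, not the poster's data) is to check memory with gc() and object.size() after each step and drop large intermediates before the next allocation:

step1 <- matrix(rnorm(1e6), ncol = 100)   # a largish intermediate object
print(object.size(step1), units = "Mb")   # how much this one object uses
gc()                                      # overall Ncells/Vcells usage and triggers

step2 <- colMeans(step1)                  # the result we actually want to keep
rm(step1)                                 # release the intermediate
gc()                                      # give the freed space back before the next step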
2004 Feb 29
1
LCG with modulo 2^30
I can't run a function which generates random numbers using a linear congruential generator. My multiplier is a = 5 + 8^6, the increment is b = 1 and the modulus is m = 2^30. The code I have written works for moduli up to m = 2^28. For m = 2^29 it says it cannot allocate memory for the vector, or something like that. For m = 2^31 or more, it says the argument "for i in 1:m" is too large in
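For what it's worth, a hedged sketch of such a generator (the parameter defaults are taken from the post, the rest is mine): it allocates only the n draws requested rather than a length-m vector, and keeps the state in double precision, where a*state stays well below 2^53 so the modulo arithmetic remains exact:

lcg <- function(n, seed = 1, a = 5 + 8^6, b = 1, m = 2^30) {
  x <- numeric(n)                   # only n values, never a length-m vector
  state <- seed
  for (i in seq_len(n)) {
    state <- (a * state + b) %% m   # a*state < 2^48, so the double stays exact
    x[i] <- state
  }
  x / m                             # uniform draws on [0, 1)
}
u <- lcg(10000)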
2010 Jul 07
3
Large discrepancies in the same object being saved to .RData
Hi developers, After some investigation I have found there can be large discrepancies in the size of the same object when it is saved to an external "xx.RData" file. The immediate repercussion of this is a possibly increased size of your .RData workspace for no apparent reason. The function and its three scenarios below highlight these discrepancies. Note that the object being returned is exactly
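Without seeing the function, one common cause of exactly this symptom (sketched below with hypothetical code) is that a returned closure drags its enclosing environment, including large local objects, into the serialized file:

make_small <- function() {
  big <- rnorm(1e6)                 # large local object
  f <- function(x) x + 1            # f's environment captures `big`
  f
}
f1 <- make_small()
f2 <- make_small()
environment(f2) <- globalenv()      # drop the captured environment (only safe if f2 doesn't need it)

save(f1, file = (t1 <- tempfile(fileext = ".RData")))
save(f2, file = (t2 <- tempfile(fileext = ".RData")))
file.info(c(t1, t2))$size           # megabytes vs. a few hundred bytes for the "same" function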
2006 May 12
4
bitwise addition
Hello all again, I want to do bitwise addition in R. I am trying to generate the matrix 0000, 0001, 0010, ..., 1111 (one 4-bit pattern per row). I know other ways of generating this matrix, but I need to do it via bitwise addition. Any suggestions? Thanks a lot, Nameeta
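A minimal sketch (assuming 4-bit words, which matches the 0000..1111 example): enumerate the patterns as rows of a matrix with expand.grid(), then treat bitwise addition as addition mod 2 on those rows:

bits <- as.matrix(expand.grid(rep(list(0:1), 4)))[, 4:1]   # rows 0000, 0001, ..., 1111
colnames(bits) <- paste0("b", 3:0)

bitadd <- function(u, v) (u + v) %% 2    # bitwise addition without carry (XOR)
bitadd(bits[4, ], bits[7, ])             # 0011 + 0110 -> 0101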
2007 Jan 17
3
R.oo Destructors
Has anyone figured out how to create a destructor in R.oo? How I'd like to use it: I have an object which opens a connection through RODBC (held as a private member). It would be nice if the connection closed automatically (inside the destructor) when the object gets gc()'ed. Thanks in advance. Regards, Ken BTW, a >BIG< thanks to Henrik Bengtsson for creating the R.oo package! Lucky
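I don't know the R.oo-specific answer, but a base-R sketch of the same idea uses reg.finalizer() on an environment holding the connection; the DSN and field names here are only illustrative:

library(RODBC)

make_db_handle <- function(dsn) {
  h <- new.env()
  h$channel <- odbcConnect(dsn)
  reg.finalizer(h, function(e) {
    if (!is.null(e$channel)) odbcClose(e$channel)   # runs when h is gc()'ed
  }, onexit = TRUE)                                  # and also at R shutdown
  h
}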
2000 Feb 11
1
astonishing memory phenomenon
I have a question concerning memory. I understood that R takes a fixed amount of memory at startup (which I can influence with --vsize and --nsize) and that gc() shows the memory still free of the total memory reserved for R. However, if I create a long vector of character data, gc() only seems to reflect the space needed for a vector of pointers to char; the space used for the character data itself
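One way to see what the character data really costs (a small illustration of mine, not from the original post) is to compare object.size() of the vector with what gc() reports before and after creating it:

gc(reset = TRUE)
x <- sprintf("string number %d", 1:1e5)   # 100,000 distinct strings
print(object.size(x), units = "Mb")       # pointer vector plus the string storage
gc()                                      # compare the change in Ncells/Vcells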
2007 Jun 26
1
Memory Experimentation: Rule of Thumb = 10-15 Times the Memory
Dear R experts: I am of course no R expert, but I use it regularly. I thought I would share some experimentation with memory use. I run a Linux machine with about 4GB of memory, and R 2.5.0. Upon startup, gc() reports
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 268755 14.4     407500 21.8   350000 18.7
Vcells 139137  1.1     786432  6.0   444750  3.4
This is my baseline. linux
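The kind of measurement behind such a rule of thumb can be sketched roughly like this (my own illustration): compare a file's on-disk size with the memory its data.frame occupies after reading:

f <- tempfile(fileext = ".csv")
write.csv(data.frame(x = runif(1e5), g = sample(letters, 1e5, TRUE)),
          f, row.names = FALSE)

gc(reset = TRUE)                          # reset the "max used" columns
d <- read.csv(f)
on_disk <- file.info(f)$size / 2^20       # Mb on disk
in_ram  <- as.numeric(object.size(d)) / 2^20
c(on_disk_Mb = on_disk, in_ram_Mb = in_ram, ratio = in_ram / on_disk)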
2002 Oct 14
1
R 1.6.0 Solaris crash with xmalloc: out of virtual memory
[Some de-capitalization of *SXP done manually by the mailing list maintainer; the original was caught as potential spam. MM] I have a little R program that crashes with the message "xmalloc: out of virtual memory". The code has a repeat{} loop that watches the sizes of some files. When there's an increase, it updates things by reading the last 65 lines of each file, doing some
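A hedged sketch of that loop (the file name and the processing are mine): read only the tail of each file and close connections explicitly, since leaked connections or ever-growing objects are the usual way a repeat{} watcher runs out of virtual memory:

tail_lines <- function(path, n = 65) {
  con <- file(path, open = "r")
  on.exit(close(con))               # always release the connection
  tail(readLines(con), n)
}

last_size <- -1
repeat {
  sz <- file.info("watched.log")$size
  if (!is.na(sz) && sz > last_size) {
    last_size <- sz
    recent <- tail_lines("watched.log")
    ## ... update things from `recent` ...
  }
  Sys.sleep(5)
}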
2004 Aug 18
1
Memory Problems in R
Hello everyone - I have a couple of questions about memory management of large objects. Thanks in advance for your response. I'm running R version 1.9.1 on Solaris 8, compiled as a 32-bit app. My system has 12.0 GB of memory, with usually ~ 11GB free. I checked system limits using ulimit, and there is nothing set that would limit the maximum amount of memory for a process (with the
2006 May 22
1
win2k memory problem with merge()'ing repeatedly (long email)
Good afternoon, I have 63 small .csv files which I process daily; until two weeks ago they processed just fine, only took a matter of moments, and caused no noticeable memory problems. Two weeks ago they reached 318 lines and my script "broke". There are some missing values in some of the files. I have tried hard many times over the last two weeks to create a
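A hedged sketch of the merging step (the directory and the merge key are assumptions on my part): read the 63 files into a list and fold merge() over it once, rather than growing a data frame with repeated merges inside a loop:

files  <- list.files("daily_csv", pattern = "\\.csv$", full.names = TRUE)
pieces <- lapply(files, read.csv, stringsAsFactors = FALSE)
merged <- Reduce(function(x, y) merge(x, y, by = "date", all = TRUE), pieces)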
2007 Aug 09
1
Memory Experimentation: Rule of Thumb = 10-15 Times the Memory
Hi, I've been having similar experiences and haven't been able to substantially improve the efficiency using the guidance in the I/O Manual. Could anyone advise on how to improve the following scan()? It is not based on my real file; please assume that I do need to read in characters and can't do any pre-processing of the file, etc. ## Create Sample File
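As a rough sketch of the kind of tuning the I/O manual suggests (the column layout and file name are assumptions): tell scan() the type of every field up front, give it the line count, and disable quote and comment processing:

nrec <- 1e6                                   # known or over-estimated number of lines
dat <- scan("sample.txt",
            what = list(id = character(), x = numeric(), y = numeric()),
            sep = "\t", nlines = nrec,
            quote = "", comment.char = "", multi.line = FALSE)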
2005 Dec 14
2
The fastest way to select and execute a few selected functions inside a function
Dear useRs, I have the following problem! I have a function that calls one or more other functions, depending on the input parameters. I am searching for the fastest way to select and execute the chosen functions and return their results in a list. The number of possible functions is 10, but usually only 2 are selected (although sometimes more, even all). For example, if I have function
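A minimal sketch of one common pattern (the statistics here are placeholders): keep the candidate functions in a named list, subset it by the selected names, and lapply over that subset so the results come back as a named list:

funs <- list(mean = mean, sd = sd, median = median, mad = mad)  # the 10 candidates would go here

run_selected <- function(x, which = c("mean", "sd")) {
  lapply(funs[which], function(f) f(x))       # named list of results
}

run_selected(rnorm(100), c("mean", "median"))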
2005 Jan 14
1
S3/S4 classes performance comparison
Hi R-devel, If you read my survey on Rhelp about reporting, you may have seen that I am implementing a way to handle outputs for R (main target output destinations: xHTML and TeX). In fact, I do have something that works for basic objects, entirely done with S4 classes, with the results visible at: http://www.stat.ucl.ac.be/ROMA/sample.htm http://www.stat.ucl.ac.be/ROMA/sample.pdf To
2006 May 16
2
Large database help
Hello all. I have a large .txt file whose variables are in fixed columns, i.e., variable V1 goes from columns 1 to 7, V2 from 8 to 23, etc. This is a 60GB file with 90 variables and 60 million observations. I'm working with a Pentium 4, 1GB RAM, Windows XP Pro. I tried the following code just to see if I could work with 2 variables, but it does not seem possible: R : Copyright 2005, The R Foundation
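One workable sketch under those constraints (the file name and the summary computed are assumptions of mine): stream the file with readLines() in chunks, cut out only columns 1-7 and 8-23 with substr(), and keep only running summaries, so the 60GB file never has to fit in 1GB of RAM:

con <- file("big_fixed_width.txt", open = "r")
total <- 0; n_obs <- 0
repeat {
  block <- readLines(con, n = 100000)          # one chunk of lines at a time
  if (length(block) == 0) break
  v1 <- as.numeric(substr(block, 1, 7))        # V1: columns 1-7
  v2 <- as.numeric(substr(block, 8, 23))       # V2: columns 8-23
  total <- total + sum(v2, na.rm = TRUE)       # keep running summaries only
  n_obs <- n_obs + sum(!is.na(v2))
}
close(con)
total / n_obs                                  # e.g. the mean of V2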
2002 Apr 29
1
Garbage collection: RW1041
Have searched through the archives but have been unable to find any related issues - hopefully I'm not bringing up an old topic. Am using RW1041 on a Windows NT machine with 1Gb of memory. Have a function doit() that reads in a chunk of data using readBin, performs a regression, saves out the coefficients and then returns. When using Rgui with the default memory limit of 256Mb I'm able to
2004 Jul 03
4
counting the occurrences of vectors
Hi: I have two matrices, A and B, where A is n x k, and B is m x k, where n >> m >> k. Is there a computationally fast way to count the number of times each row (a k-vector) of B occurs in A? Thanks for any suggestions. Best, Ravi.
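A minimal sketch of one fast approach (the function name is mine): collapse each row into a single key string, tabulate the keys of A once, and look the keys of B up by name, so nothing is compared row by row:

count_rows <- function(A, B) {
  keyA <- do.call(paste, c(as.data.frame(A), sep = "\r"))
  keyB <- do.call(paste, c(as.data.frame(B), sep = "\r"))
  tab  <- table(keyA)
  out  <- as.integer(tab[keyB])      # NA where a row of B never occurs in A
  out[is.na(out)] <- 0L
  out                                # one count per row of B
}

A <- matrix(sample(0:2, 300, replace = TRUE), ncol = 3)
B <- matrix(sample(0:2, 30,  replace = TRUE), ncol = 3)
count_rows(A, B)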
2006 Nov 06
2
gc()$Vcells < 0 (PR#9345)
Full_Name: Don Maszle Version: 2.3.0 OS: x86_64-unknown-linux-gnu Submission from: (NULL) (206.86.87.3) # On our new 32 GB x86_64 machine R : Copyright 2006, The R Foundation for Statistical Computing Version 2.3.0 (2006-04-24) ISBN 3-900051-07-0 R is free software and comes with ABSOLUTELY NO WARRANTY. You are welcome to redistribute it under certain conditions. Type 'license()' or
2005 Nov 15
1
cannot.allocate.memory.again and 32bit<--->64bit
Hello! ------ I use a 32-bit Linux (SuSE) server, so I'm limited to about 3.5Gb of memory. I can demonstrate that, from time to time, there is a problem allocating objects of large size, for example: 0. state (no objects yet created) ------------------------------------
> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 162070  4.4     350000  9.4   350000