similar to: cannot increase memory size to 4Gb (PR#11087)

Displaying 20 results from an estimated 600 matches similar to: "cannot increase memory size to 4Gb (PR#11087)"

2009 Jun 15
3
lack of memory for logistic regression in R?
Hi all, I am getting the following error message: > mymodel = glm(response ~ . , family=binomial, data=C); Error: cannot allocate vector of size 734.2 Mb In addition: Warning messages: 1: In array(0, c(n, n), list(levs, levs)) : Reached total allocation of 1535Mb: see help(memory.size) 2: In array(0, c(n, n), list(levs, levs)) : Reached total allocation of 1535Mb: see help(memory.size) 3:
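The warnings point at array(0, c(n, n), list(levs, levs)), which usually means a factor with many levels is inflating the model matrix. A minimal diagnostic sketch, assuming the data frame is called C as in the post (the column inspection itself is generic):

## Hedged sketch: find high-cardinality factors, since an n-by-n contrast
## array for a factor with many levels can exhaust 32-bit Windows memory.
n_levels <- sapply(C, function(col) if (is.factor(col)) nlevels(col) else NA)
sort(n_levels, decreasing = TRUE)[1:5]   # worst offenders first
memory.limit()                           # current ceiling in Mb (Windows)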
2004 Dec 09
2
a question about swap space, memory and read.table()
Hi all, Two computers: one is my desktop PC, Windows 2000, R 1.9.1, physical RAM 256 MB, swap (virtual memory) 384 MB. When I allocate a large matrix, it first uses up RAM, then uses swap space. In Windows' task manager, the memory usage can exceed my physical RAM's size. The other machine is a remote server, Windows XP, R 1.9.1, physical RAM 2 GB, swap space 4 GB. I use "R
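On Windows, memory.size() and memory.limit() report the session's current usage and cap, and read.table() can be told the column types and row count up front so the import does not overshoot RAM into swap. A sketch, where the file name "big.txt" and the column classes are assumptions:

## Hedged sketch for importing a large delimited file
memory.size()    # Mb currently used by this R session (Windows only)
memory.limit()   # current cap in Mb
dat <- read.table("big.txt", header = TRUE,
                  colClasses = c("integer", "numeric", "factor"),  # skip type re-guessing
                  nrows = 500000,       # pre-sizing avoids repeated reallocation
                  comment.char = "")    # skips comment scanning overhead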
2007 May 07
4
Mardia's multivariate normality test
Dear all, I got this error message > library(dprep) > mardia(Savg) Error in cov(data) : 'x' is empty But with the same data, I got > library(mvnormtest) > mshapiro.test(Savg) Shapiro-Wilk normality test data: Z W = 0.9411, p-value = 0.6739 What does the error message "Error in cov(data) : 'x' is empty" mean? Thanks a lot! Jiao
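The "'x' is empty" error means cov() received an object with no columns; one possible (unverified) explanation is that mardia() in dprep strips columns it treats as class labels. For comparison, mvnormtest::mshapiro.test() expects a numeric matrix with variables in rows, which is why a transpose is usually needed. A hedged usage sketch with a made-up matrix X:

## Hedged sketch; X is a hypothetical 30-by-3 matrix of observations
library(mvnormtest)
X <- matrix(rnorm(30 * 3), nrow = 30, ncol = 3)
mshapiro.test(t(X))   # variables must be in rows, hence t()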
2006 Jul 16
1
compiz developers
Hi Colin Gutrie: Thanks for your reply about the alt-gr problem. And great!!!! I'm also a developer (an OpenGL developer), so I'm very happy to find a compiz developer list... I'm working on some ideas for plugins (for now only in plain OpenGL, with nothing compiz- or xgl-specific), so I'm very interested in working with you. Is this possible?? Please guide me on this... what do I have to do to set
2010 Nov 03
2
memory allocation problem
Hi R users, I am trying to run a non-linear parameter optimization using the function optim() and I have problems regarding memory allocation. My data are in a data frame with 9 columns and 656,100 rows. >head(org_results) comb.id p H1 H2 Range Rep no.steps dist aver.hab.amount 1 1 0.1 0 0 1 100 0 0.2528321
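With 656,100 rows the usual culprit is the objective function building large temporaries on every evaluation rather than optim() itself, since the parameter vector stays small. A minimal sketch, where the objective, the model form, and the starting values are all hypothetical; the column names come from the head() output above:

## Hedged sketch: pass the big data frame once through optim's '...'
## so each evaluation of the objective reuses it in place.
neg_ss <- function(par, df) {
  pred <- par[1] * exp(-par[2] * df$dist)   # hypothetical model
  sum((df$aver.hab.amount - pred)^2)        # least-squares criterion
}
fit <- optim(par = c(1, 0.1), fn = neg_ss, df = org_results,
             method = "L-BFGS-B", lower = c(0, 0))
fit$par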
2006 Jul 11
1
Patch for Java app and Compiz
Hi to all: My name is Mauricio and I write from Chile. I'm new on the list, so hi to all. I'm using Xgl and compiz and it is an amazing technology; even as "alpha" software it works almost perfectly on my Inspiron 6400-ATIX1300. I only have 2 problems. The first is related to some Java apps and compiz (like LimeWire), so I read on the list something about a patch to try to solve this, I
2009 Apr 26
6
Memory issues in R
How do people deal with R and memory issues? I have tried using gc() to see how much memory is used at each step. I have scanned Crawley's R Book, all the other R books I have available, and the on-line FAQ, but found no real help. Running WinXP Pro (32-bit) with 4 GB RAM. One SATA drive pair is in RAID 0 configuration with 10000 MB allocated as virtual memory. I do have another machine
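On 32-bit Windows a single R process tops out around 2 to 3 GB regardless of how much virtual memory the RAID pair provides, so a useful first step is to see what the session is actually holding. A diagnostic sketch:

## Hedged diagnostic sketch for a 32-bit Windows session
gc()                      # trigger garbage collection and report usage
memory.size()             # Mb currently committed to this R process
memory.size(max = TRUE)   # high-water mark so far
memory.limit()            # ceiling in Mb (roughly 2047-3071 on 32-bit Windows)
## Largest objects in the workspace, in Mb
sort(sapply(ls(), function(x) object.size(get(x))), decreasing = TRUE)[1:10] / 2^20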
2009 May 30
1
A problem about "nlminb"
Hello everyone! When I use "nlminb" to minimize a function with a variable of almost 200,000 dimensions, I got the following error. > nlminb(start=start0, msLE2, control = list(x.tol = .001)) Error in vector("double", length) : vector size specified is too large I had the following settings: options(expressions=60000); options(object.size=10^15) I have no idea about what
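The "vector size specified is too large" error most likely comes from an internal workspace that grows with the square of the parameter length, not from anything options() controls. A back-of-the-envelope check shows why 200,000 parameters is problematic for any method that touches a dense Hessian-sized array:

## Hedged arithmetic: a dense 200,000 x 200,000 double matrix
p <- 2e5
p^2 * 8 / 2^30                 # ~298 GiB for one Hessian-sized matrix
p^2 > .Machine$integer.max     # TRUE: also exceeds the pre-R-3.0 vector length limit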
2010 Aug 21
2
vector allocation error
I am running an analysis of sequencing data using the edgeR package. I have received the following error: Using grid search to estimate tagwise dispersion. Error: cannot allocate vector of size 307.3 Mb indicating the memory allocation is too small. How would I change this configuration in R so that the script can run with the files I have? Help appreciated, Josquin
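On 32-bit Windows the per-session ceiling can sometimes be raised from within R, but only up to what the OS grants a 32-bit process; beyond that the practical fix is 64-bit R. A hedged sketch (the requested size is an example value):

## Hedged sketch: try raising the session limit before rerunning the
## edgeR dispersion estimation (values in Mb; Windows only).
memory.limit()              # current ceiling
memory.limit(size = 4000)   # request a higher ceiling, if the OS allows it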
2010 Aug 17
1
TM Package - Corpus function - Memory Allocation Problems
I'm using R 2.11.1 on Win XP (32-bit) with 3 GB of RAM. My data is (only) 16.0 MB. I want to create a VCorpus object using the Corpus function in the tm package but I'm running into memory allocation issues: "Error: cannot allocate vector of size 372 Kb". My data is stored in a csv file which I've imported with "read.csv" and then used the following to create
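A common pattern with tm is to build the corpus from a character column of the imported data frame via VectorSource(). A sketch, where the file name "mydata.csv" and the column name 'text' are assumptions:

## Hedged sketch; file and column names are hypothetical
library(tm)
df <- read.csv("mydata.csv", stringsAsFactors = FALSE)
corp <- Corpus(VectorSource(df$text))   # one document per row of 'text'
corp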
2011 Mar 22
4
memory increasing
Dear All, I am an Italian researcher in Economics. I work with large sample data. I need to increase the memory available to R in order to load a ".dta" file. How can I do this? Thank you. graziella
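Stata .dta files can be read with read.dta() from the foreign package; if the resulting object fits in RAM, loading it may only need a higher Windows session limit. A sketch, where the file path and the requested limit are hypothetical:

## Hedged sketch; the file name is hypothetical
library(foreign)
memory.limit(size = 4000)            # Windows only; request more Mb if the OS allows
dat <- read.dta("survey_data.dta")   # read.dta is in the foreign package
dim(dat)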
2009 Sep 28
1
Windows Laptop specification query
I've read some postings back in 2002/2006 about running R on multiple-core CPUs. The answer was basically that separate processes work fine, but parallelization needs to be implemented using snow/Rmpi. Are the answers still the same? I ask because we are about to order a laptop running Windows for a new staff member. Some advice on the following would be helpful. It will be ordered with Vista,
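The snow-based pattern referred to above still spreads independent R computations over a laptop's cores. A minimal socket-cluster sketch (the worker count and toy task are placeholders):

## Hedged sketch: one worker per core via a socket cluster (package 'snow')
library(snow)
cl <- makeCluster(2, type = "SOCK")          # 2 workers; adjust to the core count
res <- parLapply(cl, 1:8, function(i) i^2)   # independent tasks run in parallel
stopCluster(cl)
unlist(res)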
2010 Nov 19
2
help
Hello, I have the database of an agricultural census, which is in SPSS format. When I run the command to import it into R, after R stays sluggish for a while, this appears: Error: cannot allocate vector of size 1.6 Mb. It gets this far and does not import a single observation. My question is whether there is any solution, perhaps increasing R's memory?? I imagine that, as you have
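The SPSS file can be imported with read.spss() from the foreign package; a 1.6 Mb allocation failure suggests the session is already near its limit, so checking usage first is worthwhile. A sketch, where the .sav file name is hypothetical:

## Hedged sketch; the file name is hypothetical
library(foreign)
memory.size()   # how much the session already holds (Windows, in Mb)
censo <- read.spss("censo_agropecuario.sav", to.data.frame = TRUE)
dim(censo)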
2007 May 21
1
size limit in R?
Hi, Please see the email exchanges below. I am having trouble generating output that is large enough for our needs, specifically when using the GaussRF function. However, when I wrote Dr. Schlather (the author of the GaussRF function), he indicated that there is also a limit imposed by R itself. Is this something that we can overcome? Thank you very much for any assistance you may provide.
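The R-imposed limit referred to is most plausibly the 2^31 - 1 element cap on a single vector in R versions of that era (with the 32-bit address space on top of it). A quick worked check makes the ceiling concrete; the grid sizes are illustrative:

## Hedged arithmetic: vector-length ceiling in pre-3.0 R
.Machine$integer.max             # 2147483647 elements per vector
2000^2 * 8 / 2^20                # a 2000 x 2000 double grid: ~30.5 Mb, fine
50000^2 > .Machine$integer.max   # TRUE: a 50000 x 50000 grid cannot even be indexed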
2008 Sep 02
2
receiving "Error: cannot allocate vector of size 1.5 Gb"
Dear all, In my attempt to run the below modelling command in R 2.7.0 under Windows XP (4 GB RAM with the /3GB switch set) I receive the following error: Error: cannot allocate vector of size 1.5 Gb I have searched a bit and have tried adding --max-mem-size=3071M to the command line (when set to 3G I get the error that 3072M is too much). I also ran: > memory.size() [1] 11.26125 >
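With the /3GB switch active, a 32-bit R session can usually be capped at about 3071 Mb; whether the --max-mem-size flag took effect can be verified from inside R. A sketch:

## Hedged check that --max-mem-size=3071M was picked up
memory.limit()            # should report roughly 3071 on this setup
memory.size(max = TRUE)   # peak Mb actually committed so far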
2009 Nov 19
1
advice about R for windows speed
Dear All, I appreciate any advice or hints you could provide about the following. We are running R code on a server (running Windows XP and quad-core Xeon processors, see details below) and we would like to use the server efficiently. Our code takes a bit more than 6 seconds per 25 iterations on the server using a default R 2.10.0 installation. We tested our code on two other computers, a Dell
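Before comparing machines it helps to time and profile the same 25-iteration chunk on each of them. A sketch, where one_iteration() is a hypothetical stand-in for the real per-iteration code:

## Hedged sketch; one_iteration() is a placeholder
system.time(for (i in 1:25) one_iteration())   # wall-clock and CPU seconds
Rprof("iters.out")                             # sample where the time goes
for (i in 1:25) one_iteration()
Rprof(NULL)
head(summaryRprof("iters.out")$by.self)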
2012 May 20
4
R Memory Issues
---------- Forwarded message ---------- From: Emiliano Zapata <ezapataika@gmail.com> Date: Sun, May 20, 2012 at 12:09 PM Subject: To: R-help@r-project.org Hi, I have a 64-bit machine (Windows) with a total of 192 GB of physical memory (RAM) and a total of 8 CPUs. I wanted to ask how I can make R use all the memory. I recently ran a script requiring approximately 92 GB of memory to
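A 64-bit build of R will already address essentially all 192 GB; the thing to verify is that the 64-bit binary, not a 32-bit one, is what is actually running. A sanity-check sketch:

## Hedged sanity check that this is 64-bit R
.Machine$sizeof.pointer   # 8 on a 64-bit build, 4 on 32-bit
R.version$arch            # "x86_64" expected here
memory.limit()            # on 64-bit Windows this roughly reflects installed RAM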
2012 Apr 10
2
Error: cannot allocate vector of size...
Hello: While running analyses of my data in R (using packages such as BIOMOD or e1071) I get the following error in several of my analyses: Error: cannot allocate vector of size 998.5 Mb In addition: Warning messages: 1: In array(c(rep.int(c(1, numeric(n)), n - 1L), 1), d, dn) : Reached total allocation of 4095Mb: see help(memory.size) 2: In array(c(rep.int(c(1,
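When several analyses run in the same session, intermediate objects from the earlier ones keep the 4095 Mb allocation pinned; dropping them and collecting between runs often clears enough room. A sketch, where the object names are hypothetical:

## Hedged sketch; object names are hypothetical
rm(fit_biomod_run1, pred_raster_stack)   # drop finished intermediates
gc()                                     # return the freed pages to the allocator
memory.size()                            # confirm the session footprint dropped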
2009 Jan 09
7
Desperate question about MPXIO with ZFS-iSCSI
I'm trying to set up an iscsi connection (with MPXIO) between my Vista64 workstation and a ZFS storage machine running OpenSolaris 10 (forget the exact version). On the ZFS machine, I have two NICs. NIC #1 is 192.168.1.102, and NIC #2 is 192.168.2.102. The NICs are connected to two separate switches serving two separate IP spaces. On my Vista64 machine, I also have two NICs connected in
2004 May 27
2
Tape drive problems
Hi, I have been googling and can't find anything that will help me: mt -f /dev/st0 status /dev/st0: No such device or address Any suggestions? I am using CentOS 3 [RHEL ES3 without the licenses]. The tape drive is recognized at boot [from dmesg]: scsi0 : Adaptec AIC7XXX EISA/VLB/PCI SCSI HBA DRIVER, Rev 6.2.36 <Adaptec 29160 Ultra160 SCSI adapter> aic7892: