Displaying 18 results from an estimated 18 matches for "uses_0021".
2009 Jun 15
3
lack of memory for logistic regression in R?
Hi all,
I am getting the following error message:
> mymodel = glm(response ~ . , family=binomial, data=C);
Error: cannot allocate vector of size 734.2 Mb
In addition: Warning messages:
1: In array(0, c(n, n), list(levs, levs)) :
Reached total allocation of 1535Mb: see help(memory.size)
2: In array(0, c(n, n), list(levs, levs)) :
Reached total allocation of 1535Mb: see help(memory.size)
3:
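An aside on the failure above: with family=binomial, glm() materialises the full model matrix at once, and the repeated array(0, c(n, n)) warnings hint at a factor with a very large number of levels. A hedged sketch of a chunked alternative, not the poster's code (the predictor names are placeholders, since bigglm() may not expand "~ ."):
```r
## bigglm() from the biglm package fits the model in passes over the data,
## so the full model matrix never has to be in memory at once.
library(biglm)

fit <- bigglm(response ~ x1 + x2,   # spell out predictors; x1, x2 hypothetical
              data      = C,        # the poster's data frame
              family    = binomial(),
              chunksize = 5000)     # rows processed per pass
summary(fit)
```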
2004 Dec 09
2
a question about swap space, memory and read.table()
Hi all
Two computers:
One is my desktop PC: Windows 2000, R 1.9.1, 256 MB physical RAM, 384 MB
swap (virtual memory). When I allocate a large matrix, it first uses up
RAM, then uses swap space; in Windows' Task Manager, the memory usage can
exceed my physical RAM's size.
The other machine is a remote server: Windows XP, R 1.9.1, 2 GB physical RAM.
Swap space 4GB. I use "R
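For comparing the two machines, the Windows-specific memory queries of that era are the quickest check (these functions were Windows-only and are defunct in R >= 4.2):
```r
memory.size()            # MB of RAM currently used by this R session
memory.size(max = TRUE)  # peak MB obtained from the OS so far
memory.limit()           # current ceiling in MB
memory.limit(size = 2047)  # attempt to raise the ceiling (subject to OS limits)
```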
2007 May 07
4
Mardia's multivariate normality test
Dear all,
I got this error message
> library(dprep)
> mardia(Savg)
Error in cov(data) : 'x' is empty
But with the same data, I got
> library(mvnormtest)
> mshapiro.test(Savg)
Shapiro-Wilk normality test
data: Z
W = 0.9411, p-value = 0.6739
What does the error message "Error in cov(data) : 'x' is empty" mean? Thanks a lot!
Jiao
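The message means the object that reached cov() contained no usable data, typically because it has zero rows or columns after internal filtering. A few quick diagnostics, as a sketch (Savg as in the thread):
```r
dim(Savg)                  # should be n x p with n, p > 0
str(Savg)                  # are the columns numeric?
sum(complete.cases(Savg))  # rows remaining after NA removal
cov(as.matrix(Savg))       # does cov() itself accept the data?
```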
2010 Nov 03
2
memory allocation problem
Hi R users
I am trying to run a non-linear parameter optimization using the
function optim(), and I have problems regarding memory allocation.
My data are in a dataframe with 9 columns. There are 656100 rows.
>head(org_results)
comb.id p H1 H2 Range Rep no.steps dist aver.hab.amount
1 1 0.1 0 0 1 100 0 0.2528321
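A 656100 x 9 data frame is itself modest (tens of MB), so allocation failures during optim() usually come from copies made inside the objective function. A sketch of one way to keep the objective lean; the objective itself is hypothetical, using column names from the output above:
```r
## check the data's own footprint first
print(object.size(org_results), units = "Mb")

## pass the data once, as a numeric matrix, through optim()'s ... argument
X <- as.matrix(org_results[, sapply(org_results, is.numeric)])

obj <- function(par, X) {
  ## hypothetical least-squares objective operating on X in place
  sum((X[, "dist"] - par[1] * exp(-par[2] * X[, "Range"]))^2)
}

fit <- optim(par = c(1, 0.1), fn = obj, X = X)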
2009 Apr 26
6
Memory issues in R
How do people deal with memory issues in R?
I have tried using gc() to see how much memory is used at each step.
I have scanned Crawley's R Book, all the other R books I have available,
and the online FAQ, but found no real help.
Running WinXP Pro (32 bit) with 4 GB RAM.
One SATA drive pair is in RAID 0 configuration with 10000 MB allocated as
virtual memory.
I do have another machine
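For the gc()-based bookkeeping the poster describes, a minimal pattern looks like this (the matrix is only a stand-in for a memory-hungry step):
```r
gc(reset = TRUE)          # reset the "max used" statistics
big <- matrix(rnorm(1e6), ncol = 100)   # ...some memory-hungry step...
gc()                      # "max used" now shows the peak since the reset
rm(big); gc()             # release the object and confirm memory came back
```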
2009 May 30
1
A problem about "nlminb"
Hello everyone!
When I use "nlminb" to minimize a function with a variable of almost 200,000
dimension, I got the following error.
> nlminb(start=start0, msLE2, control = list(x.tol = .001))
Error in vector("double", length) : vector size specified is too large
I had the following setting
options(expressions=60000)
options(object.size=10^15)
I have no idea about what
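Background for the error, as a sketch: a single R vector of that era could hold at most 2^31 - 1 elements, and nlminb()'s PORT workspace grows roughly quadratically in the number of parameters, so ~200,000 dimensions requests a vector past that limit before any memory is actually allocated:
```r
.Machine$integer.max   # 2147483647 -- the historical vector-length ceiling
n <- 2e5
n * (n + 1) / 2        # ~2e10, the order of a triangular n x n workspace
```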
2010 Aug 21
2
vector allocation error
I am running an analysis of sequencing data using the edgeR package. I have
received the following error:
Using grid search to estimate tagwise dispersion. Error: cannot allocate
vector of size 307.3 Mb
indicating the memory allocation is too small. How would I change this
configuration in R so that the script can run with the files I have?
Help appreciated, Josquin
2008 Apr 04
1
cannot increase memory size to 4Gb (PR#11087)
Full_Name: Nick Henriquez
Version: 2.6.2; 2.6.1 and 2.4.1
OS: Vista business 64
Submission from: (NULL) (144.82.49.16)
I tried to increase the memory size by copy-pasting --max-mem-size=1Gb from
the FAQ to the end of the Target field (after any final double quote, and
separated by a space), changing the 1 to a 4.
After double-clicking the shortcut (for modified 2.4.1 OR 2.6.1 OR 2.6.2) I get
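An in-session alternative to editing the shortcut, for Windows builds of that era (these functions are defunct in R >= 4.2). The 32-bit builds topped out just under 4 GB; the rw-FAQ of the time quotes 4095M on 64-bit Windows, which would explain why --max-mem-size=4Gb (i.e. 4096M) is rejected:
```r
memory.limit()             # current ceiling in MB
memory.limit(size = 4095)  # just under 4 GB; 4096 and above fails
```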
2010 Aug 17
1
TM Package - Corpus function - Memory Allocation Problems
I'm using R 2.11.1 on Win XP (32-bit) with 3 GB of RAM. My data is
(only) 16.0 MB.
I want to create a VCorpus object using the Corpus function in the tm
package but I'm running into Memory allocation issues: "Error: cannot
allocate vector of size 372 Kb".
My data is stored in a csv file which I've imported with "read.csv" and
then used the following to create
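A failure to allocate only 372 Kb on 32-bit R usually signals a fragmented address space rather than data that is genuinely too large. A minimal corpus-building sketch; the file name and the "text" column are assumptions:
```r
library(tm)

df   <- read.csv("mydata.csv", stringsAsFactors = FALSE)  # hypothetical file
corp <- VCorpus(VectorSource(df$text))  # one document per row of the column
corp
```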
2011 Mar 22
4
memory increasing
Dear All,
I am an Italian researcher in Economics. I work with large sample data. I
need to increase the memory in R-project in order to upload a file ".dta".
How can I do this?
Thank you.
graziella
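For the .dta question above, a minimal sketch (the file name is hypothetical): read.dta() lives in the foreign package that ships with R, and the memory limit only needs raising if the import itself fails with an allocation error.
```r
library(foreign)

dat <- read.dta("census.dta")  # hypothetical path to the Stata file
str(dat)
## if the import fails on a 32-bit Windows build of that era:
## memory.limit(size = 4095)
```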
2009 Sep 28
1
Windows Laptop specification query
I've read some postings back in 2002/2006 about running R on multiple-core
CPUs. The answer was basically that separate processes work fine, but
parallelization needs to be implemented using snow/Rmpi. Are the answers
still the same?
I ask because we are about to order a laptop running Windows for a new
staff member. Some advice on the following would be helpful.
It will be ordered with Vista,
2010 Nov 19
2
help
Hello,
I have the database of an agricultural census, which is in SPSS
format. When I run the command to import it into R, after R has been
sluggish for a while, this appears:
Error: cannot allocate vector of size 1.6 Mb
It gets that far and does not import a single observation. My question is
whether there is any solution, maybe increasing R's memory? I imagine
that, as you have
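A sketch for the SPSS import described above; the file name is hypothetical. Reading straight to a data frame avoids keeping both the SPSS list representation and the data frame in memory at once:
```r
library(foreign)

censo <- read.spss("censo_agropecuario.sav", to.data.frame = TRUE)
dim(censo)
```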
2007 May 21
1
size limit in R?
Hi,
Please see the email exchanges below. I am having trouble generating output that is large enough
for our needs, specifically when using the GaussRF function. However, when I wrote Dr. Schlather
(the author of the GaussRF function), he indicated that there is also a limit imposed by R itself.
Is this something that we can overcome?
Thank you very much for any assistance you may provide.
2008 Sep 02
2
receiving "Error: cannot allocate vector of size 1.5 Gb"
Dear all,
In my attempt to run the below modelling command in R 2.7.0 under Windows XP (4GB RAM with the /3GB switch set) I receive the following error:
Error: cannot allocate vector of size 1.5 Gb
I have searched a bit and have tried adding: --max-mem-size=3071M to the command line (when set to 3G I get the error that 3072M is too much)
I also run:
> memory.size()
[1] 11.26125
>
2009 Nov 19
1
advice about R for windows speed
Dear All,
I appreciate any advice or hints you could provide about the following.
We are running R code in a server (running Windows XP and QuadCore Xeon
processors, see details below) and we would like to use the server
efficiently. Our code takes a bit more than 6 seconds per 25 iterations in
the server using a default R 2.10.0 installation.
We tested our code in two other computers, a Dell
2012 May 20
4
R Memory Issues
---------- Forwarded message ----------
From: Emiliano Zapata <ezapataika@gmail.com>
Date: Sun, May 20, 2012 at 12:09 PM
Subject:
To: R-help@r-project.org
Hi,
I have a 64-bit machine (Windows) with a total of 192 GB of physical memory
(RAM) and a total of 8 CPUs. I wanted to ask how I can make R use all of
the memory. I recently ran a script requiring approximately 92 GB of memory
to
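A sketch of the two checks that matter on a machine like this: confirm the session is a 64-bit build (a 32-bit R cannot address 192 GB no matter what), and use the parallel package to spread work over the 8 CPUs, since ordinary R code runs on a single core:
```r
.Machine$sizeof.pointer   # 8 means a 64-bit build of R

library(parallel)
cl  <- makeCluster(8)     # one worker per CPU; each gets its own memory
res <- parLapply(cl, 1:8, function(i) sum(rnorm(1e6)))  # toy workload
stopCluster(cl)
```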
2006 Jan 05
4
Q: R 2.2.1: Memory Management Issues?
...ult on Linux systems
and may allow some larger programs to run without crashes. ...
------------------------------------------------------------------------------------
and also from the Windows FAQ [http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021]:
------------------------------------------------------------------------------------
2.9 There seems to be a limit on the memory it uses!
Indeed there is. It is set by the command-line flag --max-mem-size (see How do I install R for Windows?) and defaults to the smaller of the amount of physical...
2009 Jun 15
0
books on Time series
...emory. Thank you.
>>>
>>> It's certainly not too large for R. Have you looked at the R
>>> Windows FAQ
>>> on the topic?
>>>
>>>
>>> http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021
>>>
>>> ... and perhaps:
>>>
>>> http://finzi.psych.upenn.edu/Rhelp08/2008-August/171649.html
>>>
>>>
>>> David Winsemius, MD
>>> Heritage Laboratories
>>> West Hartford, CT
>>>
>>> __________________...