Displaying 20 results from an estimated 8000 matches similar to: "Problems in batch mode"
1999 Apr 27
2
Memory management
Dear all,
I don't get it:
First of all, the help doesn't say what the memory limits of
R are. What, for instance, is the maximum heap size?
Secondly, I invoke R with the following commands each time:
rgui --vsize 30M --nsize 1000K
rgui --vsize 30M --nsize 2000K
rgui --vsize 30M --nsize 3000K
rgui --vsize 30M --nsize 4000K
I try to open an 8000x8000 matrix by issuing
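For scale, an 8000 x 8000 numeric matrix alone needs roughly 488 MB of vector heap, far more than the 30 MB requested in any of the commands above:
> 8000 * 8000 * 8 / 2^20    # 8 bytes per double; ~488 MB for the matrix alone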
2002 Apr 12
1
Problems with memory
Dear all,
I started working with R (vs 1041) a few weeks ago, and now I'm
having problems with the amount of memory.
I'm working on Windows ME; my computer has 128 MB of memory. I'm
using R under Emacs (ESS 5.1.20), and it is started by the
command:
Rterm --min-vsize=10M --max-vsize=100M --min-nsize=500k --max-nsize=1M
I've been having problems when executing a
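A quick first check on a Windows build of R is how much memory the session actually uses relative to the limits it was started with; a sketch (memory.size() is Windows-only):
> gc()                      # current Ncells/Vcells usage and trigger levels
> memory.size()             # MB currently used by the R process
> memory.size(max = TRUE)   # maximum MB obtained from Windows so far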
2010 Jan 19
2
Server hanging despite efforts to correct memory limits
My group is working with datasets between 100 MB and 1 GB in size, using
multiple logins. From the documentation, it appears that vsize is limited
to 2^30-1, which tends to prove too restrictive for our use. When we drop
that restriction (set vsize = NA) we end up hanging the server, which
requires a restart. Is there any way to increase the memory limits on R
while keeping our jobs from
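One way to keep a runaway session from hanging the whole server is an operating-system ceiling on each login, so that oversized allocations fail with an error instead of pushing the machine into swap; a sketch (the 16 GB figure is only an example):
ulimit -v 16777216    # cap each session's address space at 16 GB (value in KB) before starting R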
1999 Nov 12
1
R-0.65.1 Startup
Dear R users,
I have noticed that my R startup is extremely slow. It takes almost 3
minutes from "double-click" to R prompt. I have been running R-0.64.1 till
recently and it took about 30 sec. I still have access to R-0.64.1. When I
started it up, it took about 25 sec. Can anyone tell me if this is a bug in
R or a problem with my machine?
Note: This is after bootup with R being the
2001 Aug 22
1
Huge workspace cannot be opened
Hi everyone,
I have a problem that some people may have already encountered, but I have not
found a solution yet.
As I use R to simulate several arrays of data, my workspace is now 35 MB and
I cannot launch R with it.
An "xdr real data read error occurred", and R tells me to delete .RData or
increase memory. I WON'T delete this file, and changing max-nsize to 40600k
did not
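Since a 35 MB .RData file is essentially vector data, it is the vector heap (vsize), not nsize, that normally has to grow before it can be read back. A hedged sketch is to start without restoring the workspace and then load it by hand once a larger heap is in place (the 200M figure is only an example):
R --no-restore-data --max-vsize=200M
> load(".RData")    # restore the saved workspace manually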
2009 Nov 30
1
allocating vector memory > 1 GByte on Windows XP / Vista / 7
Let me begin by stating that I read all the help files and FAQs on the subject
matter (there aren't more than about a dozen) but either did not find
solutions or found them not to work.
Here is the issue. I am trying to run a spatial regression on a
medium-sized dataset. Part of the functions in the spdep package I use
require me to allocate a vector of 1.1 Gb (mine is not a spatial SIG
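On a 32-bit Windows build the usual first checks are the current allocation ceiling and whether the OS will grant more; a sketch (memory.limit() is Windows-only, and a 32-bit process cannot go much past 2-3 GB however it is configured):
> memory.limit()             # current limit in MB
> memory.limit(size = 3000)  # request roughly 3 GB, if the OS allows it
For single vectors above 1 GB, a 64-bit build of R is the more reliable route.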
2005 Jun 29
3
Memory Management under Linux: Problems to allocate large amounts of data
Dear Group
I'm still trying to bring a lot of data into R (see older postings). After solving some trouble with the database I do most of the work in MySQL, but it would still be nice to work on some of the data in R. For this I can use a dedicated server with Gentoo Linux as its OS, hosting only R. This server is a nice machine with two CPUs and 4 GB RAM, which should do the job:
Dual Intel XEON 3.06 GHz
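If only part of the data is needed in R at any one time, pulling subsets out of MySQL keeps the R heap small; a sketch using the DBI and RMySQL packages (the database, table and column names are made up for illustration):
> library(RMySQL)
> con <- dbConnect(MySQL(), dbname = "mydb")
> sub <- dbGetQuery(con, "SELECT id, x, y FROM big_table WHERE year = 2004")
> dbDisconnect(con)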
1999 May 15
2
vsize and nsize
I am running R version ??? under Red Hat 5.2. It seems as though the
--nsize option has no effect on the size of the allocated Ncells as
determined using gc(). Yes, I have that much data....
That is, if I invoke R with
R --vsize 100 --nsize 5000000
then type
gc()
I get
           free    total
Ncells    92202   200000
Vcells 12928414 13107200
Thanks
Tony Long
Ecology and Evolutionary Biology
Steinhaus
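Two things can be read off that gc() output: the Vcells total of 13107200 corresponds, at 8 bytes per cell, to exactly 100 MB, so the vsize request was honoured, while the Ncells total of 200000 is nowhere near the 5000000 asked for, so only the nsize setting is being lost. Comparing against the '=' form of the options, as used elsewhere in these threads, would be one way to narrow it down:
R --vsize=100M --nsize=5000000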
2009 May 07
1
increasing memory for R bg job
Hi,
Is the following the command used to increase memory when a background R job is run, or is there another command?
R --min-vsize=vl --max-vsize=vu --min-nsize=nl --max-nsize=nu --max-ppsize=N
source:
http://stat.ethz.ch/R-manual/R-patched/library/base/html/Memory.html
Thx
Carol
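For a background job those options go on the command line that launches R; a sketch using R CMD BATCH (the file names and the particular sizes are only examples):
nohup R CMD BATCH --min-vsize=10M --max-vsize=2000M --min-nsize=500k --max-nsize=50M myscript.R myscript.Rout &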
1999 Jun 15
2
ESS and R
For anybody who uses ESS with R: how do you invoke the vsize and nsize options
when you call R? I can't find any appropriate variables via apropos.
Thanks,
Jord
--
Jordan Howarth CSIRO Mathematical and Information Sciences
mailto:jordan.howarth at cmis.csiro.au
2000 Nov 09
3
maximum of nsize=20000k ??
Dear R-ers,
somehow it is not possible to increase nsize to more than
20000k. When I specify e.g.
> R --vsize=10M --nsize=21000K
the result is:
          free   total (Mb)
Ncells   99658  350000  6.7
Vcells 1219173 1310720 10.0
Maybe I have overlooked something....
Marcus
--
+-------------------------------------------------------
| Marcus Eger
| E-Mail: eger.m at gmx.de (NEW)
|
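For scale, using the 28 bytes per cons cell that the Memory help page quotes for 32-bit builds, the request above would reserve a very large fixed heap before any data are read:
> 21e6 * 28 / 2^20    # ~560 MB of Ncell space alone for --nsize=21000K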
2000 Jan 23
1
size limits
Hi,
I have a few questions about how to handle large data sets in R.
What is the size of the largest matrix that R can comfortably deal with?
Is this size limit imposed by R's software, or is it a question
of the machine that one runs on?
How does one go about choosing reasonable values of vsize
and nsize?
I have a data set with about 1,000,000 rows, and 30
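As a rough sizing rule, numeric data costs 8 bytes per element, so the data set described above needs on the order of a few hundred megabytes before any working copies are made, and vsize has to be chosen well above that:
> 1e6 * 30 * 8 / 2^20    # ~229 MB for a 1,000,000 x 30 numeric matrix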
2004 Mar 08
2
memory problem
I am trying to load 143 Affymetrix chips into R, using R on the NIH
Nimbus server. I can load 10 chips without a problem; however, when I try
to load 143 I receive an error message: cannot create a vector of 523263 KB.
I have expanded the memory of R as follows: R --min-vsize=10M
--max-vsize=2500M --min-nsize=10M --max-nsize=50M (as specified in the help in
R). After running this command the
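One detail worth noting in that command line: on a 32-bit build, --min-nsize=10M by itself reserves a large cons-cell heap up front, which competes for address space with the ~511 MB (523263 KB) vector the full set of chips needs; a smaller min-nsize leaves more room for the expression data:
> 10e6 * 28 / 2^20    # ~267 MB reserved by --min-nsize=10M at 28 bytes per cons cell (32-bit build)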
2015 Jan 15
2
default min-v/nsize parameters
Just wanted to start a discussion on whether R could ship with more
appropriate GC parameters. Right now, loading the recommended package
Matrix leads to:
> library(Matrix)
> gc()
          used (Mb) gc trigger (Mb) max used (Mb)
Ncells 1076796 57.6    1368491 73.1  1198505 64.1
Vcells 1671329 12.8    2685683 20.5  1932418 14.8
Results may vary, but here R needed 64 MB of Ncells and 15 MB
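For anyone wanting larger startup values today, the Memory help page documents environment variables that set the minimum sizes before R starts; a sketch (the particular values are only examples):
export R_NSIZE=2000000     # initial/minimum number of cons cells
export R_VSIZE=64000000    # initial/minimum vector heap, in bytes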
2008 Feb 12
2
Cox model
Hello R-community,
It's been a week now that I have been struggling with the implementation of a Cox
model in R. I have 80 cancer patients, so 80 time measurements and 80
relapse/no-relapse indicators (the censoring variable: 1 if relapsed over the
examined period, 0 if not). My microarray data contain around 18000 genes.
So I have the expressions of 18000 genes in each of the 80 tumors (matrix
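With 18000 genes and only 80 patients, a single Cox model on all genes at once is not estimable without penalisation; the common first pass is one model per gene, which also keeps memory modest. A sketch with the survival package (expr, time and status are made-up names for an 18000 x 80 expression matrix and two length-80 vectors):
> library(survival)
> pvals <- apply(expr, 1, function(g)
+     summary(coxph(Surv(time, status) ~ g))$coefficients[1, "Pr(>|z|)"])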
2009 Jul 01
3
"Error: cannot allocate vector of size 332.3 Mb"
Dear R-helpers,
I am running R version 2.9.1 on a Mac Quad with 32Gb of RAM running
Mac OS X version 10.5.6. With over 20Gb of RAM "free" (according to
the Activity Monitor) the following happens.
> x <- matrix(rep(0, 6600^2), ncol = 6600)
# So far so good. But I need 3 matrices of this size.
> y <- matrix(rep(0, 6600^2), ncol = 6600)
R(3219) malloc: ***
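The 332.3 Mb figure is exactly one 6600 x 6600 double matrix (6600^2 elements at 8 bytes each), and matrix(rep(0, 6600^2), ...) transiently needs two such blocks, the rep() result plus the matrix built from it. A slightly gentler equivalent is:
> x <- matrix(0, nrow = 6600, ncol = 6600)    # the single 0 is recycled; no full-length intermediate vector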
2000 Jul 20
1
bad R bug
Hi,
I am not on this mailing list, but here is a terrible bug that has
stopped me in my tracks. I am unable to remove observations from a data
matrix.
temp is the original matrix. Notice that there are 288 entries with a
104 in the first column. I attempt to remove these entries, but R does
not do it.
brad
ACTUAL COMMANDS:
> dim(temp)
[1] 30528 11
> table(temp[,1])
1 3 4
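The usual idiom for dropping those rows is logical subsetting on the first column; a sketch (if that column is stored as character or factor rather than numeric, the comparison has to match that type):
> temp2 <- temp[temp[, 1] != 104, ]
> dim(temp2)    # should now report 30528 - 288 = 30240 rows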
2000 Oct 02
3
R vs S-PLUS with regard to memory usage
I am trying to translate code from S-PLUS to R, and R really struggles!
After starting R with the following:
R --vsize 50M --nsize 6M --no-restore
on a 400 MHz Pentium with 192 MB of memory running Linux (RH 6.2),
I run a function that essentially picks up an external dataset with 2121 rows
and 30 columns and builds an lm() object and also runs step() ... the step()
takes forever to run... (takes very
2000 Aug 17
2
R on os390
G'day R friends,
I didn't get any replies on the main list so I thought I'd try with the
experts.
I was wondering if anyone's ported R to os390. If so, are the vsize and
nsize limits the same as on other platforms?
I could really annoy those SAS guys then.
thanks,
John Strumila
john.strumila@team.telstra.com
2001 Jul 16
2
Trouble with the memory allocation
Dear R-users,
I am currently facing what appears to be a strange thing (at least to my
humble understanding).
If I understood correctly, starting with version 1.2.3, R memory
allocation is done dynamically,
and there is no need to fiddle with the --nsize and --vsize parameters
any longer...
So far everything seemed to go this way (I saw the size of my
process growing when I was
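Both heaps have indeed grown (and shrunk) on demand since the 1.2 series, but ceilings can still be imposed, and queried from inside R, when that growth needs reining in; a sketch (the particular limits are only examples, and mem.limits() is the query function in R of that vintage):
R --max-vsize=500M --max-nsize=5000000
> mem.limits()    # report the current nsize/vsize ceilings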