Displaying 20 results from an estimated 3000 matches similar to: "R-0.65.1 Startup"
1999 May 15
2
vsize and nsize
I am running R version ??? under Red Hat 5.2. It seems as though the
--nsize option has no effect on the size of the allocated Ncells as
determined using gc(). Yes, I have that much data....
That is, if I invoke R with
R --vsize 100 --nsize 5000000
then type
gc()
I get
           free    total
Ncells    92202   200000
Vcells 12928414 13107200
Thanks
Tony Long
Ecology and Evolutionary Biology
Steinhaus
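(A hedged note for anyone hitting the same thing: startup options of that
vintage also had suffixed, equals-sign spellings, and ?Memory documented
R_NSIZE/R_VSIZE environment variables as another way in; worth trying if
the space-separated form is being ignored. Values below mirror the report
above.)

    R --vsize=100M --nsize=5000k
    # or, assuming your build honours the documented environment variables:
    R_NSIZE=5000000 R_VSIZE=100M R

(Either way, gc()'s "total" column is the check: it reports the Ncells and
Vcells actually allocated.)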
1999 Jul 23
2
rw0642
Among other computers, I am using rw0642 on an IBM 300GL with 32MB RAM and
Windows98.
1. If, after opening rw0642, the first command is a help request such as
"?par", then when the help window is closed by clicking the X in its upper
right corner, the following message is shown:
This program has performed an illegal operation and will be shut down.
If the problem persists contact the
2010 Jan 19
2
Server hanging despite efforts to correct memory limits
My group is working with datasets between 100 MB and 1 GB in size, using
multiple logins. From the documentation, it appears that vsize is limited
to 2^30-1, which tends to prove too restrictive for our use. When we drop
that restriction (set vsize = NA) we end up hanging the server, which
requires a restart. Is there any way to increase the memory limits on R
while keeping our jobs from
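(A hedged sketch of a middle ground, since vsize = NA removes the safety
net entirely: give each session a finite ceiling again, and optionally cap
the process at the OS level so a runaway job dies with an allocation error
instead of dragging the whole server into swap. The numbers are
illustrative, not recommendations.)

    ulimit -v 4194304        # bash: ~4 GB of address space for this shell
    R --max-vsize=4000M      # a finite R-level ceiling instead of NA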
2000 Nov 09
3
maximum of nsize=20000k ??
Dear R-ers,
somehow it is not possible to increase nsize to more than
20000k. When I specify e.g.
> R --vsize=10M --nsize=21000K
the result is:
          free   total (Mb)
Ncells   99658  350000  6.7
Vcells 1219173 1310720 10.0
Maybe I have overlooked something....
Marcus
--
+-------------------------------------------------------
| Marcus Eger
| E-Mail: eger.m at gmx.de (NEW)
|
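(Some hedged arithmetic on why a ceiling of this size is not surprising:
cons cells were about 20 bytes apiece at the time, which the (Mb) column
above confirms, so the refused request is a sizeable chunk of memory on
its own.)

    350000   * 20 / 2^20   # ~6.7 MiB, matching gc()'s (Mb) column
    21000000 * 20 / 2^20   # ~400 MiB just for the requested cons-cell heap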
1999 Apr 27
2
Memory management
Dear all,
I don't get it:
First of all, the help doesn't say what the memory limits of R are. Say,
what's the max heap size, for instance?
Secondly, I invoke R with the following commands each time:
rgui --vsize 30M --nsize 1000K
rgui --vsize 30M --nsize 2000K
rgui --vsize 30M --nsize 3000K
rgui --vsize 30M --nsize 4000K
I try to open a matrix 8000x8000 by issuing
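(Hedged arithmetic on why none of those invocations can work: numeric
matrices store doubles at 8 bytes each, so a single 8000x8000 matrix needs
roughly sixteen times the 30M vsize on offer, before any copies are made.)

    8000 * 8000 * 8 / 2^20   # ~488 MiB for the data alone
    # so think --vsize 600M or more, leaving headroom for copies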
2000 Oct 02
3
R vs S-PLUS with regard to memory usage
I am trying to translate code from S-PLUS to R and R really struggles!
After starting R with the following:
R --vsize 50M --nsize 6M --no-restore
on a 400 MHz Pentium with 192 MB of memory running Linux (RH 6.2),
I run a function that essentially picks up an external dataset with 2121
rows and 30 columns, builds an lm() object, and also runs step() ... the
step() takes forever to run... (takes very
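(For anyone who wants to poke at this, a minimal synthetic stand-in for
the workload described; the data are random, and only the 2121 x 30 shape
matches the report. step() refits a candidate model for every term it
considers at every step, which is where the time goes.)

    set.seed(1)
    d <- as.data.frame(matrix(rnorm(2121 * 30), nrow = 2121))
    names(d) <- c("y", paste0("x", 1:29))
    fit <- lm(y ~ ., data = d)
    sel <- step(fit, trace = 0)   # quiet stepwise search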
1999 Apr 12
3
--nsize and --vsize
Martin M has suggested I widen this discussion to R-devel, and
> I agree that we should increase them,
> but I'm not sure at all about the amount.
>
> The default could even depend on the architecture (via "./configure")..
Views, please.
------------- Begin Forwarded Message -------------
Is it not time we increased the defaults a bit? As the base gets bigger
I hit
2009 May 07
1
increasing memory for R bg job
Hi,
Is the following the command used to increase memory when an R job is run in the background, or is there some other command?
R --min-vsize=vl --max-vsize=vu --min-nsize=nl --max-nsize=nu --max-ppsize=N
source:
http://stat.ethz.ch/R-manual/R-patched/library/base/html/Memory.html
Thx
Carol
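(Hedged pointer: those are the documented knobs, and R CMD BATCH passes
such options through to the R process it starts. Concrete values in place
of the placeholders vl/vu/nl/nu/N, illustrative only:)

    R CMD BATCH --min-vsize=10M --max-vsize=2000M \
        --min-nsize=500k --max-nsize=50M --max-ppsize=100000 \
        myjob.R myjob.Rout &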
2000 Aug 17
2
R on os390
G'day R friends,
I didn't get any replies on the main list so I thought I'd try with the
experts.
I was wondering if anyone has ported R to OS/390. If so, are the vsize and
nsize limits the same as on other platforms?
I could really annoy those SAS guys then.
thanks,
John Strumila
john.strumila@team.telstra.com
2001 Aug 22
1
Huge workspace cannot be opened
Hi everyone,
I have a problem that some people may have already encountered, but I have
not found the solution yet.
As I use R to simulate several arrays of data, my workspace is now 35 MB
and I cannot launch R with it.
An "xdr real data read error" occurred, and R tells me to delete .RData or
increase memory. I WON'T delete this file, and changing max-nsize to 40600k
did not
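(A hedged escape hatch that has worked for others: start R without
restoring the workspace, raise the ceilings, then pull the file in by
hand. Sizes are illustrative.)

    R --no-restore-data --max-vsize=1000M --max-nsize=50M
    # then, at the R prompt:
    load(".RData")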
2009 Nov 30
1
allocating vector memory > 1 GByte on Windows XP / Vista / 7
Let me begin by stating that I read all the help files and FAQs on the
subject matter (there aren't more than about a dozen) but either did not
find solutions or found that they did not work.
Here is the issue. I am trying to run a spatial regression on a
medium-sized dataset. Some of the functions in the spdep package I use
require me to allocate a vector of 1.1 GB (mine is not a spatial SIG
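(For the record, and hedged: on a 32-bit build of R for Windows the hard
ceiling is the process address space, not the installed RAM, and these
were the knobs for probing it. Numbers are illustrative.)

    memory.limit()              # current cap, in MB
    memory.limit(size = 3000)   # try to raise it toward the 32-bit ceiling
    memory.size(max = TRUE)     # high-water mark of memory R has used, MB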
1999 Jun 15
2
ESS and R
For anybody who uses ESS with R: how do you invoke the vsize and nsize
options when you call R? I can't find any appropriate variables via
apropos.
Thanks,
Jord
--
Jordan Howarth CSIRO Mathematical and Information Sciences
mailto:jordan.howarth at cmis.csiro.au
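(Hedged pointer: ESS keeps the startup flags in the Emacs variable
inferior-R-args, so setting it to something like "--vsize 50M --nsize
1000K" before starting R from Emacs should pass them through; C-h v
inferior-R-args will confirm whether your ESS has it.)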
2001 Mar 12
4
1.2.2 under M$ windows 2000 lots of plots out of memory?
hi-
If I source the following
for (k in 1:20) {
  x <- runif(20000, min = -500, max = 2000)
  y <- runif(20000, min = -500, max = 2500)
  z <- runif(20000, min = -10, max = 10)
  cat(k, "file", memory.size(), "\n")    # track memory use per iteration
  cc <- rainbow(11)
  plot(x, y, asp = 1, xlim = c(-500, 2000), ylim = c(-500, 2500),
       main = k, cex = 1.0)
  for (i in seq(-10, 10, 2)) {           # overlay 11 colour-coded layers
    points(x[z > i], y[z > i], col = cc[(12 + i)/2], cex = 1.0)
  }
  rm(x, y, z)
}
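(One hedged tweak if memory.size() keeps climbing across iterations:
follow the rm() with an explicit collection, which both frees the space
promptly and reports whether it actually came back.)

    # inside the loop, in place of the bare rm(x, y, z):
    rm(x, y, z)
    gc()   # force a collection; watch Ncells/Vcells across iterations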
2005 Dec 20
1
Problems in batch mode
Dear R-users,
I am trying to run some simulations in batch mode. In an older version
of the program, I used
rterm --vsize=100M --nsize=5000K --restore --save < inputfile > outputfile,
however, in the new version, R 2.2.0, the parameters vsize and nsize are
ignored.
I can use the command memory.limit to increase memory, but I am not sure if
this corresponds to vsize and nsize.
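(Hedged recollection: the fixed --vsize/--nsize pair was retired once the
R heap became growable, around 1.2.0; the spellings 2.2.0 still honours
are the min/max forms, with the minimums playing the old preallocation
role. On Windows, memory.limit(size = ...) caps the total allocation
rather than vsize and nsize separately.)

    rterm --min-vsize=100M --min-nsize=5000K --restore --save < infile > outfile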
2005 Jun 29
3
Memory Management under Linux: Problems to allocate large amounts of data
Dear Group
I'm still trying to bring a lot of data into R (see older postings). After solving some troubles with the database, I do most of the work in MySQL. But it would still be nice to work on some of the data using R. For that I can use a dedicated server with Gentoo Linux as the OS, hosting only R. This server is a nice machine with two CPUs and 4 GB RAM, which should do the job:
Dual Intel XEON 3.06 GHz
2002 Jun 25
1
commandArgs: feature request
Dear R-core Team,
As Thomas Lumley pointed out in one of his e-mails, one can use commandArgs()
to get a copy of the command-line arguments supplied when the R session was
invoked, and then use grep to extract the parameters of interest.
His solution works very well if the custom options are passed by name, e.g.
--my-option=value, but what if one wants to pass parameters by their
positions? Then it's
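(A hedged sketch of the positional variant of that grep trick: keep
everything after the --args marker that does not look like a named option.
The file names are hypothetical; later R also grew
commandArgs(trailingOnly = TRUE), which does the first step directly.)

    ## invoked as, say:  R --no-save --args input.csv output.csv < script.R
    args <- commandArgs()
    i <- match("--args", args)
    positional <- if (is.na(i)) character(0) else args[-seq_len(i)]
    positional <- positional[!grepl("^--", positional)]
    print(positional)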
2001 Jan 14
2
Help
Dear sir,
I am using R on Windows. I want to extend the R memory size.
I use the following command, but unfortunately it doesn't work.
-- vsize=15M --nsize=1000K
Your help is appreciated.
Thanks,
Esmail Amiri.
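(Two hedged things to check: the options must go on the command line that
starts R, not be typed at the R prompt, and there must be no space after
the dashes. Assuming a standard Windows install, something like

    rgui --vsize=15M --nsize=1000K

run from the bin directory, or appended to the Target field of the Rgui
shortcut, should take effect; gc() will confirm.)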
2004 Mar 08
2
memory problem
I am trying to load 143 Affymetrix chips into R on the NIH Nimbus server.
I can load 10 chips without a problem; however, when I try to load 143 I
receive an error message: cannot create a vector of 523263 KB.
I have expanded the memory of R as follows: R --min-vsize=10M
--max-vsize=2500M --min-nsize=10M --max-nsize=50M (as specified in the R
help). After running this command the
2001 Mar 01
3
How do you expand memory capability (Was: R crashes in Windows ME)
Hello-
Since my data bank in SPSS has > 40 variables, I think that R crashes because of the memory limit.
In Maindonald's UsingR text, on pg 3, there's a footnote that reads:
"If you want larger memory space than the default you may want a target akin to
<path to binary>\rw091\bin\rgui.exe --vsize 30M --nsize 1000K
[The default is --vsize 6M --nsize 250K
2009 Jul 01
3
"Error: cannot allocate vector of size 332.3 Mb"
Dear R-helpers,
I am running R version 2.9.1 on a Mac Quad with 32 GB of RAM running
Mac OS X version 10.5.6. With over 20 GB of RAM "free" (according to
the Activity Monitor) the following happens.
> x <- matrix(rep(0, 6600^2), ncol = 6600)
# So far so good. But I need 3 matrices of this size.
> y <- matrix(rep(0, 6600^2), ncol = 6600)
R(3219) malloc: ***
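(The number in the error is exactly one such matrix, which with 20 GB free
points, hedged, at a 32-bit R session or a fragmented address space rather
than a lack of RAM. Note too that rep() builds a full-size intermediate
vector that matrix() then copies, so each line above briefly needs two of
them.)

    6600^2 * 8 / 2^20            # 43,560,000 doubles * 8 bytes = 332.3 MiB
    y <- matrix(0, 6600, 6600)   # one allocation, no rep() intermediate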