Displaying 20 results from an estimated 2000 matches similar to: "No subject"
2000 Feb 09 · 0 replies · No subject
I have the Windows 95 version of R.
How do I increase the vsize and nsize memory settings?
Could you please give me the exact commands to enter?
When I enter the commands from the help manual, they don't work; I believe they are for Unix only.
Any help would be very much appreciated.
Sincerely,
Dr. Athanasios Tom Koutsavlis
Montreal Public Health Department
--------------
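On R releases of that era the Windows GUI took the same command-line flags as Unix, typically added to the Target field of a desktop shortcut. A minimal sketch, with an assumed install path and illustrative values (the defaults were --vsize 6M --nsize 250K):

C:\Program Files\R\rw0652\bin\Rgui.exe --vsize=16M --nsize=1000K

--vsize sets the vector heap (where large data lives) and --nsize the number of cons cells.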
2010 Jan 19 · 2 replies · Server hanging despite efforts to correct memory limits
My group is working with datasets between 100 Mb and 1 GB in size, using
multiple logins. From the documentation, it appears that vsize is limited
to 2^30-1, which proves too restrictive for our use. When we drop
that restriction (set vsize = NA) we end up hanging the server, which
requires a restart. Is there any way to increase the memory limits on R
while keeping our jobs from
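A middle ground, sketched here with illustrative values, is to raise the cap to a large but finite figure instead of unsetting it, so a runaway job dies with an allocation error instead of taking the server down:

R --max-vsize=8G

A shell-level ceiling per login adds a second line of defense:

ulimit -v 8388608    # cap the process address space at 8 GB (value in KB)

Whether --max-vsize accepts values beyond 2^30-1 depends on the R version and the word size of the build, so treat both lines as assumptions to verify locally.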
1999 Apr 27 · 2 replies · Memory management
Dear all,
I don't get it:
First of all, the help doesn't say what R's memory limits are.
What's the maximum heap size, for instance?
Secondly, I invoke R with the following commands each time:
rgui --vsize 30M --nsize 1000K
rgui --vsize 30M --nsize 2000K
rgui --vsize 30M --nsize 3000K
rgui --vsize 30M --nsize 4000K
I try to create an 8000x8000 matrix by issuing
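The arithmetic shows why no nsize setting can help here: an 8000 x 8000 numeric matrix needs 8 bytes per cell, and that storage lands in the vector heap governed by --vsize:

> 8000 * 8000 * 8 / 1024^2
[1] 488.2812

About 488 Mb, against a --vsize of 30M.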
2001 Aug 22 · 1 reply · Huge workspace cannot be opened
Hi everyone,
I have a problem that some people may have already encountered but i did not
find the solution yet.
As I use R to simulate several arrays of data, my workspace is now 35Mb big and
I cannot launch R with it.
An "xdr real data read error occured" and R tells me to delete .RData or
increase memory. I WON'T delete this file and changing the max-nsize to 40600k
did not
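A common workaround (sketched here; flag spellings varied across R versions, so check your ?Memory page) is to start R with larger limits but without restoring the workspace, then load the file by hand once a prompt is available:

R --vsize=100M --max-nsize=41000K --no-restore
> load(".RData")    # restore the saved workspace manually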
2000 Nov 09 · 3 replies · maximum of nsize=20000k ??
Dear R-ers,
somehow it is not possible to increase nsize to more than
20000k. When I specify e.g.
> R --vsize=10M --nsize=21000K
the result is:
          free   total (Mb)
Ncells   99658  350000  6.7
Vcells 1219173 1310720 10.0
Maybe I have overlooked something ...
Marcus
--
+-------------------------------------------------------
| Marcus Eger
| E-Mail: eger.m at gmx.de (NEW)
|
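The gc() output shows the request being silently clamped: Ncells total stays at 350000 despite --nsize=21000K. A check from within the session, assuming the mem.limits() interface that 1.x-era R provided:

> gc()            # the 'total' column shows what was actually granted
> mem.limits()    # reports the nsize/vsize ceilings, if any

If the ceiling is compiled in, the practical recourse is a newer R, where the fixed cons-cell limit gave way to a dynamically grown heap.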
1999 May 15 · 2 replies · vsize and nsize
I am running R version ??? under RedHat 5.2. It seems as though the
--nsize option has no effect on the size of the allocated Ncells as
determined using gc(). Yes, I have that much data ...
That is, if I invoke R with
R --vsize 100 --nsize 5000000
then type
gc()
I get
           free    total
Ncells    92202   200000
Vcells 12928414 13107200
Thanks
Tony Long
Ecology and Evolutionary Biology
Steinhaus
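One thing to rule out is unit handling: some builds read a bare --vsize number as bytes and a bare --nsize as cells, so explicit suffixes remove the ambiguity. A sketch to try, then confirm from the session:

R --vsize=100M --nsize=5000k
> gc()    # the Ncells 'total' column should now reflect the request

If Ncells still reports 200000, the request was clamped or parsed differently on that build.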
2005 Dec 20 · 1 reply · Problems in batch mode
Dear R-users,
I am trying to run some simulations in batch mode. In an older version
of the program, I used
rterm --vsize=100M --nsize=5000K --restore --save < input_file > output_file
however, in the new version, R 2.2.0, the vsize and nsize parameters are
ignored.
I can use the memory.limit command to increase memory, but I am not sure
whether it corresponds to vsize and nsize.
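That is expected: from R 1.2 onward the heaps grow dynamically, and on Windows the remaining knob is the overall memory cap. A sketch of the equivalent R 2.2.0 batch call (file names illustrative):

Rterm.exe --max-mem-size=1024M --restore --save < input.R > output.txt

memory.limit(size = 1024) does the same from inside a session. Neither maps one-to-one onto vsize/nsize, which governed two separate heaps.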
2009 May 07 · 1 reply · increasing memory for R bg job
Hi,
When an R job is run in the background, is the following the command to increase memory, or is there another?
R --min-vsize=vl --max-vsize=vu --min-nsize=nl --max-nsize=nu --max-ppsize=N
source:
http://stat.ethz.ch/R-manual/R-patched/library/base/html/Memory.html
Thx
Carol
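Yes, though vl, vu, nl, nu and N in that line are placeholders for actual values. A sketch of a background run with illustrative figures (R CMD BATCH passes the flags through to R):

nohup R CMD BATCH --max-vsize=2G --max-ppsize=100000 job.R job.out &

The --min-* flags set starting sizes and can usually be left at their defaults; the --max-* ceilings are what matter.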
2009 Nov 30 · 1 reply · allocating vector memory > 1 GByte on Windows XP / Vista / 7
Let me begin by stating that I read all the help files and FAQs on the
subject matter (there aren't more than about a dozen) but either did not
find solutions or found that they did not work.
Here is the issue. I am trying to run a spatial regression on a
medium-sized dataset. Some of the functions in the spdep package I use
require me to allocate a vector of 1.1 Gb (mine is not a spatial SIG
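For context: on 32-bit Windows each process gets 2 GB of user address space by default (about 3 GB with the /3GB boot switch), however much RAM is installed, and a 1.1 Gb vector must fit in it contiguously. The in-session checks, using the Windows-only memory.limit():

> memory.limit()             # current cap in Mb
> memory.limit(size = 3000)  # raise it toward the 32-bit ceiling

A 64-bit build of R removes the address-space ceiling entirely.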
1999 Dec 17 · 1 reply · R CMD check --help
This example from the INSTALL help seems to be broken in R 0.90.1 (on Solaris):
gilp/dse : R CMD check --help
Usage: R CMD check [options] [-l lib] pkg_1 ... pkg_n
I'm trying to figure out how to request more nsize and vsize when using R CMD
check.
Paul Gilbert
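One route worth testing (the R_NSIZE and R_VSIZE environment variables are documented in ?Memory for R of roughly this vintage, so treat this as an assumption for 0.90.1) is to set the startup defaults in the environment, which any R processes launched by R CMD check inherit:

export R_NSIZE=1000000   # cons cells for each child R
export R_VSIZE=50M       # vector heap likewise
R CMD check dse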
2001 Mar 01 · 3 replies · How do you expand memory capability (Was: R crashes in Windows ME)
Hello-
Since my data bank in SPSS has > 40 variables, I think that R crashes because of the memory limit.
In Maindonald's UsingR text, on p. 3, there's a footnote that reads:
"If you want larger memory space than the default you may want a target akin to
<path to binary>\rw091\bin\rgui.exe --vsize 30M --nsize 1000K
[The default is --vsize 6M --nsize 250K
2004 Mar 08 · 2 replies · memory problem
I am trying to load 143 Affymetrix chips into R on the NIH Nimbus
server. I can load 10 chips without a problem; however, when I try
to load all 143 I receive an error message: cannot create a vector of 523263 KB.
I have expanded the memory of R as follows: R --min-vsize=10M
--max-vsize=2500M --min-nsize=10M --max-nsize=50M (as specified in R's
help). After running this command the
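Aside from limits, the job's memory profile can be cut at the source: the affy package's justRMA() computes expression measures without first holding all 143 chips in memory as an AffyBatch. A sketch, assuming the Bioconductor affy package and CEL files in the working directory:

library(affy)
eset <- justRMA()    # reads the CEL files and returns an ExpressionSet directly

For scale, the failed allocation of 523263 KB is only about 511 Mb, well under --max-vsize=2500M, so an OS-level per-process cap may also be in play.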
2000 Jan 23 · 1 reply · size limits
Hi,
I have a few questions about how to handle large data sets in R.
What is the size of the largest matrix that R can comfortably handle?
Is this limit imposed by R's software, or is it a question
of the machine one runs on?
How does one go about choosing reasonable values of vsize
and nsize?
I have a data set with about 1,000,000 rows and 30
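For the vsize half, the arithmetic is direct: numeric data costs 8 bytes per cell, so the dataset described needs roughly

> 1e6 * 30 * 8 / 1024^2
[1] 228.8818

about 230 Mb, plus headroom for the copies R makes during computation (3-4x the data size is a common rule of thumb). nsize matters far less here: one big matrix is a single object, not millions of cons cells.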
2005 Jun 29 · 3 replies · Memory Management under Linux: Problems to allocate large amounts of data
Dear Group
I'm still trying to bring a lot of data into R (see older postings). After solving some trouble with the database, I now do most of the work in MySQL, but it would still be nice to work on some of the data in R. For this I can use a dedicated server with Gentoo Linux as its OS, hosting only R. The server is a nice machine with two CPUs and 4 GB RAM, which should do the job:
Dual Intel XEON 3.06 GHz
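Since the data already live in MySQL, one memory-friendly pattern is to stream them into R in chunks instead of pulling whole tables. A sketch using the RMySQL interface (connection details and table name are placeholders):

library(RMySQL)
con <- dbConnect(MySQL(), dbname = "mydb", user = "me")
res <- dbSendQuery(con, "SELECT * FROM big_table")
while (!dbHasCompleted(res)) {
    chunk <- fetch(res, n = 100000)   # 100,000 rows at a time
    ## aggregate or summarise the chunk here, keeping only results
}
dbClearResult(res)
dbDisconnect(con)

Note that a 32-bit build caps each R process at 2-3 GB of address space no matter how much of the 4 GB RAM is free.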
1999 Oct 06 · 2 replies · R --nsize 2M runs havoc (under linux)
Dear All,
I am running R version 0.65.0 under
a) Suse-Linux 6.1, and Suse-Linux 6.2, compiler gcc-2.95, CPUs pentium pro
200, 128MB, and pentium II 450, 128MB
b) Solaris 5.7, compiler gcc-2.95, cpu SUN sparc, 4000MB
When I set --nsize to more than 1M, R's internal storage management goes
haywire. gc() reports the requested sizes, but the overall process size is
much too big: Running R with
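A rough size check makes the discrepancy concrete. Assuming on the order of 20 bytes per cons cell on a 32-bit build of that era (the exact figure varied by version and platform):

> 2e6 * 20 / 1024^2
[1] 38.14697

so the requested pool alone should cost only about 40 Mb; a process many times that size points to the heap being mis-sized or duplicated rather than merely filled.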
2000 Apr 10 · 2 replies · Increasing memory size in ESS
I am having a problem using ESS with R. In particular, I have large
data objects which exceed the 6 Mb default heap memory. Outside of
ESS I can run R with large values of --vsize and --nsize,
but I can't figure out how to do this in ESS.
Any help would be much appreciated.
--
****************************************************
** Angelo J. Canty **
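ESS passes extra command-line arguments to the inferior R process through an Emacs variable, so the flags can go in ~/.emacs. A sketch, assuming the inferior-R-args variable provided by ESS of that era:

(setq inferior-R-args "--vsize=100M --nsize=2000K")  ; handed to R at M-x R

After re-evaluating this (or restarting Emacs), gc() in the new R session should confirm the larger heaps.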
2000 Aug 17 · 2 replies · R on os390
G'day R friends,
I didn't get any replies on the main list, so I thought I'd try the
experts.
I was wondering if anyone has ported R to OS/390. If so, are the vsize and
nsize limits the same as on other platforms?
I could really annoy those SAS guys then.
thanks,
John Strumila
john.strumila@team.telstra.com
1999 Apr 12 · 3 replies · --nsize and --vsize
Martin M has suggested I widen this discussion to R-devel, and
> I agree that we should increase them,
> but I'm not sure at all about the amount.
>
> The default could even depend on the architecture (via "./configure")..
Views, please.
------------- Begin Forwarded Message -------------
Is it not time we increased the defaults a bit? As the base gets bigger
I hit
1999 Nov 12 · 1 reply · R-0.65.1 Startup
Dear R users,
I have noticed that my R startup is extremely slow. It takes almost 3
minutes from double-click to R prompt. I had been running R-0.64.1 until
recently, and it took about 30 sec. I still have access to R-0.64.1; when I
started it up, it took about 25 sec. Can anyone tell me whether this is a bug
in R or a problem with my machine?
Note: This is after bootup, with R being the
2009 Jul 01 · 3 replies · "Error: cannot allocate vector of size 332.3 Mb"
Dear R-helpers,
I am running R version 2.9.1 on a Mac Quad with 32 Gb of RAM running
Mac OS X version 10.5.6. With over 20 Gb of RAM "free" (according to
Activity Monitor), the following happens.
> x <- matrix(rep(0, 6600^2), ncol = 6600)
# So far so good. But I need 3 matrices of this size.
> y <- matrix(rep(0, 6600^2), ncol = 6600)
R(3219) malloc: ***
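The figure in the error message is exactly one more copy of the matrix:

> 6600^2 * 8 / 1024^2
[1] 332.3364

With over 20 Gb reported free, the failure suggests a 32-bit session or a per-process cap rather than a shortage of physical RAM. Note too that matrix(0, 6600, 6600) fills the matrix directly, whereas rep(0, 6600^2) first builds a full-size vector that matrix() then copies.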