Displaying 20 results from an estimated 190 matches for "nsizes".
2000 Nov 09
3
maximum of nsize=20000k ??
Dear R-ers,
Somehow it is not possible to increase nsize to more than
20000k. When I specify, e.g.,
> R --vsize=10M --nsize=21000K
the result is:
           free   total (Mb)
Ncells    99658  350000  6.7
Vcells  1219173 1310720 10.0
Maybe I have overlooked something...
Marcus
--
+-------------------------------------------------------
| Marcus Eger
| E-Mail: eger.m at gmx.de (NEW)
|
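A quick way to see what the allocator actually granted is the matrix that gc() returns; this is a minimal sketch, assuming a current R (the Ncells/Vcells row names are stable, though the column names vary across versions):

    g <- gc()        # matrix of cell usage, like the table above
    g["Ncells", ]    # cons cells: the pool --nsize requests
    g["Vcells", ]    # vector heap: the pool --vsize requests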
1999 Nov 12
1
R-0.65.1 Startup
Dear R users,
I have noticed that my R startup is extremely slow: it takes almost 3
minutes from double-click to the R prompt. I was running R-0.64.1 until
recently, and it took about 30 seconds. I still have access to R-0.64.1; when
I started it up, it took about 25 seconds. Can anyone tell me whether this is
a bug in R or a problem with my machine?
Note: This is after bootup with R being the
2007 Oct 28
1
tree problem
I am trying to use tree to partition a data set. The data set has 3924
observations. Partitioning seems to work for small subsets of the data,
but when I use the entire data set, no partitioning occurs. The
variables are:
RESP      respondent to a survey (0 = not a respondent, 1 = respondent)
AGE_P     age (continuous)
ORIGIN_I  Hispanic ethnicity (1 = Hispanic, 2 = non-Hispanic)
RACRECI2  race
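For reference, a hedged sketch of the kind of call involved, assuming the tree package and a hypothetical data frame dat holding the variables above. With a large n, the default stopping rules in tree.control() (mincut, minsize, mindev) can prevent any split; loosening mindev is a common fix:

    library(tree)
    fit <- tree(factor(RESP) ~ AGE_P + ORIGIN_I + RACRECI2, data = dat)
    # if no split occurs, relax the deviance threshold:
    fit2 <- tree(factor(RESP) ~ AGE_P + ORIGIN_I + RACRECI2, data = dat,
                 control = tree.control(nobs = nrow(dat), mindev = 1e-3))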
2017 Nov 22
2
function pointers?
We have a project that calls for the creation of a list of many
distribution objects. Distributions can be of various types, with
various parameters, but we ran into some problems. I started testing
on a simple list of rnorm-based objects.
I was a little surprised at the RAM storage requirements; here's an example:
N <- 10000
closureList <- vector("list", N)
nsize = sample(x
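The posted code breaks off here; what follows is a hedged reconstruction of the kind of test described, with make_gen as a hypothetical helper. Each closure carries its own environment, which is where the surprising per-object cost lives:

    N <- 10000
    make_gen <- function(m) function(n) rnorm(n, mean = m)
    closureList <- lapply(rnorm(N), make_gen)
    # object.size() may undercount captured environments, but gives a floor:
    print(object.size(closureList), units = "Mb")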
1999 May 15
2
vsize and nsize
I am running R version ??? under Red Hat 5.2. It seems as though the
--nsize option has no effect on the size of the allocated Ncells as
determined using gc(). Yes, I have that much data....
That is, if I invoke R with
R --vsize 100 --nsize 5000000
then type
gc()
I get
            free     total
Ncells     92202    200000
Vcells  12928414  13107200
Thanks
Tony Long
Ecology and Evolutionary Biology
Steinhaus
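In a current R the equivalent knobs are spelled --min-nsize/--min-vsize (with --max-nsize/--max-vsize as ceilings), and sizes accept K/M/G suffixes; a hedged one-liner to confirm the request took effect:

    R --min-nsize=5000000 --min-vsize=100M -e 'gc()'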
2005 Jul 07
2
r: LOOPING
Hi all,
I know that one should try to limit the amount of looping in R
programs. I have supplied some code below. I am interested in seeing how
the code could be rewritten if we don't use the loops.
A brief overview of what is done in the code:
==============================================
1. the input
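The posted code is cut off in this excerpt, so here is a generic illustration of the usual rewrite, on hypothetical data: replace an element-wise loop with the equivalent vectorised expression and check that they agree:

    x <- rnorm(1e6)
    out <- numeric(length(x))
    for (i in seq_along(x)) out[i] <- x[i]^2 + 1   # loop version
    out2 <- x^2 + 1                                # vectorised equivalent
    stopifnot(all.equal(out, out2))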
1999 Apr 27
2
Memory management
Dear all,
I don't get it.
First of all, the help doesn't say what the memory limits of R are.
Say, what's the maximum heap size, for instance?
Secondly, I invoke R with the following commands each time:
rgui --vsize 30M --nsize 1000K
rgui --vsize 30M --nsize 2000K
rgui --vsize 30M --nsize 3000K
rgui --vsize 30M --nsize 4000K
I try to open a matrix 8000x8000 by issuing
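The excerpt breaks off before the matrix call, but a back-of-the-envelope check, assuming numeric (8-byte) cells, already shows why these settings cannot work: one such matrix alone wants roughly half a gigabyte of vector heap, an order of magnitude more than --vsize 30M provides.

    8000 * 8000 * 8 / 2^20   # ~488 Mb for a single 8000 x 8000 numeric matrix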
1999 Oct 06
2
R --nsize 2M runs havoc (under linux)
Dear All,
I am running R version 0.65.0 under
a) SuSE Linux 6.1 and SuSE Linux 6.2, compiler gcc-2.95, CPUs Pentium Pro
200, 128 MB, and Pentium II 450, 128 MB
b) Solaris 5.7, compiler gcc-2.95, CPU Sun SPARC, 4000 MB
When I set --nsize to more than 1M, R's internal storage management runs
havoc. gc() indicates the requested sizes, but the overall process size is
much too big: Running R with
2010 Jan 19
2
Server hanging despite efforts to correct memory limits
My group is working with datasets between 100 MB and 1 GB in size, using
multiple logins. From the documentation, it appears that vsize is limited
to 2^30-1, which tends to prove too restrictive for our use. When we drop
that restriction (set vsize = NA) we end up hanging the server, which
requires a restart. Is there any way to increase the memory limits on R
while keeping our jobs from
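One hedged workaround for this situation: instead of removing the limit entirely (vsize = NA), give the heap a finite ceiling so a runaway job fails with an allocation error rather than hanging the machine, e.g. at startup:

    R --max-vsize=8G   # the exact cap is an assumption; size it to the server

If the documented 2^30-1 figure counts 8-byte Vcells, it corresponds to roughly 8 GB, which suggests where such a ceiling might start.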
1999 Apr 12
3
--nsize and --vsize
Martin M has suggested I widen this discussion to R-devel, and
> I agree that we should increase them,
> but I'm not sure at all about the amount.
>
> The default could even depend on the architecture (via "./configure")..
Views, please.
------------- Begin Forwarded Message -------------
Is it not time we increased the defaults a bit? As the base gets bigger
I hit
2001 Aug 22
1
Huge workspace cannot be opened
Hi everyone,
I have a problem that some people may already have encountered, but I have
not found the solution yet.
As I use R to simulate several arrays of data, my workspace is now 35 Mb and
I cannot launch R with it.
An "xdr real data read error" occurred, and R tells me to delete .RData or
increase memory. I WON'T delete this file, and changing the max-nsize to 40600k
did not
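One hedged workaround from that era: skip the automatic restore, start with larger limits (the values below are illustrative), and load the image by hand:

    R --vsize=80M --nsize=2000K --no-restore
    # then, at the R prompt:
    load(".RData")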
2013 Aug 21
2
[PATCH 1/3] Rationalise whitespace to 4 space indentation with no trailing spaces
RHSrvAny.c was using a mixture of 4-space indentation and tabs with a width of
4. This commit rationalises the whitespace to use only 4-space indentation and
removes trailing whitespace.
---
RHSrvAny/RHSrvAny.c | 537 ++++++++++++++++++++++++++--------------------------
RHSrvAny/RHSrvAny.h | 1 -
RHSrvAny/resource.h | 2 +-
3 files changed, 269 insertions(+), 271 deletions(-)
diff --git
2004 Jul 20
1
--max-vsize and --max-nsize linux?
Hi,
Sometimes I have trivial recodings like this:
> dim(tt)
[1] 252382     98
system.time(for (i in 2:length(tt)) {
    tt[, i][is.na(tt[, i])] <- 0
})
...and a Win2000 machine (XP2000+, 1 GB) does it in several minutes, but
my Linux notebook (XP 2.6 GHz, 512 MB) does not succeed even after some hours.
I notice that the CPU load is relatively small most of the time, but the hard disk
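A vectorised sketch of the same recoding (columns 2 onwards), avoiding the per-column loop and its repeated copying:

    nums <- tt[-1]            # every column except the first
    nums[is.na(nums)] <- 0    # matrix-style NA replacement on a data frame
    tt[-1] <- nums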
2015 Jul 23
2
S3 method package
...whose main argument is "res", which is basically a list with a
single component. But if the second argument, called "oneSize", is
FALSE, "res" is a list of lists.
What I have written so far is the following:
anthr <- function(res, oneSize, nsizes){
    UseMethod("anthr")
}
anthr.tri <- function(res, oneSize, nsizes){
    if (oneSize) {
        cases <- res$meds
    } else {
        cases <- list()
        for (i in 1:(nsizes - 1)) {
            cases[[i]] <- res[[i]]$meds
        }
    }
    return(cases)
}
The problem when...
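A hypothetical call for illustration, assuming res carries class "tri" so that dispatch reaches anthr.tri:

    res <- structure(list(meds = 1:3), class = "tri")
    anthr(res, oneSize = TRUE, nsizes = 1)   # returns res$meds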
2015 Jan 17
0
default min-v/nsize parameters
Martin Morgan discussed this a year or so ago and as I recall bumped
up these values to the current defaults. I don't recall details about
why we didn't go higher -- maybe Martin does. I suspect the main
concern would be with small-memory machines in student labs and in less
developed countries. If there were a way on all platforms to identify
how much memory is available, that might help to
2000 Oct 02
3
R vs S-PLUS with regard to memory usage
I am trying to translate code from S-PLUS to R, and R really struggles!
After starting R with the following
R --vsize 50M --nsize 6M --no-restore
on a 400 MHz Pentium with 192 MB of memory running Linux (RH 6.2),
I run a function that essentially picks up an external dataset with 2121 rows
and 30 columns, builds an lm() object, and also runs step() ... the step()
takes forever to run... (takes very
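A minimal sketch of the workflow described, with the file name and response variable as hypothetical placeholders:

    dat <- read.table("dataset.dat", header = TRUE)  # 2121 rows, 30 columns
    fit <- lm(y ~ ., data = dat)
    fit2 <- step(fit)   # the stepwise search is the slow part reported here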
2013 Aug 29
5
[PATCH 1/6] Rationalise whitespace to 4 space indentation with no trailing spaces
RHSrvAny.c was using a mixture of 4-space indentation and tabs with a width of
4. This commit rationalises the whitespace to use only 4-space indentation and
removes trailing whitespace.
---
RHSrvAny/RHSrvAny.c | 537 ++++++++++++++++++++++++++--------------------------
RHSrvAny/RHSrvAny.h | 1 -
RHSrvAny/resource.h | 2 +-
3 files changed, 269 insertions(+), 271 deletions(-)
diff --git
2000 Aug 17
2
R on os390
G'day R friends,
I didn't get any replies on the main list, so I thought I'd try the
experts.
I was wondering if anyone has ported R to OS/390. If so, are the vsize and
nsize limits the same as on other platforms?
I could really annoy those SAS guys then.
thanks,
John Strumila
john.strumila@team.telstra.com
2001 Mar 01
3
How do you expand memory capability (Was: R crashes in Windows ME)
Hello-
Since my data bank in SPSS has > 40 variables, I think that R crashes because of the memory limit.
In Maindonald's Using R text, on p. 3, there's a footnote that reads:
"If you want larger memory space than the default you may want a target akin to
<path to binary>\rw091\bin\rgui.exe --vsize 30M --nsize 1000K
[The default is --vsize 6M --nsize 250K
2005 Dec 20
1
Problems in batch mode
Dear R-users,
I am trying to run some simulations in batch mode. In an older version
of the program I used
rterm --vsize=100M --nsize=5000K --restore --save < inputfile > outputfile
However, in the new version, R 2.2.0, the parameters vsize and nsize are
ignored.
I can use the command memory.limit to increase memory, but I am not sure if
this corresponds to vsize and nsize.
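For what it is worth, a hedged modern equivalent: since R 1.2.0 the heaps are grown dynamically, which is why the old flags are ignored, and batch jobs are usually run as

    R CMD BATCH --no-save input.R output.Rout

with memory.limit() (Windows only) governing one overall cap rather than separate vsize/nsize pools.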