Displaying 20 results from an estimated 300 matches similar to: "default min-v/nsize parameters"
2015 Jan 18
2
default min-v/nsize parameters
On Thu, Jan 15, 2015 at 3:55 PM, Michael Lawrence
<lawrence.michael at gene.com> wrote:
> Just wanted to start a discussion on whether R could ship with more
> appropriate GC parameters.
I've been doing a number of similar measurements, and have come to the
same conclusion. R is currently very conservative about memory usage,
and this leads to unnecessarily poor performance on
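As a point of reference, the startup overrides and the collector's current
trigger sizes can be inspected from any R session (a minimal sketch;
R_NSIZE and R_VSIZE are only consulted at startup):

    Sys.getenv(c("R_NSIZE", "R_VSIZE"))  # startup overrides; "" if unset
    gc()  # Ncells/Vcells in use and the current "gc trigger" sizes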
2015 Jan 20
1
default min-v/nsize parameters
>>>>> Peter Haverty <haverty.peter at gene.com>
>>>>> on Mon, 19 Jan 2015 08:50:08 -0800 writes:
> Hi All, This is a very important issue. It would be very
> sad to leave most users unaware of a free speedup of this
> size. These options don't appear in the R --help
> output. They really should be added there.
Indeed,
2015 Jan 19
0
default min-v/nsize parameters
Hi All,
This is a very important issue. It would be very sad to leave most users
unaware of a free speedup of this size. These options don't appear in the
R --help output. They really should be added there. Additionally, if the
garbage collector is working very hard, might it emit a note about better
settings for these variables?
It's not really my place to comment on design philosophy,
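Base R can already report each collection as it happens, which is one way
to see how hard the collector is working (a small illustration, not a
change to the defaults):

    gcinfo(TRUE)    # print a summary line at every garbage collection
    x <- lapply(1:5000, function(i) rnorm(1000))   # allocation-heavy work
    gcinfo(FALSE)   # turn the reporting back off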
1999 Apr 12
3
--nsize and --vsize
Martin M has suggested I widen this discussion to R-devel, and
> I agree that we should increase them,
> but I'm not sure at all about the amount.
>
> The default could even depend on the architecture (via "./configure")..
Views, please.
------------- Begin Forwarded Message -------------
Is it not time we increased the defaults a bit? As the base gets bigger
I hit
2001 Jan 07
2
"Invalid character 32" problem with Pager??
Hi all,
I'm having a very minor problem. Almost every time I use the pager (I think) I
get the following error about character 32. It works fine, but the error is
a bit annoying.
> ?version
sh: invalid character 32 in exportstr for export R_NSIZE
sh: invalid character 32 in exportstr for export R_VSIZE
R.Version package:base R Documentation
(and then the
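For what it's worth, character 32 is the ASCII space, so the message
suggests the exported R_NSIZE/R_VSIZE values contain embedded spaces,
which sh cannot export unquoted. A one-liner to confirm the code point:

    rawToChar(as.raw(32))   # " " -- character 32 is a space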
1998 Aug 22
1
R-beta: re -n -v wr0613b - windows dynload
When I use the -v flag I can modify the size of the heap, as assessed by
gc(), but the -n flag seems to have no effect.
On a machine with 48 MB of RAM I can load the libraries without problem,
but on my own 36 MB machine I get dynload problems with the larger ones,
e.g. survival4.
Any suggestions?
Troels
1999 Dec 17
1
R CMD check --help
This example from the INSTALL help seems to be broken in R 0.90.1 (on Solaris):
gilp/dse : R CMD check --help
Usage: R CMD check [options] [-l lib] pkg_1 ... pkg_n
I'm trying to figure out how to request more nsize and vsize when using R CMD
check.
Paul Gilbert
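One workaround, consistent with the 2000 Apr 19 message further down this
list, would be to export the variables in the environment before invoking
the command, e.g. R_NSIZE=1000000 R_VSIZE=32M R CMD check dse (values
illustrative), since each R process started by check reads them at startup.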
1999 Sep 24
2
R's startup : .Rprofile & .Renviron -- info and RFC
[RFC = Request for Comments]
{ Yes, the documentation for .Renviron is really not there (but the FAQ...);
the rest is in ?Startup }
In R's Startup (on Unix only??) {unless --no-environ is specified}
~/.Renviron (if there) is read as an 'sh' script before R is called,
then R looks ((for the site-wide Rprofile and then))
for .Rprofile in the current directory and then for
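For concreteness, a minimal example of the second of these files:
~/.Renviron takes sh-style assignments such as R_VSIZE=10M, while
~/.Rprofile is ordinary R code run at startup (names and values here are
illustrative only):

    ## example ~/.Rprofile
    options(digits = 6)
    .First <- function() cat("R session started:", date(), "\n")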
2000 Apr 19
1
R CMD check seg fault in Linux
For some of my packages I am getting a segmentation fault in Linux when
I use R CMD check. (Using R 1.0.1) The segmentation fault does not
happen in Solaris and in some cases it does not happen in Linux if I set
R_NSIZE and R_VSIZE much higher than I need in Solaris. Should I expect
a segmentation fault if there is not enough memory for R CMD check, or
are these unrelated?
Paul Gilbert
1997 Nov 27
2
R-beta: Memory Management in R-0.50-a4
Dear R users
we're having a problem reading a largish data file using
read.table(). The file consists of 175000 lines of 4
floating-point numbers. Here's what happens:
> dat_read.table('sst.dat')
Error: memory exhausted
(This is line 358 of src/main/memory.c).
Cutting down the file to around 15000 lines allows
read.table() to work OK.
I edited the memory limits in Platform.h
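For scale: the numeric data alone is modest (note the snippet uses the old
underscore assignment, equivalent to dat <- read.table('sst.dat')). A rough
estimate, ignoring read.table's parsing overhead:

    175000 * 4        # 700,000 doubles
    175000 * 4 * 8    # 5,600,000 bytes, i.e. about 5.3 MB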
1999 Aug 30
1
interface w/ emacs (PR#261)
Full_Name: Laurent Gautier
Version: 0.65.0
OS: mips SGI-Irix 6.5
Submission from: (NULL) (195.110.4.98)
Using R through emacs with ess5.1.8, I cannot set the R workspace (--vsize and
--nsize).
So far I was using R0.64.2 without such a problem. I am aware my bug report is a
bit light,
but just let me know if anything I could do with my R and emacs would be of
any help for specifying better what is
1999 Jul 08
1
Gnome interface status report
Hi,
The Gnome version now compiles, and it should also be working (at least as much as it
ever has). I've changed Makefile.in to the new system, which is very cool. What I
want to work on now is:
- Graphics. I want to move to the Gnome canvas for this, which should be reasonably
easy. This will give us rotated text (which I never got going properly before) and
the option for
2000 Oct 11
2
invalid regular expression after many grep's (PR#691)
Full_Name: J Utans
Version: 1.1.1
OS: NT4 (SP6)
Submission from: (NULL) (155.140.123.250)
After grep is called many times (> 250k) with constant strings as patterns,
it complains with "invalid regular expression" on calls that worked before
(with the same pattern and x). At the same time, copying to the clipboard
no longer works, failing with an "out of memory" error (i.e. when trying
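A sketch of the reported calling pattern, for anyone trying to reproduce it
(hypothetical code; the original is not shown):

    x <- c("xabcx", "zzz")
    for (i in 1:300000) grep("abc", x)   # constant pattern, many calls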
2009 Nov 18
2
Unnecessary code?
Dear R-ers,
While browsing the R sources, I found the following piece of code
in src/main/memory.c:
static void reset_pp_stack(void *data)
{
    R_size_t *poldpps = data;
    R_PPStackSize = *poldpps;
}
To me, it looks like the poldpps pointer is a nuisance; can't you
just cast the data pointer and dereference it at once? Say,
static void reset_pp_stack(void *data)
{
    R_PPStackSize = *(R_size_t *) data;
}
1999 Jan 27
1
can't restore .RData
Hi Folks,
I loaded a couple of quite large data sets into an R session and then
quit (after saving the image). Now I get:
Error: a read error occured
Fatal error: unable to restore saved data
(remove .RData or increase memory)
after trying to start my R session using something like:
R --vsize XXX --nsize 1000000
For any value of XXX (I went up to 300 or 400, which is as high as I could
go).
2000 Apr 18
1
increasing memory size
Dear R people,
I wonder if some kind person can tell me the correct syntax to set the
R_VSIZE environmental variable. I tried R_VSIZE = 10M in ~/.Renviron and
also export R_VSIZE = 10M. These don't seem to work. I scrounged around
looking for details about this, couldn't find any, got fed up.
Faheem.
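Given that ~/.Renviron was read with sh semantics at the time (see the
1999 Sep 24 message above), the whitespace is the likely culprit: sh
assignments take no spaces around the "=", so the line should read
R_VSIZE=10M rather than R_VSIZE = 10M.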
2018 Oct 01
1
unexpected memory.limit on windows in embedded R
Dear All,
I'm linking R from another application and embedding it as described in the
R-exts manual, i.e. with initialization done via Rf_initEmbeddedR.
While everything works the same as in standalone R for Linux, under Windows
I found a difference in the default memory.limit, which is fixed to 2GB
(both win32 and win64) - compared to a limit in standalone R of 3.5GB for
win32 and 16GB on
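For background, the limit could historically be queried and raised from R
code on Windows via memory.limit() (an API since removed from R); a
minimal sketch:

    memory.limit()              # current limit in MB (Windows-only)
    memory.limit(size = 16000)  # request roughly a 16 GB limit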
2010 Jul 09
2
Suggestion for serialization performance improvement on Windows
Dear R developers,
The slow performance of serializing to a raw vector on Windows is an
issue that has appeared in this list before. It appears to be due to
the frequent use of realloc in the resize_buffer function in
serialize.c.
I suggest a more granular, but still incremental, re-allocation of
memory. For example change near the top of resize_buffer to:
R_size_t newsize = needed + 65536 -
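The quoted change is cut off, but the idea of rounding the requested size
up to a 64 KB granularity, rather than growing the buffer a little at a
time, can be sketched in R (the 65536 constant is taken from the fragment
above):

    needed  <- 190000
    newsize <- 65536 * ceiling(needed / 65536)   # next multiple of 64 KB
    newsize                                      # 196608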
2015 Jan 09
1
Cost of garbage collection seems excessive
When doing repeated regressions on large data sets, I'm finding that
the time spent on garbage collection often exceeds the time spent on
the regression itself. Consider this test program which I'm running
on an Intel Haswell i7-4470 processor under Linux 3.13 using R 3.1.2
compiled with ICPC 14.1:
nate at haswell:~$ cat > gc.R
library(speedglm)
createData <- function(n) {
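The script is truncated here; as a generic way to measure how much of a
workload's run time goes to collection (not the original benchmark), base
R's gc.time() can bracket the work:

    gc.time(TRUE)               # make sure GC timing is enabled
    before <- gc.time()
    for (i in 1:200) x <- matrix(rnorm(1e5), nrow = 100)  # allocation churn
    gc.time() - before          # user/system/elapsed time spent in GC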