similar to: options(keep.source = TRUE) -- also for "library(.)" ?

Displaying 19 results from an estimated 3000 matches similar to: "options(keep.source = TRUE) -- also for "library(.)" ?"

2000 Apr 27
1
options(keep.source = TRUE) -- also for "library(.)" ?
help(options) contains

  keep.source: When `TRUE', the default, the source code for
      functions loaded by is stored in their `"source"' attribute,
      allowing comments to be kept in the right places.
      This does not apply to functions loaded by `library'.
      ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

and R behaves as documented, i.e.,
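For what it's worth, a minimal sketch of the behaviour in question; the option keep.source.pkgs is today's answer to the question in the subject line, and is my addition rather than something quoted above:

    options(keep.source = TRUE)
    f <- function(x) {
      # this comment survives in f's source when keep.source is TRUE
      x + 1
    }
    print(f)                          # prints the comment above
    options(keep.source.pkgs = TRUE)  # same, for code loaded by library()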
1999 Nov 26
1
memory.profile() messes up the vector heap on Alpha/Linux?
Hello, I have been trying to debug a problem with R-0.90.0 (this bug was in 0.65.1, too). The following code results in seg-faults; it doesn't seg-fault on Linux/Intel.

> memory.profile()
> gc()

As long as I don't execute memory.profile(), there is no problem with garbage collection. So I think that memory.profile() screws up the heap in some way. When it seg-faults, it dies in
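The repro in self-contained form, with GC reporting switched on (plain base R; a modern build should survive this):

    gcinfo(TRUE)        # print a report at each garbage collection
    memory.profile()    # tabulate live SEXPs by type
    gc()                # the collection that crashed on Alpha/Linux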
2015 Jan 18
2
default min-v/nsize parameters
On Thu, Jan 15, 2015 at 3:55 PM, Michael Lawrence <lawrence.michael at gene.com> wrote:

> Just wanted to start a discussion on whether R could ship with more
> appropriate GC parameters.

I've been doing a number of similar measurements, and have come to the same conclusion. R is currently very conservative about memory usage, and this leads to unnecessarily poor performance on
2015 Jan 20
1
default min-v/nsize parameters
>>>>> Peter Haverty <haverty.peter at gene.com>
>>>>>     on Mon, 19 Jan 2015 08:50:08 -0800 writes:

    > Hi All, This is a very important issue. It would be very
    > sad to leave most users unaware of a free speedup of this
    > size. These options don't appear in the R --help output.
    > They really should be added there.

Indeed,
2015 Jan 15
2
default min-v/nsize parameters
Just wanted to start a discussion on whether R could ship with more appropriate GC parameters. Right now, loading the recommended package Matrix leads to:

> library(Matrix)
> gc()
          used (Mb) gc trigger (Mb) max used (Mb)
Ncells 1076796 57.6    1368491 73.1  1198505 64.1
Vcells 1671329 12.8    2685683 20.5  1932418 14.8

Results may vary, but here R needed 64MB of N cells and 15MB
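A sketch of adjusting those defaults at startup; the flag names are from ?Memory, and the values are illustrative rather than a recommendation from the thread:

    # shell: start R with larger initial heaps, e.g.
    #   R --min-nsize=1M --min-vsize=128M
    # then watch the "gc trigger" column move:
    gc()
    # the equivalent environment variables are R_NSIZE and R_VSIZE
    Sys.getenv(c("R_NSIZE", "R_VSIZE"))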
2000 Apr 18
1
increasing memory size
Dear R people, I wonder if some kind person can tell me the correct syntax to set the R_VSIZE environment variable. I tried R_VSIZE = 10M in ~/.Renviron and also export R_VSIZE = 10M. These don't seem to work. I scrounged around looking for details about this, couldn't find any, got fed up. Faheem.
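The likely culprit, reading between the lines (my assumption, not confirmed in the thread): the ~/.Renviron of this era was sourced as an sh script, and sh forbids spaces around '='. The working form would be:

    # ~/.Renviron -- sh syntax, no spaces around '='
    R_VSIZE=10M
    export R_VSIZE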
2001 Jan 07
2
"Invalid character 32" problem with Pager??
Hi all, I'm having a very minor problem. Almost every time I use the pager (I think) I get the following error about character 32. It works fine, but the error is a bit annoying.

> ?version
sh: invalid character 32 in exportstr for export R_NSIZE
sh: invalid character 32 in exportstr for export R_VSIZE
R.Version package:base R Documentation

(and then the
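Character 32 is the ASCII space, so a plausible cause (again an assumption, not confirmed above) is an R_NSIZE/R_VSIZE assignment or value containing a space, e.g. spaces around '=' in ~/.Renviron. The space-free sh form:

    # ~/.Renviron -- values must not contain stray spaces
    R_NSIZE=1000000
    R_VSIZE=16M
    export R_NSIZE R_VSIZE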
2018 Jan 13
3
How to use stack maps
Is there an explanation anywhere of what code that uses a stack map looks like? I'm interested in writing a garbage collector, but it's not clear to me how my code should make use of the stack map format to actually locate roots in memory.
2007 Aug 20
2
[LLVMdev] ocaml+llvm
On Aug 14, 2007, at 4:35 AM, Gordon Henriksen wrote:

> On Aug 14, 2007, at 06:24, Gordon Henriksen wrote:
>
>> The two major problems I had really boil down to identifying GC
>> points in machine code and statically identifying live roots at
>> those GC points, both problems common to many collection
>> techniques.

Looking at the problem from that perspective
2015 Jan 09
1
Cost of garbage collection seems excessive
When doing repeated regressions on large data sets, I'm finding that the time spent on garbage collection often exceeds the time spent on the regression itself. Consider this test program, which I'm running on an Intel Haswell i7-4470 processor under Linux 3.13 using R 3.1.2 compiled with ICPC 14.1:

nate at haswell:~$ cat > gc.R
library(speedglm)
createData <- function(n) {
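A way to put numbers on that claim in any session (base R's gc.time; this is not the poster's script, which is cut off above):

    gc.time(on = TRUE)          # start attributing time to GC
    t0 <- proc.time()
    x <- replicate(100, colSums(matrix(rnorm(1e6), ncol = 10)))
    total <- proc.time() - t0
    total                       # total time for the workload
    gc.time()                   # portion spent in garbage collection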
2018 Jan 14
0
How to use stack maps
Hi, I implemented a garbage collector for a language I wrote in college using the LLVM GC statepoint infrastructure.

Information on statepoints:
https://llvm.org/docs/Statepoints.html

Example usage of parsing the LLVM stackmap:
https://github.com/dotnet/llilc/blob/master/lib/GcInfo/GcInfo.cpp
https://llvm.org/docs/StackMaps.html#stackmap-format
1998 Aug 22
1
R-beta: re -n -v wr0613b - windows dynload
When I use the -v switch I can modify the size of the heap, as assessed by gc(), but the -n switch seems to have no effect. On a machine with 48 MB RAM I can load the libraries without problem, but on my own 36 MB RAM machine I get dynload problems with the larger ones, e.g. survival4. Any suggestions? Troels
1999 Dec 17
1
R CMD check --help
This example from the INSTALL help seems to be broken in R 0.90.1 (on Solaris):

gilp/dse : R CMD check --help
Usage: R CMD check [options] [-l lib] pkg_1 ... pkg_n

I'm trying to figure out how to request more nsize and vsize when using R CMD check. Paul Gilbert
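One workaround, assuming the R of that vintage honours them (not an answer given in the thread): pass the sizes through the environment variables the startup code reads, since check runs an ordinary R process underneath:

    # shell: request larger heaps for the R run by 'check'
    R_NSIZE=1000000 R_VSIZE=20M R CMD check dse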
2000 Mar 13
1
check does not accept --vsize option (PR#481)
Full_Name: Markus Neteler
Version: 1.0.0
OS: Linux 2.2.10/i686
Submission from: (NULL) (130.75.72.37)

Hi, I wanted to "check" the R.GRASS GIS interface from Roger Bivand (http://www.geog.uni-hannover.de/grass/statsgrasslist.html) using

R CMD check --vsize=10M GRASS

but: [error message shortened]

> G <- gmeta()
Error: heap memory (6144 Kb) exhausted [needed 1024 Kb more]
2007 Aug 20
0
[LLVMdev] ocaml+llvm
On Aug 19, 2007, at 20:43, Chris Lattner wrote:

> On Aug 14, 2007, at 4:35 AM, Gordon Henriksen wrote:
>
>> On Aug 14, 2007, at 06:24, Gordon Henriksen wrote:
>>
>>> The two major problems I had really boil down to identifying GC
>>> points in machine code and statically identifying live roots at
>>> those GC points, both problems common to many
1999 Sep 24
2
R's startup : .Rprofile & .Renviron -- info and RFC
[RFC = Request for Comments]

{ Yes, the documentation for .Renviron is really not there (but the FAQ...); the rest is in ?Startup }

In R's startup (on Unix only??) {unless --no-environ is specified}, ~/.Renviron (if there) is read as an 'sh' script before R is called; then R looks ((for the site-wide Rprofile and then)) for .Rprofile in the current directory and then for
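A compact sketch of the two files as described; the sh syntax for ~/.Renviron reflects this era, while modern R parses the file as name=value pairs instead:

    # ~/.Renviron (read before R starts)
    R_VSIZE=16M; export R_VSIZE

    # ./.Rprofile, else ~/.Rprofile (R code run at startup)
    options(keep.source = TRUE)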
2005 Dec 08
2
data.frame() size
Hi, In the example below, why is d 10 times bigger than m according to object.size()? It also takes around 10 times as long to create, which fits with object.size() being truthful. gcinfo(TRUE) also indicates a great deal more garbage-collector activity caused by data.frame() than matrix().

$ R --vanilla
....
> nr = 1000000
> system.time(m <<- matrix(integer(1), nrow=nr, ncol=2))
[1]
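The comparison in self-contained form (numbers from a 2005 R will differ; modern data.frame row names are compact, so the gap has since narrowed a lot):

    nr <- 1000000
    m <- matrix(integer(1), nrow = nr, ncol = 2)
    d <- data.frame(a = integer(nr), b = integer(nr))
    object.size(m)   # one integer vector plus a dim attribute
    object.size(d)   # two columns plus names and row.names attributes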
1998 Mar 09
2
R-beta: read.table and large datasets
I find that read.table cannot handle large datasets. Suppose data is a 40000 x 6 dataset.

R -v 100
x_read.table("data")

gives

Error: memory exhausted

but

x_as.data.frame(matrix(scan("data"),byrow=T,ncol=6))

works fine. read.table is less typing, I can include the variable names in the first line, and in S-PLUS it executes faster. Is there a fix for read.table on the way?
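For reference, the workaround in current syntax ('_' was then the assignment operator), plus the colClasses hint that later became the standard advice for cutting read.table's memory use:

    # the scan() workaround, modern assignment
    x <- as.data.frame(matrix(scan("data"), byrow = TRUE, ncol = 6))
    # or let read.table skip type-guessing entirely
    x <- read.table("data", colClasses = rep("numeric", 6))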
2020 Nov 01
2
parallel PSOCK connection latency is greater on Linux?
I'm exploring latency overhead of parallel PSOCK workers and noticed that serializing/unserializing data back to the main R session is significantly slower on Linux than it is on Windows/MacOS with similar hardware. Is there a reason for this difference and is there a way to avoid the apparent additional Linux overhead? I attempted to isolate the behavior with a test that simply returns
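A minimal round-trip benchmark of the kind described (one local worker; the absolute numbers are illustrative):

    library(parallel)
    cl <- makeCluster(1, type = "PSOCK")
    # many trivial calls expose the per-call latency floor
    t <- system.time(for (i in 1:1000) clusterEvalQ(cl, NULL))
    t[["elapsed"]] / 1000     # seconds per round-trip
    stopCluster(cl)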