similar to: To moderator

Displaying 20 results from an estimated 1100 matches similar to: "To moderator"

2011 Nov 13
1
Understand Ncells and Vcells, from gc()
Dear all, I am working on a 64-bit Linux system. I issue the following R commands:
> rm(list=ls()) # To remove all objects in the workspace.
> gc()          # To free memory.
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 124250  6.7     350000 18.7   350000 18.7
Vcells 124547  1.0     786432  6.0   476934  3.7
> gc()          # I had to do it again, don't know why!
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells
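For readers landing on this thread, a rough sketch of what the two gc() rows report: Ncells counts fixed-size cons cells (language objects, pairlists and other internal nodes), while Vcells counts 8-byte blocks of the vector heap, where the data of numeric, integer and character vectors live. A minimal illustration (the exact figures differ between sessions):

gc()                  # baseline Ncells / Vcells
x <- numeric(1e6)     # one million doubles, roughly 8 MB of vector heap
gc()                  # the Vcells (Mb) column grows by about 8 Mb
rm(x)
gc()                  # "used" drops back; "gc trigger" and "max used" stay high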
2011 Oct 20
3
Strange R behavior for the product of two sums of integers
Dear gentlemen, Can you explain to me why the following happens (on any OS, I think, and even on 64-bit systems)?
> sum(1000:1205)^2
[1] 51581223225
> sum(1000:1205)*sum(1000:1205)
[1] NA
Warning message:
In sum(1000:1205) * sum(1000:1205) : NAs produced by integer overflow
Best, Pierre -- Pierre Lafaye de Micheaux Mailing address: Département de Mathématiques et Statistique, Université de
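What the excerpt shows follows from sum() over an integer vector returning an integer, and R's integer arithmetic being 32-bit: the product exceeds 2^31 - 1 and becomes NA with a warning, whereas ^ always returns a double. A small sketch of the usual workaround, coercing to double before multiplying:

s <- sum(1000:1205)               # an integer; it fits in 32 bits
s * s                             # NA: integer * integer overflows
as.numeric(s) * as.numeric(s)     # 51581223225, computed in double precision
sum(as.numeric(1000:1205))^2      # same value; ^ returns a double anyway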
2007 Oct 09
2
bug: wireframe and tcltk
Dear R users, When I call the wireframe function from within a tcltk widget, it does not work. Here is a sample program that shows the bug. When you try is.it.a.bug(), you can see that the "curve" call is run but not the "wireframe" one. Do you have an explanation for this? Best regards, Pierre Lafaye de Micheaux -- Pierre Lafaye de Micheaux Bureau
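A likely explanation, though not verified against the original sample program: lattice functions such as wireframe() merely return a trellis object, and nothing is drawn unless that object is printed. Auto-printing happens at the top level but not inside a tcltk callback (or any function body), so curve() appears while the lattice plot silently does not. A minimal sketch:

library(lattice)
g <- expand.grid(x = 1:10, y = 1:10)
g$z <- g$x * g$y
draw <- function() {
  curve(sin, 0, 2 * pi)                   # base graphics: draws immediately
  print(wireframe(z ~ x * y, data = g))   # lattice: must be print()ed explicitly
}
draw()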
2008 Aug 11
2
Compiling only some C files
Dear R-users, I am currently writing an R package that contains three C++ files in the src/ source directory. When I issue an R CMD check pkge_name, all the *.cpp files in the src/ directory are compiled through something like (seen in the 00install.out file):
* Installing *source* package 'pkge_name' ...
** libs
g++ -I/usr/lib/R/include -I/usr/lib/R/include -I/usr/local/include -c
2008 Apr 07
0
Some memory questions: data.frame and lists.
Hi there, I seek your expert opinion on the following memory-related questions. The output below was obtained from R-2.6.2, compiled with --enable-memory-profiling, on Ubuntu Linux.
=======================================================================
>>> Code and output 1:
> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 131180  7.1     350000 18.7   350000 18.7
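Without the rest of the post it is unclear exactly which comparison was being asked about, but a common way to poke at such questions is object.size() alongside gc() snapshots, keeping in mind that object.size() does not account for components shared between objects. A hedged sketch:

n   <- 1e6
df  <- data.frame(a = rnorm(n), b = rnorm(n))
lst <- list(a = rnorm(n), b = rnorm(n))
object.size(df)     # ~16 MB of data plus data.frame attribute overhead
object.size(lst)    # ~16 MB of data plus a little list overhead
gc()                # Vcells now reflect both structures being live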
2007 May 25
0
Recent changes in R related to CHARSXPs
Hello all, I want to highlight a recent change in R-devel to the larger developeR community. As of r41495, R maintains a global cache of CHARSXPs such that each unique string is stored only once in memory. For many common use cases, such as dimnames of matrices and keys in environments, the result is a significant savings in memory (and time under some circumstances). A result of these changes
2001 Nov 06
0
Problem compiling on a Sparc4 Sun4m, Solaris 2.6
I am working with Solaris 2.6 and with openssh version 2.9.9-p2. I was able to compile for an UltraSPARC 5 Sun4u with no problem, but for a Sparc4 Sun4m I'm getting this message: ./ssh : Exec format error. Wrong Architecture. Do you have a hint why it's not working? Is there an option I have to set? Thank you for your help, Jean ============================================ Jean
2008 Sep 09
1
Memory allocation problem (during kmeans)
Dear all, I am trying to apply kmeans clustering to a data file (size is about 300 Mb). I read this file using x=read.table('file path', sep=" "), then I do kmeans(x,25), but the process stops after two minutes with an error: Error: cannot allocate vector of size 907.3 Mb. When I read the archives I noticed that the best solution is to use a 64-bit OS. "Error messages
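The "cannot allocate vector of size 907.3 Mb" message names the single allocation that failed, not the total in use; on a 32-bit build the address space simply runs out. Two things that often help, sketched under the assumption of a purely numeric, space-separated file (the path and the counts are placeholders):

# Declare column types and an upper bound on rows so read.table
# does not over-allocate while guessing.
x <- read.table("file path", sep = " ",
                colClasses = "numeric", nrows = 1000000,
                comment.char = "")
x <- as.matrix(x)                 # coerce to a numeric matrix for kmeans
fit <- kmeans(x, centers = 25)

# Or cluster a random subset first to see whether 25 centers behave at all.
idx <- sample(nrow(x), 50000)
fit_small <- kmeans(x[idx, ], centers = 25)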
2007 Jun 26
1
Memory Experimentation: Rule of Thumb = 10-15 Times the Memory
Dear R experts: I am of course no R expert, but I use it regularly. I thought I would share some experimentation with memory use. I run a Linux machine with about 4GB of memory, and R 2.5.0. Upon startup, gc() reports
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 268755 14.4     407500 21.8   350000 18.7
Vcells 139137  1.1     786432  6.0   444750  3.4
This is my baseline. linux
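A way to reproduce this kind of rule-of-thumb measurement is to difference gc() before and after loading, and to compare the result with object.size() and the size of the file on disk (the file name here is a placeholder):

gc(reset = TRUE)                      # reset the "max used" columns
before <- gc()
d <- read.table("big.txt", header = TRUE)
after <- gc()
after[, "used"] - before[, "used"]    # cells consumed by the load
object.size(d)                        # bytes held by the final object
file.info("big.txt")$size             # bytes on disk, for the ratio

Comparing the transient "max used" peak and the final object against the file size is what produces ratios like the one in the subject line.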
2007 Sep 27
0
Unnecessary extra copy with matrix(..., dimnames=NULL) (Was: Re: modifying large R objects in place)
As others already mentioned, in your example you are first creating an integer matrix and then coercing it to a double matrix by assigning (double) 1 to element [1,1]. However, even when correcting for this mistake, there is an extra copy created when using matrix(). Try this in a fresh vanilla R session:
> print(gc())
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 136684  3.7
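To track down copies like this one, tracemem() prints a message every time R duplicates the traced object; it requires an R build with memory profiling enabled (the CRAN binary builds typically are; source builds may need --enable-memory-profiling). A small sketch of the mechanism, not a reproduction of the matrix(..., dimnames=NULL) case:

x <- matrix(0, nrow = 1000, ncol = 1000)   # a double matrix from the start
tracemem(x)                                # report every internal duplication of x
y <- x                                     # no copy yet: x and y share the same data
y[1, 1] <- 1                               # copy-on-modify: tracemem reports a duplication
untracemem(x)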
2012 Jan 06
0
interesting connection / finalizer bug?
This
setOldClass(c("file", "connection"))
.A <- setRefClass("A",
    fields = list(con = "connection"),
    methods = list(
        finalize = function() {
            if (isOpen(con)) close(con)
        }))
f <- tempdir()
a <- .A$new(con = file(f, "rb"))
close(a$con)
a <- .A$new(con=file(f,
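Independent of whatever the reference-class bug here turned out to be, the underlying pattern, closing a connection when its owner is collected, can also be written with reg.finalizer() on an environment. A hedged sketch using a temporary file:

f <- tempfile()
writeBin(as.raw(1:10), f)

holder <- new.env()
holder$con <- file(f, "rb")
reg.finalizer(holder, function(e) {
  # isOpen() errors on a connection that has already been destroyed,
  # so guard the whole test rather than just the close()
  tryCatch(if (isOpen(e$con)) close(e$con),
           error = function(err) invisible(NULL))
})

rm(holder)
gc()   # the finalizer runs at this (or a later) collection and closes the file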
2008 Jan 30
1
Understanding an R improvement that already occurred.
I was surprised to observe the following difference between 2.4.1 and 2.6.0 after a long-overdue upgrade of our departmental server a few months ago. It wasn't a bug fix, but a subtle improvement. Here's the simplest example I could create. The size is excessive, on the order of the Netflix Competition data. The integer matrix is about 1.12 GB, and if coerced to numeric it is 2.24 GB.
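The 1.12 GB versus 2.24 GB figures follow directly from element sizes: integers occupy 4 bytes and doubles 8, so coercing a large integer matrix to numeric doubles its footprint (and the coercion transiently needs memory for both the old and the new matrix). A back-of-the-envelope check in R:

n_elem <- 300e6                      # roughly Netflix-competition scale (placeholder count)
n_elem * 4 / 2^30                    # ~1.12 GiB as integer (4 bytes per element)
n_elem * 8 / 2^30                    # ~2.24 GiB as double  (8 bytes per element)

# The same effect at a size that is cheap to check:
m_int <- matrix(1L, nrow = 1000, ncol = 1000)
m_num <- matrix(1,  nrow = 1000, ncol = 1000)
object.size(m_int)                   # about 4 MB
object.size(m_num)                   # about 8 MB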
2005 Dec 14
2
The fastest way to select and execute a few selected functions inside a function
Dear useRs, I have the following problem! I have a function that calls one or more functions, depending on the input parameters. I am searching for the fastest way to select and execute the selected functions and return their results in a list. The number of possible functions is 10, but usually only 2 are selected (although sometimes more, even all). For example, if I have function
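One idiomatic approach (a sketch, not necessarily the fastest for every workload) is to keep the candidate functions in a named list, subset that list by the names requested, and lapply over the selection so the results come back as a list:

funs <- list(mean = mean, sd = sd, median = median, mad = mad)

run_selected <- function(x, which) {
  lapply(funs[which], function(f) f(x))   # named list of results
}

run_selected(rnorm(100), c("mean", "sd"))   # only two of the candidates
run_selected(rnorm(100), names(funs))       # all of them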
2008 Jul 20
2
Erro: cannot allocate vector of size 216.0 Mb
Please, I have a 2GB computer and a huge time series to embed, and I tried increasing memory.limit() and memory.size(max=TRUE), but nothing helped. Just before the command:
> memory.size(max=TRUE)
[1] 13.4375
> memory.limit()
[1] 1535.875
> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 209552  5.6     407500 10.9   350000  9.4
Vcells 125966  1.0     786432  6.0   496686  3.8
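For the Windows builds of R of that era the usual advice was to raise the cap with memory.limit(), though on a 32-bit machine with 2 GB of RAM the practical ceiling stays far below what a huge embedding needs, so the error is expected. A hedged sketch (memory.limit() is Windows-only and does nothing in recent R versions):

memory.limit()              # the current cap in MB (about 1536 in the excerpt)
memory.limit(size = 2047)   # ask for up to the ~2 GB a 32-bit Windows process can address
gc()                        # check how much of the heap is actually in use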
2005 Nov 15
1
cannot.allocate.memory.again and 32bit<--->64bit
Hello! ------ I use a 32-bit Linux (SuSE) server, so I am limited to 3.5 GB of memory. I can demonstrate that from time to time there is a problem with allocating large objects, for example: 0. state (no objects yet created) ------------------------------------
> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 162070  4.4     350000  9.4   350000
2007 Aug 09
1
Memory Experimentation: Rule of Thumb = 10-15 Times the Memory
Hi, I've been having similar experiences and haven't been able to substantially improve the efficiency using the guidance in the I/O Manual. Could anyone advise on how to improve the following scan()? It is not based on my real file; please assume that I do need to read in characters and can't do any pre-processing of the file, etc. ## Create Sample File
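The usual levers for scan() on large files are giving it the exact record layout via what=, pre-declaring the number of records, and switching off quote and comment processing. A sketch under the assumption of a space-separated file with one character field and two numeric fields per line (the file name and layout are placeholders):

dat <- scan("sample.txt",
            what = list(id = character(), x = numeric(), y = numeric()),
            sep = " ",
            quote = "",           # no quote processing
            comment.char = "",    # no comment scanning
            nmax = 1e6,           # number of records, if known in advance
            multi.line = FALSE)
str(dat)                          # a list of three equal-length vectors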
2011 Jan 17
1
isoreg memory leak?
I believe there is a memory leak in isoreg in the current version of R, as the following shows:
> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 120405  3.3     350000  9.4   350000  9.4
Vcells  78639  0.6     786432  6.0   392463  3.0
> for(k in 1:100) {
+   y <- runif(10000)
+   isoreg(x, y)
+ }
> rm(x)
> rm(y)
> gc()
         used (Mb) gc
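Whether or not this particular leak still exists, the general way to test such a suspicion is to snapshot gc() before the loop, remove every object afterwards, force a collection, and compare; with a genuine leak the "used" columns stay inflated. A sketch:

before <- gc(reset = TRUE)
x <- runif(10000)
for (k in 1:100) {
  y <- runif(10000)
  fit <- isoreg(x, y)
}
rm(x, y, fit)
after <- gc()
after[, "used"] - before[, "used"]   # should be close to zero if nothing leaked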
2000 Feb 11
1
astonishing memory phenomenon
I have a question concerning memory. I understood that R takes a fixed amount of memory at startup (which I can influence with --vsize and --nsize) and that gc() shows the memory still free out of the total memory reserved for R. However, if I create a long vector of character data, gc() only seems to reflect the space needed for a vector of pointers to char; the space used for the character data itself
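The accounting has changed considerably since 2000. In recent versions of R the string data is allocated from the same heap that gc() reports as Vcells, and since R 2.6.0 identical strings are shared through the global CHARSXP cache (the r41495 change mentioned in another result above), so a vector of repeated strings is far cheaper than a vector of distinct ones. A rough way to see the difference, sketched with gc() deltas (exact numbers vary by platform and version):

before <- gc()
same <- rep("hello", 1e6)              # a million pointers to one cached string
g1 <- gc()
g1[, "used"] - before[, "used"]

distinct <- as.character(seq_len(1e6)) # a million pointers plus a million distinct strings
g2 <- gc()
g2[, "used"] - g1[, "used"]            # noticeably larger growth than for 'same'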
2006 May 16
2
Large database help
Hello all. I have a large .txt file whose variables are in fixed columns, i.e., variable V1 occupies columns 1 to 7, V2 columns 8 to 23, etc. This is a 60GB file with 90 variables and 60 million observations. I'm working with a Pentium 4, 1GB RAM, Windows XP Pro. I tried the following code just to see if I could work with 2 variables, but it does not seem possible: R : Copyright 2005, The R Foundation
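For fixed-width columns the natural tool is read.fwf(): negative entries in widths skip that many characters, and colClasses keeps R from guessing types. Even so, a 60GB file will not fit in 1GB of RAM, so this is sketched for one chunk of records at a time, using the V1 = columns 1-7, V2 = columns 8-23 layout described above (the file name, the column types, and the width of the skipped remainder are placeholders):

rest_width <- 100                      # however many characters follow V2 in each record
chunk <- read.fwf("bigfile.txt",
                  widths = c(7, 16, -rest_width),
                  colClasses = c("integer", "numeric"),
                  n = 100000)          # first 100,000 records only
str(chunk)

For the full file, the contemporary advice was usually to load the data into a DBMS and pull subsets or summaries into R rather than the raw 60 million rows.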
2006 Sep 28
0
AIC in R
Dear R users, According to Brockwell & Davis (1991, Section 9.3, p.304), the penalty term for computing the AIC criterion is "p+q+1" in the context of a zero-mean ARMA(p,q) time series model. They arrived at this criterion (with this particular penalty term) by estimating the Kullback-Leibler discrepancy index. In practice, the user usually chooses the model whose estimated index is
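For reference, the criterion being described, with the p+q+1 penalty counting the p AR coefficients, the q MA coefficients, and the white-noise variance, is conventionally written (a standard formulation, not a quotation from Brockwell & Davis) as

\mathrm{AIC} = -2 \ln L(\phi_1,\dots,\phi_p,\ \theta_1,\dots,\theta_q,\ \sigma^2) + 2(p + q + 1)

and one chooses the candidate ARMA(p,q) model that minimizes it.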