similar to: Improved version of Rprofmem

Displaying 20 results from an estimated 200 matches similar to: "Improved version of Rprofmem"

2016 Jun 04
1
RProfmem output format
I'm picking up this 5-year-old thread. 1. About the four memory allocations without a stacktrace: I think the four memory allocations without a stacktrace reported by Rprofmem(): > Rprofmem(); x <- raw(2000); Rprofmem("") > cat(readLines("Rprofmem.out", n=5, warn=FALSE), sep="\n") 192 :360 :360 :1064 :2040 :"raw" are due to some
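For reference, a minimal sketch of the call sequence quoted above (Rprofmem() only records allocations if R was built with --enable-memory-profiling, and the exact byte counts vary by platform and R version):

    Rprofmem("Rprofmem.out")   # start logging every allocation (threshold = 0)
    x <- raw(2000)             # the vector allocation discussed in the thread
    Rprofmem("")               # an empty filename stops profiling
    cat(readLines("Rprofmem.out", warn = FALSE), sep = "\n")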
2017 Mar 07
0
length(unclass(x)) without unclass(x)?
> Henrik Bengtsson: > > I'm looking for a way to get the length of an object 'x' as given by > base data type without dispatching on class. The performance improvement you're looking for is implemented in the latest version of pqR (pqR-2016-10-24, see pqR-project.org), along with corresponding improvements in several other circumstances where unclass(x) does not
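To illustrate what is being asked for, a small sketch (the 'myclass' class and its length() method are hypothetical, made up for the example): length() dispatches on the class, while length(unclass(x)) reports the underlying data length, at the cost of unclass() potentially duplicating x:

    x <- structure(1:10, class = "myclass")
    length.myclass <- function(x) 1L   # hypothetical method overriding length()
    length(x)                          # 1  -- dispatches to length.myclass()
    length(unclass(x))                 # 10 -- length of the underlying integer
                                       #       vector, but unclass() may copy x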
2011 May 13
1
RProfmem output format
Hi all, When I run the example in RProfmem, I get: Rprofmem("Rprofmem.out", threshold=1000) example(glm) Rprofmem(NULL) noquote(readLines("Rprofmem.out", n=5)) ... [1] 1384 :5416 :5416 :1064 :1064 :"readRDS" "index.search" "example" [2] 1064 :"readRDS" "index.search" "example" [3] 4712
2018 Jan 27
1
R (>= 3.4.0): integer-to-double coercion in comparisons no longer done (a good thing)
Hi, a memory improvement was made going from R 3.3.3 to R 3.4.0 when it comes to comparing an integer 'x' and a double 'y' (either may be scalar or vector). For example, in R 3.3.3, I get: > getRversion() [1] '3.3.3' > x <- integer(1000) > y <- double(1000) > profmem::profmem(z <- (x < y)) Rprofmem memory profiling of: z <- (x < y)
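A sketch of the measurement being quoted, using the profmem package (it likewise requires R built with --enable-memory-profiling; total() is the package helper that sums the logged allocations):

    library(profmem)
    x <- integer(1000)
    y <- double(1000)
    p <- profmem(z <- (x < y))
    p          # in R >= 3.4.0 the ~8 kB coercion of 'x' no longer shows up
    total(p)   # total bytes allocated while evaluating the expression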
2017 May 18
1
Interpreting R memory profiling statistics from Rprof() and gc()
Sorry, this might be a really basic question, but I'm trying to interpret the results from memory profiling, and I have a few questions (marked by *Q#*). From the summaryRprof() documentation, it seems that the four columns of statistics that are reported when setting memory.profiling=TRUE are - vector memory in small blocks on the R heap - vector memory in large blocks (from malloc) - memory
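For anyone wanting to reproduce those columns, a minimal sketch of collecting them (the workload is a toy; anything that runs for at least a few sampling intervals will do):

    Rprof("prof.out", memory.profiling = TRUE)
    x <- lapply(1:50, function(i) sort(rnorm(1e5)))   # toy allocation-heavy work
    Rprof(NULL)
    summaryRprof("prof.out", memory = "both")         # adds the memory columns
                                                      # to the timing summary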
2008 Jan 23
2
R binary version with R_MEMORY_PROFILING
Hi all, Where can I find an R binary version (> 2.4.0) for Windows that is compiled with R_MEMORY_PROFILING? Within our application we are experiencing serious problems with memory usage, and being able to use the "Rprofmem" and "tracemem" commands seems like our best option. Thanks, Yoni [[alternative HTML version deleted]]
2016 Sep 23
0
Undocumented 'use.names' argument to c()
I'd expect that a lot of the performance overhead could be eliminated by simply improving the underlying code. IMHO, we should ignore it in deciding the API that we want here. On Fri, Sep 23, 2016 at 10:54 AM, Henrik Bengtsson <henrik.bengtsson at gmail.com> wrote: > I'd vote for it to stay. It could of course surprise someone who'd > expect c(list(a=1), b=2, use.names =
2018 Jan 27
0
sum() returns NA on a long *logical* vector when nb of TRUE values exceeds 2^31
>>>>> Henrik Bengtsson <henrik.bengtsson at gmail.com> >>>>> on Thu, 25 Jan 2018 09:30:42 -0800 writes: > Just following up on this old thread since matrixStats 0.53.0 is now > out, which supports this use case: >> x <- rep(TRUE, times = 2^31) >> y <- sum(x) >> y > [1] NA > Warning message:
2014 Mar 05
1
[PATCH] Code coverage support proof of concept
Hello, I submit a patch for review that implements code coverage tracing in the R interpreter. It records the lines that are actually executed and their execution frequency, for code where srcref information is available. I perfectly understand that this patch will not make its way into R as it is, and that there are many concerns about stability, compatibility, maintenance and so on. I would like to have
2009 Jan 13
1
Summary of Total Object.Size in R Script
Dear all, Is there a way we can find the total object.size of all the objects in our R script? The reason we want to do this is that we want to know how much memory our R script requires overall. Rprofmem() doesn't seem to do it, and the Unix 'top' command is dynamic and doesn't give the exact byte size. - Gundala Viswanath Jakarta - Indonesia
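One common approximation of what is being asked for is to sum object.size() over everything in the global environment (shared objects, environments and external pointers are not fully accounted for, so treat the total as an estimate):

    sizes <- vapply(ls(envir = .GlobalEnv),
                    function(nm) as.numeric(object.size(get(nm, envir = .GlobalEnv))),
                    numeric(1))
    print(structure(sum(sizes), class = "object_size"), units = "auto")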
2018 Jan 25
2
sum() returns NA on a long *logical* vector when nb of TRUE values exceeds 2^31
Just following up on this old thread since matrixStats 0.53.0 is now out, which supports this use case: > x <- rep(TRUE, times = 2^31) > y <- sum(x) > y [1] NA Warning message: In sum(x) : integer overflow - use sum(as.numeric(.)) > y <- matrixStats::sum2(x, mode = "double") > y [1] 2147483648 > str(y) num 2.15e+09 No coercion is taking place, so the
2014 Jun 30
0
[PATCH 1/1] ia64: use ARRAY_SIZE instead of sizeof/sizeof[0]
Use macro definition Cc: Jeremy Fitzhardinge <jeremy at goop.org> Cc: Chris Wright <chrisw at sous-sol.org> Cc: virtualization at lists.linux-foundation.org Cc: linux-ia64 at vger.kernel.org Signed-off-by: Fabian Frederick <fabf at skynet.be> --- This is untested. arch/ia64/kernel/paravirt.c | 8 +++----- 1 file changed, 3 insertions(+), 5 deletions(-) diff --git
2018 Feb 01
0
sum() returns NA on a long *logical* vector when nb of TRUE values exceeds 2^31
>>>>> Hervé Pagès <hpages at fredhutch.org> >>>>> on Tue, 30 Jan 2018 13:30:18 -0800 writes: > Hi Martin, Henrik, > Thanks for the follow up. > @Martin: I vote for 2) without *any* hesitation :-) > (and uniformity could be restored at some point in the > future by having prod(), rowSums(), colSums(), and others >
2016 Sep 23
2
Undocumented 'use.names' argument to c()
I'd vote for it to stay. It could of course surprise someone who'd expect c(list(a=1), b=2, use.names = FALSE) to generate list(a=1, b=2, use.names=FALSE). On the upside is the performance gain from using use.names=FALSE. The benchmarks below show that combining the names attributes themselves takes ~20-25 times longer than combining the integers themselves. Also, at no
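The behaviour under discussion, shown directly (the argument was undocumented at the time of this thread, so check ?c in your own R version before relying on it):

    c(list(a = 1), b = 2)                     # named list: $a, $b
    c(list(a = 1), b = 2, use.names = FALSE)  # unnamed list of length 2, not
                                              # list(a = 1, b = 2, use.names = FALSE)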
1997 Jul 28
0
R-alpha: R 0.50.a1 S_alloc BUG, priority = URGENT
The current version of S_alloc in src/main/memory.c is char *S_alloc(long nelem, int eltsize) { unsigned int i, size; char *p = R_alloc(nelem, eltsize); for(i=0 ; i<size; i++) p[i] = 0; return p; } which segfaults because `size' is not initialized. I am not sure what the right fix is; adding size = nelem * eltsize; before the loop seems to work. As an aside ... I think the seed*
2023 Mar 30
1
write.csv performance improvements?
Dear R-devel, I did a systematic comparison of write.csv with similar functions, and observed two asymptotic inefficiencies that could be improved. 1. write.csv is quadratic time (N^2) in the number of columns N. Can write.csv be improved to use a linear time algorithm, so it can handle CSV files with larger numbers of columns? For more details including figures and session info, please see
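A rough way to see the scaling being described: hold the row count fixed and double the number of columns; if the cost is quadratic in the column count, elapsed time should roughly quadruple at each step (the sizes here are illustrative only, and absolute timings depend on the R version and hardware):

    for (nc in c(1000, 2000, 4000)) {
      df <- as.data.frame(matrix(0, nrow = 10, ncol = nc))
      cat(nc, "columns:",
          system.time(write.csv(df, tempfile()))[["elapsed"]], "s\n")
    }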
2016 Sep 25
1
Undocumented 'use.names' argument to c()
From comments in http://stackoverflow.com/questions/24815572/why-does-function-c-accept-an-undocumented-argument/24815653 : The code of c() and unlist() was formerly shared but has been (long time passing) separated. It was on July 30, 1998, that do_c got split into do_c and do_unlist. With the implementation of 'c.Date' in R devel r71350, an argument named 'use.names' is
2018 Jan 30
2
sum() returns NA on a long *logical* vector when nb of TRUE values exceeds 2^31
Hi Martin, Henrik, Thanks for the follow up. @Martin: I vote for 2) without *any* hesitation :-) (and uniformity could be restored at some point in the future by having prod(), rowSums(), colSums(), and others align with the behavior of length() and sum()) Cheers, H. On 01/27/2018 03:06 AM, Martin Maechler wrote: >>>>>> Henrik Bengtsson <henrik.bengtsson at gmail.com>
2016 Jan 05
0
R, AIX 64-bit builds - trying to understand root cause for message: "Error: Line starting 'Package: tools ...' is malformed!"
On 04-Jan-16 23:24, Michael Felt wrote: > The bulk is on my forums - the final post for today is: > > Results to date: > > A. It looks like I am going to need a newer compiler for C - xlc/xlC > V11 apparently does not understand this code: > > "/data/prj/cran/R-3.2.3/src/main/memory.c", line 2149.31: 1506-046 (S) > Syntax error. > > I will have to check