similar to: alternative to rbind for data.table

Displaying 20 results from an estimated 3000 matches similar to: "alternative to rbind for data.table"

2007 Mar 31
1
Problem with argument "append" in "Rprof"
Hello, appending information to the profiler's output seems to generate problems. Here is a small example:
    require(boot)
    Rprof(memory.profiling = TRUE)
    Rprof(NULL)
    for (i in 1:2) {
      Rprof(memory.profiling = TRUE, append = TRUE)
      example(boot)
      Rprof(NULL)
    }
The problem is that the file Rprof.out contains the header information more than once: $ grep
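One workaround sketch (not from the thread, and only useful if the per-iteration append is not essential): keep a single profiling session open around the loop, so the header is written only once.
    require(boot)
    Rprof("boot-profile.out", memory.profiling = TRUE)   # one session, one header
    for (i in 1:2) {
      example(boot)
    }
    Rprof(NULL)
    summaryRprof("boot-profile.out")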
2011 Feb 11
1
Help optimizing EMD::extrema()
Hi folks, I'm attempting to use the EMD package to analyze some neuroimaging data (timeseries with 64 channels sampled across 1 million time points within each of 20 people). I found that processing a single channel of data using EMD::emd() took about 8 hours. Exploration using Rprof() suggested that most of the compute time was spent in EMD::extrema(). Looking at the code for EMD:extrema(),
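For anyone hitting the same bottleneck, a rough, vectorized way to locate local extrema is sketched below. It is only a starting point and does not reproduce EMD::extrema()'s exact output (endpoints, ties and zero crossings are handled differently there).
    # Vectorized local minima/maxima via the sign of successive differences.
    find_extrema <- function(x) {
      d <- diff(sign(diff(x)))          # +2 at strict local minima, -2 at strict local maxima
      list(minima = which(d ==  2) + 1,
           maxima = which(d == -2) + 1)
    }

    set.seed(1)
    y <- cumsum(rnorm(1e6))             # a long series, roughly the scale described
    str(find_extrema(y))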
2012 Sep 17
3
eval(parse(...)) only once in a function
Hi, I would like to have something like
    str <- "df$JT == 12"
    fun <- function(df) {
      b <- eval(parse(text = str))
      return(b)
    }
but for performance the parse() call should not be repeated at each function call; it should behave like
    fun <- function(df) {
      b <- df$JT == 12
      return(b)
    }
Do you have an idea how I can implement this? Thx Christof
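One way to do this, sketched below, is to parse the string a single time when the function is built and only eval() the stored expression at call time (make_filter and the toy data frame are hypothetical names for illustration).
    str <- "df$JT == 12"
    make_filter <- function(cond_string) {
      expr <- parse(text = cond_string)[[1]]   # parsed exactly once, here
      function(df) eval(expr)                  # df is found in this call's frame
    }

    fun <- make_filter(str)
    df <- data.frame(JT = c(12, 5, 12))
    fun(df)   # TRUE FALSE TRUE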
2010 Jan 05
1
Naming functions for the purpose of profiling
Hi all, I have some long-running code that I'm trying to profile. I am seeing a lot of time spent inside the <Anonymous> function. Of course, this can in fact be any of several functions, but I am unable to see how I could use the information from Rprof.out to discern which function is taking the most time. An example line from my Rprof.out is: rbernoulli <Anonymous>
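Rprof labels a stack frame with the symbol used in the call, so a closure invoked through an expression such as handlers[[k]](x) is recorded as <Anonymous>. A sketch (with made-up functions) of the usual workaround: bind the closure to a name and call through that name.
    handlers <- list(slow = function(x) sum(sqrt(runif(2e5))),
                     fast = function(x) x + 1)

    Rprof("anon.out")
    for (i in 1:200) handlers[["slow"]](i)      # recorded as <Anonymous>
    Rprof(NULL)

    Rprof("named.out")
    slow_handler <- handlers[["slow"]]          # give it a name first
    for (i in 1:200) slow_handler(i)            # recorded as "slow_handler"
    Rprof(NULL)

    summaryRprof("anon.out")$by.total
    summaryRprof("named.out")$by.total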
2010 Nov 19
1
memory profiling
I'm trying to configure version 2.12.0 of R to do memory profiling. I've reconfigured the build: % ./configure --enable-memory-profiling=YES and verified that it's configured correctly by examining the output. I then rebuild R: % make Then I fire up R and run a script, using Rprof with the memory-profiling switch set to TRUE: Rprof("output", memory.profiling=TRUE); # a
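Assuming a build made with memory-profiling support as described in the post, the R-side workflow might look like the sketch below; summaryRprof(memory = "both") then folds the memory figures into the timing tables.
    Rprof("mem.out", memory.profiling = TRUE)
    x <- lapply(1:500, function(i) rnorm(2e4))   # some allocation-heavy work
    Rprof(NULL)
    summaryRprof("mem.out", memory = "both")     # timing tables plus a memory column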
2007 Aug 23
2
read big text file into R
Dear Rs: Hi, I am trying to read a big text file (nrows=243440, ncols=144). It seems the computational time of all the read methods (scan, read.table, read.delim) is not linear in the number of rows I want to read in: things became really slow once I tried to read in 100000 lines compared to 10000 lines. If I am reading the profiling result right, I guess scan wouldn't help either. My
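For what it is worth, a hedged sketch of the usual read.table() speedups (the file name and column types below are assumptions): declaring colClasses and nrows up front avoids the type guessing and repeated re-allocation that can make large reads feel worse than linear.
    dat <- read.table("big.txt",
                      header = FALSE,
                      colClasses = rep("numeric", 144),  # assumed: all numeric columns
                      nrows = 243440,                    # row count from the post
                      comment.char = "",
                      quote = "")
    dim(dat)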
2009 Jun 12
1
Rprof loses all system() time
Rprof seems to ignore all time spent inside system() calls. E.g., this simple example actually takes about 10 seconds, but Rprof thinks the total time is only 0.12 seconds:
    > Rprof("sleep-system.out") ; system.time(system(command="sleep 10")) ; Rprof(NULL)
       user  system elapsed
      0.000   0.004  10.015
    > summaryRprof("sleep-system.out")$by.total
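The profiler only samples while R itself is doing the work, so time spent blocked in an external command is invisible to it. A workaround sketch (hypothetical helper name): time the external calls separately and keep that alongside the profile.
    timed_system <- function(cmd) {
      t <- system.time(status <- system(cmd))
      message(sprintf("'%s' took %.2f s elapsed", cmd, t["elapsed"]))
      invisible(status)
    }

    Rprof("sleep.out")
    timed_system("sleep 2")
    Rprof(NULL)
    summaryRprof("sleep.out")$by.total   # still near-empty, as in the post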
2009 Mar 03
1
profiler and loops
Hello, (This is a follow-up from this thread: http://www.nabble.com/execution-time-of-.packages-td22304833.html but with a different focus.) I am often confused by the result of the profiler when a loop is involved. Consider these two scripts:
script1:
    Rprof()
    x <- numeric()
    for (i in 1:10000) {
      x <- c(x, rnorm(10))
    }
    Rprof(NULL)
    print(summaryRprof())
script2:
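Part of the confusion is that time spent in a bare top-level loop has no named function to be attributed to. A sketch of one way to make the output easier to read: wrap the loop in a named function so summaryRprof() can attribute the samples to it.
    grow <- function(n) {
      x <- numeric(0)
      for (i in seq_len(n)) x <- c(x, rnorm(10))   # the slow, growing version from script1
      x
    }

    Rprof("loop.out")
    invisible(grow(10000))
    Rprof(NULL)
    summaryRprof("loop.out")$by.self   # time now shows up under "grow", "c", "rnorm"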
2013 Apr 05
2
line profiling
Hello, this is about the new "line profiling" feature in R 3.0.0. As I was testing it, I found the results somewhat disappointing, so I'd like to get your opinion. I put some poorly written code in a test.R file; here are the contents:
    double <- function(x) {
      out <- c()
      for (i in x) {
        out <- c(out, 2*i)   # line 4
      }
      return(out)
    }
Then this is how I source the file
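For reference, a sketch of the line-profiling workflow in R >= 3.0.0, using the test.R file from the post: source references must be kept, Rprof() needs line.profiling = TRUE, and summaryRprof(lines = "show") then reports per-line figures.
    source("test.R", keep.source = TRUE)      # srcrefs are required for line attribution

    Rprof("line.out", line.profiling = TRUE)
    invisible(double(1:20000))
    Rprof(NULL)
    summaryRprof("line.out", lines = "show")  # per-line table, e.g. an entry for test.R#4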
2012 Oct 26
2
connect points in charts
Hi, is there an automatic way to avoid connecting points that are far apart? I have something like
    plot(x, y, type = "o", ...)
    atx <- seq(as.Date("2009-04-01"), as.Date("2011-04-01"), "month")
    axis.Date(1, at = atx, labels = format(atx, "%b\n%Y"), padj = 0.5)
but I do not want lines between points whose distance is greater than two weeks. thx
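There seems to be no automatic option, but a common trick, sketched below for the Date vector x and values y from the post (assumed sorted by date), is to insert an NA wherever the gap exceeds two weeks; plot() and lines() break the line at NA values.
    break_gaps <- function(x, y, max_gap = 14) {
      idx <- which(diff(as.numeric(x)) > max_gap)
      for (k in rev(idx)) {                  # insert from the end so earlier indices stay valid
        x <- append(x, NA, after = k)
        y <- append(y, NA, after = k)
      }
      list(x = x, y = y)
    }

    b <- break_gaps(x, y)
    plot(b$x, b$y, type = "o", xaxt = "n")
    atx <- seq(as.Date("2009-04-01"), as.Date("2011-04-01"), "month")
    axis.Date(1, at = atx, labels = format(atx, "%b\n%Y"), padj = 0.5)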
2012 Jul 02
2
save conditions in a list
Hi, how would you save conditions like a = "day > 100"; b = "val < 50"; c = "year == 2012" in a list? I'd like to have variables like "day", "val", "year" and a list of conditions list(a, b, c). Then I want to check whether a & b & c is true, or a | b | c is true, or similar things. Greetings Christof
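One possibility, sketched below with made-up data, is to keep the conditions as unevaluated expressions in a list and evaluate them against the variables later; all() and any() over the results then give the a & b & c and a | b | c checks.
    conds <- list(a = quote(day > 100),
                  b = quote(val < 50),
                  c = quote(year == 2012))

    env <- list(day = 120, val = 30, year = 2012)      # the variables to test against
    results <- vapply(conds, eval, logical(1), envir = env)

    all(results)   # a & b & c
    any(results)   # a | b | c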
2012 Aug 14
3
self-starter functions for y = a + b * c^x
Hi, there are some predefined self-start functions, like SSmicmen, SSbiexp, SSasymp, SSasympOff, SSasympOrig, SSgompertz, SSfpl, SSlogis, SSweibull, Quadratic, Cubic, SSexp (nlrwr). By the way, do you know of graphical examples of these functions? The SSexpDecay (exponential decay) for y = (y0 - plateau)*exp(-k*x) + plateau from
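A quick way to see the shapes is to evaluate the selfStart models on a grid with arbitrary parameters and plot them, as in the sketch below. Note that SSasymp already covers the y = a + b * c^x family under a different parameterization (a = Asym, b = R0 - Asym, c = exp(-exp(lrc))).
    x <- seq(0, 10, length.out = 200)

    op <- par(mfrow = c(1, 2))
    plot(x, SSlogis(x, Asym = 5, xmid = 5, scal = 1), type = "l",
         main = "SSlogis", ylab = "y")
    plot(x, SSasymp(x, Asym = 5, R0 = 1, lrc = -0.5), type = "l",
         main = "SSasymp", ylab = "y")
    par(op)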
2004 Jul 16
3
interpreting profiling output
I have some trouble interpreting the output from profiling. I have read the help pages for Rprof and summaryRprof and consulted the Writing R Extensions manual, but I still have problems understanding the output. Basically the output consists of self.time and total.time. My understanding is that total.time is the time spent in a given function including any subcalls or child functions or whatever the
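A tiny sketch of the distinction (made-up functions): "caller" below does almost no work itself, so it gets a large total.time, because it is on the stack while "worker" runs, but only a small self.time.
    worker <- function() sum(sqrt(runif(2e5)))
    caller <- function() { r <- 0; for (i in 1:200) r <- r + worker(); r }

    Rprof("interp.out")
    invisible(caller())
    Rprof(NULL)
    summaryRprof("interp.out")$by.total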
2008 Aug 26
1
Dramatic slowdown of R 2.7.2?
Dear R users/developers, a simple comparison of code execution time between R 2.7.1 and R 2.7.2 shows a dramatic slowdown in the newer version. Rprof() identifies the .Call function as the main cause (see the code below). What happened with R 2.7.2? Kind regards Marek Wielgosz Bayes Consulting ######### Probably useful info ############### ### CPU: Core2Duo T7300, 2 GB RAM ### WIN XP ### both standard
2009 Oct 19
2
how to get rid of 2 for-loops and optimize runtime
In short: get rid of the loops I use and optimize runtime. Dear all, I want to calculate, for each row, the amount from one month earlier. I use a matrix with 2100 rows and 22 columns (which is still a very small matrix; nrows of other matrices can easily be more than 100000). Table before:
    Year month quarter yearmonth Service ... Amount
    2009     9      Q3    092009       A ...
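A hedged sketch of one vectorized approach (column names are guessed from the preview): build a year-month key, shift it back one month, and look the previous month's Amount up with match() instead of looping over rows.
    prev_amount <- function(d) {
      cur  <- d$Year * 12 + d$month              # months on a single running scale
      prev <- cur - 1
      key_cur  <- paste(d$Service, cur)
      key_prev <- paste(d$Service, prev)
      d$Amount[match(key_prev, key_cur)]         # NA where no previous month exists
    }

    d <- data.frame(Year    = c(2009, 2009, 2009, 2009),
                    month   = c(8, 9, 8, 9),
                    Service = c("A", "A", "B", "B"),
                    Amount  = c(10, 20, 5, 7))
    d$Amount_prev <- prev_amount(d)
    d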
2012 Oct 02
3
lattice xyplot, get current level
Hi, xyplot(y ~ x | subject) plots a separate graph of y against x for each level of subject. But I would like to use my own function for each level. Something like
    xyplot(y ~ x | subject,
           panel = function(x, y) {
             panel.xyplot(x, y)
             panel.curve(x, y)   # something that depends on the current subject ...
           })
How do I get the current
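Inside a lattice panel function, packet.number() and which.packet() report which conditioning level is currently being drawn. A sketch with made-up data:
    library(lattice)

    d <- data.frame(x = rep(1:10, 3),
                    y = rnorm(30),
                    subject = factor(rep(c("s1", "s2", "s3"), each = 10)))

    xyplot(y ~ x | subject, data = d,
           panel = function(x, y, ...) {
             panel.xyplot(x, y, ...)
             lev <- levels(d$subject)[which.packet()]   # the current subject
             panel.abline(h = mean(y), lty = 2)
             grid::grid.text(lev, x = 0.05, y = 0.95, just = c("left", "top"))
           })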
2012 Jan 20
3
break an axis.POSIXct
Hi, I'd like to use "axis.POSIXct" to plot days from 2006 to 2008. But I only have data for the summer months. Is it possible to get two axis breaks, so there are not such long stretches without points? thx Christof
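Base graphics has no built-in broken axis, so one workaround, sketched below with made-up data, is to draw each summer in its own panel with a shared y range instead of one long axis with empty stretches.
    set.seed(1)
    times <- seq(as.POSIXct("2006-06-01"), as.POSIXct("2008-08-31"), by = "day")
    keep  <- format(times, "%m") %in% c("06", "07", "08")   # keep summer months only
    times <- times[keep]
    vals  <- rnorm(length(times))

    years <- format(times, "%Y")
    op <- par(mfrow = c(1, length(unique(years))), mar = c(4, 4, 2, 1))
    for (yr in unique(years)) {
      i <- years == yr
      plot(times[i], vals[i], type = "o", ylim = range(vals),
           xlab = yr, ylab = "value", xaxt = "n")
      axis.POSIXct(1, x = times[i], format = "%b")
    }
    par(op)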
2017 May 18
1
Interpreting R memory profiling statistics from Rprof() and gc()
Sorry, this might be a really basic question, but I'm trying to interpret the results from memory profiling, and I have a few questions (marked by *Q#*). From the summaryRprof() documentation, it seems that the four columns of statistics that are reported when setting memory.profiling=TRUE are
 - vector memory in small blocks on the R heap
 - vector memory in large blocks (from malloc)
 - memory
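If I read the summaryRprof() documentation right, memory = "tseries" returns those sampled columns themselves (small-block vector memory, large-block vector memory, cons cells/nodes, duplications), whereas gc() reports the live heap as Ncells and Vcells at the moment it is called. A small sketch for comparing the two:
    Rprof("memprof.out", memory.profiling = TRUE)
    m <- replicate(50, { a <- rnorm(1e5); mean(a) })   # allocation-heavy toy work
    Rprof(NULL)

    head(summaryRprof("memprof.out", memory = "tseries"))
    gc()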
2012 Dec 11
1
Debian packaging and openblas related crash when profiling in R
Hello R-sig-debian and (hopefully) Dirk: On Debian wheezy, I have the R packaging that CRAN (you) provide. I ran into a little trouble while trying to fiddle with alternative BLAS. I know you and I went around on this last year and I think perhaps I've found something wrong in the framework, or I've just done something wrong. I installed the packages openblas-base and openblas-dev, and
2012 Jan 09
2
RODBC vs gdata
Hi, one column in my Excel file contains mostly numbers, but on line 3000 and some other lines there are strings like "FG 1". "RODBC" seems to omit these lines. "gdata" works, but is much slower. Is this a bug in RODBC, or am I using it wrong? Example with the same "file.xlsx":
    library(RODBC)
    excel <- odbcConnectExcel2007("file.xlsx")
    tab <-
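One hedged thing to try, sketched below (the sheet name and column position are assumptions): ask RODBC to return the sheet without type conversion via as.is = TRUE and convert afterwards. If the ODBC driver has already guessed a numeric type from the first rows, reading the column as character with gdata::read.xls(..., colClasses = "character") is the slower but safer fallback.
    library(RODBC)

    excel <- odbcConnectExcel2007("file.xlsx")
    tab   <- sqlFetch(excel, "Sheet1", as.is = TRUE)   # "Sheet1" is an assumed sheet name
    odbcClose(excel)

    num <- suppressWarnings(as.numeric(tab[[1]]))      # NAs mark the "FG 1"-style rows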