similar to: degraded performance with rank()

Displaying 20 results from an estimated 7000 matches similar to: "degraded performance with rank()"

2007 Jun 08
4
logical 'or' on list of vectors
Suppose I have a list of logicals, such as returned by lapply: Theoph$Dose[1] <- NA Theoph$Time[2] <- NA Theoph$conc[3] <- NA lapply(Theoph,is.na) Is there a direct way to execute logical "or" across all vectors? The following gives the desired result, but seems unnecessarily complex. as.logical(apply(do.call("rbind",lapply(Theoph,is.na)),2,"sum"))
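A more direct route (a sketch, not from the original post) is to fold "|" over the list with Reduce() from base R:

  Theoph$Dose[1] <- NA; Theoph$Time[2] <- NA; Theoph$conc[3] <- NA
  any_missing <- Reduce("|", lapply(Theoph, is.na))            # elementwise OR across columns
  identical(any_missing, unname(rowSums(is.na(Theoph)) > 0))   # TRUE: an equivalent one-liner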
2007 Jun 04
3
test for nested factors
Is there a conventional way to test for nested factors? I.e., if 'a' and 'b' are lists of same-length factors, does each level specified by 'a' correspond to exactly one level specified by 'b'? The function below seems to suffice, but I'd be happy to know of a more succinct solution, if it already exists. Thanks, Tim. --- "%nested.in%" <-
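The definition of %nested.in% is cut off above; a sketch of one way to express the test (my reading of the intended semantics, not the poster's code) collapses each side with interaction() and checks that every observed level on the left co-occurs with exactly one level on the right:

  is_nested <- function(a, b) {
    a <- interaction(a, drop = TRUE)   # 'a' and 'b' may be single factors or lists of same-length factors
    b <- interaction(b, drop = TRUE)
    all(rowSums(table(a, b) > 0) == 1L)
  }
  is_nested(Theoph$Subject, Theoph$Dose)           # TRUE: each subject received a single dose
  is_nested(warpbreaks$tension, warpbreaks$wool)   # FALSE: every tension level occurs with both wools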
2007 Sep 27
3
testing the contents of an environment
Suppose I want to delete everything in my workspace that is not a function. It seems that sapply(ls(),is.function) always returns FALSE, because ls() returns objects of mode character. How do I evaluate is.function(), not on a character string, but on the object that character string represents? Thanks, Tim
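The usual idiom (a sketch, not the thread's answer) is to look the names up with get() or mget() before testing, or to apply is.function() over the environment itself:

  sapply(ls(), function(nm) is.function(get(nm)))   # test the objects the names refer to
  eapply(globalenv(), is.function)                  # or test every object in the workspace directly
  rm(list = Filter(function(nm) !is.function(get(nm)), ls()))   # then drop the non-functions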
2006 Apr 06
2
prevent reassignment of function names
Hi. I'm trying to find a systematic way to prevent assignment to names of existing functions. I've tried redefining the assignment operator, with mixed results. The function definition for "<-" below works as hoped for the demonstrated assignments to a and c. However, for the assignment based on the test function, it appears that the formal argument is not
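The definition is cut off above; a sketch of the kind of guard being described (an illustration only, and a fragile one: it handles simple names, not compound targets like x[1] <- value) would be:

  "<-" <- function(lhs, value) {
    # refuse to bind a name that is already bound to a function on the search path
    if (exists(deparse(substitute(lhs)), mode = "function"))
      stop("'", deparse(substitute(lhs)), "' is an existing function; refusing to mask it")
    assign(deparse(substitute(lhs)), value, envir = parent.frame())
  }
  a <- 1      # fine
  c <- 2      # error: 'c' is an existing function
  rm("<-")    # remove the shadow to restore ordinary assignment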
2008 Sep 09
1
'xtfrm' performance (influences 'order' performance) in R devel
Hello everybody, it looks like the presence of some (I do not know which) S4 methods for a given S4 class degrades the performance of xtfrm (used in 'order' in new R-devel) by a factor of millions. This is for classes that ARE derived from numeric directly and thus should be quite trivial to convert to numeric. Consider the following example: setClass("TimeDateBase",
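A hedged sketch of the usual workaround (the class definition below only mimics the truncated example): give the class an explicit xtfrm method that unwraps the numeric data, so order() never falls back on element-by-element comparison.

  setClass("TimeDateBase", contains = "numeric")
  x <- new("TimeDateBase", runif(1e5))
  # xtfrm() is an S3 generic, so an S3 method keyed on the S4 class name suffices:
  xtfrm.TimeDateBase <- function(x) as(x, "numeric")
  system.time(order(x))   # fast once the method above is visible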
2007 Apr 05
1
Extent of time zone vulnerability for POSIX date and time classes
Hi. I frequently convert date and time data to and from character representations. I'm frustrated with chron, because 'seconds' are required to create a time object (my input data never has seconds). More importantly, I cannot make chron print the format 12/30/2006 (which my output data requires). I really like the format flexibility of strftime() and strptime(), but of course
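For the two concrete complaints in this excerpt, a small sketch (the timestamp is made up): POSIXct input does not require seconds, and format() can print whatever layout the output needs; pinning the tz avoids the time zone issues the subject line refers to.

  x <- as.POSIXct("12/30/2006 14:05", format = "%m/%d/%Y %H:%M", tz = "UTC")
  format(x, "%m/%d/%Y")          # "12/30/2006"
  format(x, "%m/%d/%Y %H:%M")    # round-trips without ever supplying seconds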
2024 Apr 27
1
max on numeric_version with long components
On Sat, 27 Apr 2024 13:56:58 -0500 Jonathan Keane <jkeane at gmail.com> wrote: > In devel: > > max(numeric_version(c("1.0.1.100000000", "1.0.3.100000000", > "1.0.2.100000000"))) > [1] '1.0.1.100000000' > > max(numeric_version(c("1.0.1.10000000", "1.0.3.10000000", > "1.0.2.10000000"))) > [1]
2013 May 01
1
foreign: write.xport
I see in the archives significant discussion about SAS, CDISC formats etc. for FDA, but no direct suggestion of adding a write.xport method to the foreign package. Are there significant barriers to doing so?
2007 Feb 27
1
fitting the gamma cumulative distribution function
Hi. I have a vector of quantiles and a vector of probabilities that, when plotted, look very like the gamma cumulative distribution function. I can guess some shape and scale parameters that give a similar result, but I'd rather let the parameters be estimated. Is there a direct way to do this in R? Thanks, Tim. week <- c(0,5,6,7,9,11,14,19,39) fraction <-
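The fraction vector is cut off in this excerpt; with placeholder values (purely illustrative), one direct way to estimate the parameters is nonlinear least squares against pgamma():

  week     <- c(0, 5, 6, 7, 9, 11, 14, 19, 39)
  fraction <- c(0, 0.14, 0.22, 0.31, 0.49, 0.64, 0.81, 0.94, 1.00)   # hypothetical values
  fit <- nls(fraction ~ pgamma(week, shape = shape, rate = rate),
             start = list(shape = 3, rate = 0.3))
  coef(fit)   # estimated shape and rate; scale is 1/rate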
2013 Feb 02
1
setGeneric() gives "must supply skeleton" when checking package
r-devel, In a development version of the CRAN package metrumrg, I write ... require(reshape) setGeneric('cast') setOldClass(c('keyed','data.frame')) setMethod('cast','keyed', function ...) The result is satisfactory when sourcing the code directly, but when checking the package (which has 'reshape' as a dependency in the DESCRIPTION file) I get
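Two workarounds commonly suggested for this error (sketches; not necessarily how this thread was resolved): make reshape::cast visible at install time, or hand setGeneric() an explicit skeleton.

  # Option 1: import the function the generic is built from, e.g. in NAMESPACE:
  #   importFrom(reshape, cast)
  # after which setGeneric('cast') can find it during installation.
  # Option 2: supply a skeleton, so nothing needs to be looked up
  # (the argument list here is an assumption about cast()'s interface):
  setGeneric("cast", function(data, formula, ...) standardGeneric("cast"))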
2009 Jul 09
1
merge performance degradation in 2.9.1
I have noticed a significant performance degradation using merge in 2.9.1 relative to 2.8.1. Here is what I observed: N <- 100000 X <- data.frame(group=rep(12:1, each=N), mon=rep(rev(month.abb), each=N)) X$mon <- as.character(X$mon) Y <- data.frame(mon=month.abb, letter=letters[1:12]) Y$mon <- as.character(Y$mon) Z <- cbind(Y, group=1:12) system.time(Out
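The benchmark is cut off at system.time(Out; a plausible continuation (a guess, not the original code) times the merges themselves so the two R versions can be compared:

  system.time(Out  <- merge(X, Y))   # join on the shared character column 'mon'
  system.time(Out2 <- merge(X, Z))   # join on both 'mon' and 'group'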
2008 Oct 22
1
R 2.8.0 qqnorm produces error with object of class zoo?
Dear list-reader, by running the following script: library(zoo) sessionInfo() search() packageDescription("zoo") data(EuStockMarkets) dax <- as.zoo(EuStockMarkets[1:10, "DAX"]) daxr <- diff(log(dax)) identical(as.vector(qnorm(daxr)), qnorm(coredata(daxr))) qqnorm(coredata(daxr)) qqnorm(daxr) qqnorm() produces an error: > qqnorm(daxr) Error in if (xi == xj) 0L
2017 Oct 15
2
Function 'factor' issues
In R devel, function 'factor' has been changed, allowing and merging duplicated 'labels'. Issue 1: Handling of specified 'labels' without duplicates is slower than before. Example: x <- rep(1:26, 40000) system.time(factor(x, levels=1:26, labels=letters)) Function 'factor' is already rather slow because of conversion to character. Please don't add slowdown.
2014 Sep 08
2
Problem with order() and I()
I have found that order() fails in a rather arcane circumstance, as in this example: > foo <- I( c('x','\265g') ) > order(foo) Error in if (xi > xj) 1L else -1L : missing value where TRUE/FALSE needed > foo <-c('x','\265g') > order(foo) [1] 1 2 > sessionInfo() R version 3.1.1 (2014-07-10) Platform: x86_64-apple-darwin13.1.0 (64-bit)
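As the second half of the example already shows, order() on the bare character vector is fine; one workaround (an assumption, not necessarily the thread's conclusion) is therefore to strip the "AsIs" class before ordering, so the fast character method is used instead of the generic fallback that compares elements pairwise and chokes on the non-UTF-8 byte:

  foo <- I(c("x", "\265g"))
  order(unclass(foo))   # drops the "AsIs" class; same ordering as the plain character vector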
2013 Oct 03
1
version comparison puzzle
Can anyone explain what I'm missing here? max(pp1 <- package_version(c("0.99999911.3","1.0.4","1.0.5"))) ## [1] '1.0.4' max(pp2 <- package_version(c("1.0.3","1.0.4","1.0.5"))) ## [1] '1.0.5' I've looked at ?package_version , to no avail. Since max() goes to .Primitive("max") I'm having trouble figuring out
2010 May 05
1
testInstalledBasic question
Hi, I'm currently in the process of writing an R-installation SOP for my company. As part of that process I'm using the recommendations from the 'R Installation and Administration' document, section 3.2, "Testing an installation". This is done on an XP machine, using the latest binary of 2.11.0. The binary is downloaded and then installed from the installer. I then
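For reference, the tests that manual section describes are run from the tools package, roughly as below (a sketch; it assumes the optional test files were installed and is run from a writable directory):

  library(tools)
  testInstalledBasic("basic")             # the basic regression tests
  testInstalledPackages(scope = "base")   # checks for the base packages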
2017 Oct 18
0
Function 'factor' issues
>>>>> Suharto Anggono via R-devel <r-devel at r-project.org> >>>>> on Sun, 15 Oct 2017 16:03:48 +0000 writes: > In R devel, function 'factor' has been changed, allowing and merging duplicated 'labels'. Indeed. That had been asked for and discussed a bit on this list from June 14 to June 23, starting at
2017 Oct 21
0
Function 'factor' issues
My idea (like in https://bugs.r-project.org/bugzilla/attachment.cgi?id=1540 ): - For remapping, use f <- match(xlevs, nlevs)[f] instead of f <- match(xlevs[f], nlevs) (I have mentioned it). - Remap only if length(nlevs) differs from length(xlevs) . On use of 'order' in function 'factor' in R devel, factor.Rd still says 'sort.list' in "Details" section. My
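A tiny illustration of the proposed remapping (the names are ad hoc: xlevs are the original level labels, nlevs the merged labels, f the integer codes): matching the short vector of levels and then indexing gives the same codes as matching every element, but its cost scales with the number of levels rather than with length(f).

  xlevs <- c("low", "mid", "mid", "high")                  # one label per original level
  nlevs <- unique(xlevs)                                   # merged labels
  f     <- sample(seq_along(xlevs), 20, replace = TRUE)    # integer codes
  identical(match(xlevs, nlevs)[f], match(xlevs[f], nlevs))   # TRUE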
2009 Nov 27
0
Long execution time for quantile() and difftime objects (PR#14092)
Did you read the help page?
2009 Nov 27
1
Long execution time for quantile() and difftime objects (PR#14091)
Full_Name: Hong Ooi Version: 2.10.0 OS: Windows XP Submission from: (NULL) (203.110.235.1) While trying to get summary statistics on a duration variable (the difference between a start and end date), I ran into the following issue. Using summary or quantile (which summary calls) on a difftime object takes an extremely long time if the object is even moderately large. A reproducible example:
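The reproducible example is cut off here; a stand-in of the same shape (made up, and the slowdown only applies to the R versions discussed in this report) would be:

  n <- 1e5
  d <- as.difftime(runif(n, 0, 1e6), units = "secs")
  system.time(quantile(as.numeric(d)))   # quick
  system.time(quantile(d))               # much slower on the affected versions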