similar to: the computational complexity of sample()

Displaying 20 results from an estimated 4000 matches similar to: "the computational complexity of sample()"

2015 Sep 09
0
sample.int() algorithms
I was experiencing a strange pattern of slowdowns when using sample.int(), where sampling from one population would sometimes take 1000x longer than taking the same number of samples from a slightly larger population. For my application, this resulted in a runtime of several hours rather than a few seconds. Looking into it, I saw that sample.int() is hardcoded to switch algorithms when the
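A minimal sketch of how such a pattern can be timed, assuming the switch happens somewhere near a population size of 1e7 (that threshold, and the draw size, are illustrative assumptions, not taken from the original report):
    set.seed(1)
    # time the same small draw from two population sizes that straddle a
    # hypothetical internal algorithm switch in sample.int()
    system.time(sample.int(1e7,     1e3))
    system.time(sample.int(1e7 + 1, 1e3))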
2002 Aug 30
5
density() returns a density function that does not add up to 1
Dear R users, I ran into this curious problem: > d <- rnorm(100) > d.density <- density(d) > sum( d.density$x * d.density$y) [1] 2.517502 Admittedly the method of computing the mass under the density curve at line 3 is crude. But 2.5 is pretty far from 1, the value it should be. I tried a few other datasets and got similar results. Am I missing something obvious? Or is the return
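A sketch of a less crude mass estimate for the same density object, weighting by the grid spacing rather than by the x values themselves; the trapezoidal line is just one reasonable alternative:
    d <- rnorm(100)
    dd <- density(d)
    # Riemann sum using the (equal) spacing of the evaluation grid
    sum(dd$y) * diff(dd$x[1:2])
    # trapezoidal rule over the same grid; both give approximately 1
    sum(diff(dd$x) * (head(dd$y, -1) + tail(dd$y, -1)) / 2)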
2017 Aug 18
1
Issues of R_pretty in src/appl/pretty.c
Examples similar to pretty(c(-1,1)*1e300, n = 1e9, min.n = 1) with smaller 'n': pretty(c(-1,1)*1e304, n = 1e5, min.n = 1) pretty(c(-1,1)*1e306, n = 1e3, min.n = 1) A report on 'pretty' when working with integers, similar to what led to the change of the 'seq' fuzz, is https://bugs.r-project.org/bugzilla3/show_bug.cgi?id=15137 -------------------------------------------- On Tue,
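A sketch of one way to check these smaller-'n' variants, counting intervals as the number of returned breakpoints minus one (results will depend on platform and R version):
    length(pretty(c(-1, 1) * 1e304, n = 1e5, min.n = 1)) - 1
    length(pretty(c(-1, 1) * 1e306, n = 1e3, min.n = 1)) - 1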
2017 Oct 03
0
Revert to R 3.2.x code of logicalSubscript in subscript.c?
Suharto, If you're interested in performance with subscripting, you might want to look at pqR (pqR-project.org). It has some substantial performance improvements for subscripting over R Core versions. This is especially true for the current development version of pqR (probably leading to a new release in about a month). You can look at a somewhat-stable snapshot of recent pqR development
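A rough sketch of the kind of logical-subscripting benchmark such comparisons are usually based on (the vector size and repetition count are assumptions):
    x <- runif(1e7)
    keep <- x > 0.5
    # repeated logical subscripting, the operation handled by logicalSubscript
    system.time(for (i in 1:20) y <- x[keep])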
2009 Mar 27
1
Out of memory crash on 64-bit Fedora
Greetings all, First of all, thanks to all of you for creating such a useful, powerful program. I regularly work with very large datasets (several GB) in R on 64-bit Fedora 8 (details below). I'm lucky to have 16GB RAM available. However, if I am not careful and load too much into R's memory, I can crash the whole system. There does not seem to be a check in place that will stop R from
2002 Jul 09
1
RE: mvtnorm package installation failure
Hi, Thank you for the tip. I tried to re-install R from Debian "stable", in which R's version is 1.4.0, and the installation of "mvtnorm" worked. I then re-installed R yet again from Debian "unstable" (woody), in which R's version is 1.5.1. The installation of "mvtnorm" failed again with the same error message. Another package that failed with the
2001 Nov 06
1
R-devel & ATLAS generates Dr. Watson on NT (was RE: Look, Wa tson! La.svd & ATLAS)
Prof. Bates & R-devel, I've done more tests with the following results: I have two versions of ATLAS 3.2.1. One was compiled on my old Thinkpad 600E (PII), the other was compiled on my new Thinkpad T22 (PIIISSE1). I compiled R-devel dated 10/31, 11/01 and 11/04, linked against either of the two ATLAS libs. All gave Dr. Watson when given this code: La.svd(matrix(runif(1e5), 1e3,
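A self-contained variant of the quoted call; the second matrix dimension is an assumption (100, chosen so that 1e3 * 100 matches the 1e5 values produced by runif()):
    set.seed(1)
    # LAPACK singular value decomposition of a 1000 x 100 random matrix
    s <- La.svd(matrix(runif(1e5), 1e3, 100))
    str(s$d)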
2012 May 06
2
unlist crashes 32-bit R on WinXP when use.names=TRUE
Hi all, I experienced a crash in R-2.15.0 on 32-bit Windows XP (sessionInfo below) when running the piece of code below. I cannot replicate the error on 64-bit Linux, 64-bit Windows, or 32-bit R running under 64-bit Windows. I do not have, and could not find, a 32-bit version of Linux to test this. > NOW <- Sys.time() > FUTURE <- NOW+1:1e7 > crash <- as.character(FUTURE)
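Unrelated to the crash itself, a minimal illustration of what the use.names = TRUE option in question does on a named list:
    unlist(list(a = 1:2, b = 3), use.names = TRUE)
    #  a1 a2  b
    #   1  2  3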
2005 Feb 11
1
Re: [R-SIG-Mac] Bug running pbinom() in R-GUI?
On Feb 10, 2005, at 7:38 PM, George W. Gilchrist wrote: > Today I was running a graduate level stats lab using R and we encountered a major problem while using the current build of the Cocoa GUI: > From the GUI: > system.time(pbinom(80, 1e5, 806/1e6)) > [1] 14.37 4.94 30.29 0.00 0.00 > From the command line on the same machine: >
2020 Sep 08
0
Operations with long altrep vectors cause segfaults on Windows
>>>>> Hugh Parsonage on Tue, 8 Sep 2020 18:08:11 +1000 writes: > I can only reproduce on Windows, but reliably (both 4.0.0 and 4.0.2): > $> R --vanilla > x <- c(0L, -2e9:2e9) > # > Segmentation fault > Tried to reproduce on Linux but the above worked as expected. Not an issue merely with the length of
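Rough arithmetic behind the report, as a sketch: the compact (ALTREP) sequence already exceeds the old 2^31 - 1 length limit, and c() forces it to be materialized as an ordinary integer vector:
    n <- 2e9 - (-2e9) + 1     # 4,000,000,001 elements in -2e9:2e9
    n > .Machine$integer.max  # TRUE: a long vector is required
    n * 4 / 2^30              # roughly 14.9 GiB once materialized as 32-bit integers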
2012 Feb 02
1
pgfSweave doesn't lazyload my objects
Hi all, I'm struggling a bit to get pgfSweave to lazyload objects when compiling a .Rnw file for a second time. Caching works fine except that for every run all objects get cached again and again. I've used cacheSweave which works fine; all cached objects from code-chunks with option cache = TRUE are lazy loaded. I've tried it on two machines ... I'm pretty sure I'm
2019 Mar 01
1
issue with sample in R 3.6.0.
Hello, I think there is an issue in the sampling rejection algorithm in R 3.6.0. The do_sample2 function in src/main/unique.c still has 4.5e15 as an upper limit, implying that numbers greater than INT_MAX are still to be supported by sample in base R. Please review the examples below: set.seed(123) max(sample(2^31, 1e5)) [1] 2147430096 set.seed(123) max(sample(2^31 + 1, 1e5)) [1] 1
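A small hedged follow-up in the same spirit; with a corrected sampler, draws from a population just above INT_MAX should span roughly the whole range rather than collapsing near 1 as reported above:
    set.seed(123)
    range(sample(2^31 + 1, 1e5))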
2017 Aug 11
2
Issues of R_pretty in src/appl/pretty.c
See https://stat.ethz.ch/pipermail/r-devel/2017-August/074746.html for the origin of the example here. The fact that pretty(c(-1,1)*1e300, n = 1e9, min.n = 1) gave 20 intervals, far from 1e9, while pretty(c(-1,1)*1e300, n = 1e6, min.n = 1) gave 1000000 intervals (on one machine), made me trace through the code to function 'R_pretty' in https://svn.r-project.org/R/trunk/src/appl/pretty.c . *lo is
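A sketch of how the two interval counts quoted above can be reproduced and compared (output will vary by platform and R version):
    length(pretty(c(-1, 1) * 1e300, n = 1e6, min.n = 1)) - 1   # reported: 1000000
    length(pretty(c(-1, 1) * 1e300, n = 1e9, min.n = 1)) - 1   # reported: 20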
2020 Sep 08
0
Operations with long altrep vectors cause segfaults on Windows
Thanks Martin. On further testing, it seems that the segmentation fault can only occur when the amount of obtainable memory is sufficiently high. On my machine (admittedly with other processes running): $ R --vanilla --max-mem-size=30G -e "x <- c(0L, -2e9:2e9)" Segmentation fault $ R --vanilla --max-mem-size=29G -e "x <- c(0L, -2e9:2e9)" Error: cannot allocate vector
2020 Sep 08
0
[External] Re: Operations with long altrep vectors cause segfaults on Windows
On Tue, 8 Sep 2020, Martin Maechler wrote: >>>>>> Martin Maechler on Tue, 8 Sep 2020 10:40:24 +0200 writes: >>>>>> Hugh Parsonage on Tue, 8 Sep 2020 18:08:11 +1000 writes: >> I can only reproduce on Windows, but reliably (both 4.0.0 and 4.0.2): >> $> R
2010 Mar 25
2
print(big+small*1i) -> big + 0i
Should both parts of a complex number be printed to the same precision? An imaginary part printed as 0 looks a bit odd when log10(real/imag) is roughly >= getOption("digits"), but I'm not sure it is awful. Some people might expect the same number of significant digits in the two parts. > 1e7+4i [1] 10000000+0i > 1e7+5i [1] 10000000+0i > 1e10 + 1000i [1] 1e+10+0e+00i >
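A hedged sketch of one way to see both parts at the precision in question, formatting each part separately instead of relying on the complex print() method:
    z <- 1e10 + 1000i
    # format the real and imaginary parts independently
    sprintf("%.15g%+.15gi", Re(z), Im(z))
    format(z, digits = 15)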
2017 Aug 14
0
Issues of R_pretty in src/appl/pretty.c
>>>>> Suharto Anggono Suharto Anggono via R-devel <r-devel at r-project.org> on Fri, 11 Aug 2017 17:11:06 +0000 writes: > See
2020 Sep 08
0
[External] Re: Operations with long altrep vectors cause segfaults on Windows
Unfortunately I only get [Thread 21752.0x4aa8 exited with code 3221225477] [Thread 21752.0x4514 exited with code 3221225477] [Thread 21752.0x3f10 exited with code 3221225477] [Inferior 1 (process 21752) exited with code 030000000005] (I'm guessing I would need to build an instrumented version of R, or can R be debugged using gdb with an off-the-shelf installation?) On Wed, 9 Sep 2020 at
2020 Sep 08
1
[External] Re: Operations with long altrep vectors cause segfaults on Windows
>>>>> luke-tierney on Tue, 8 Sep 2020 09:42:43 -0500 (CDT) writes: > On Tue, 8 Sep 2020, Martin Maechler wrote: >>>>>>> Martin Maechler on Tue, 8 Sep 2020 10:40:24 +0200 writes: >>>>>>> Hugh Parsonage on Tue, 8 Sep 2020
2008 Apr 10
1
ISOdate/ISOdatetime performance suggestions, other date/time questions
Dear list: working with date/times, I have come across the problem that ISOdate and ISOdatetime are too slow on large vectors of data. I was surprised only until I looked at the implementation and the man page: "ISOdatetime and ISOdate are convenience wrappers for strptime". In other words, they convert the data to a character representation first in order to create a POSIXlt object that is then
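A hedged timing sketch of the claim, assuming a vector of about a million dates (the size and the random components below are illustrative assumptions, not from the original post):
    n <- 1e6
    y <- rep(2008L, n)
    m <- sample(1:12, n, replace = TRUE)
    d <- sample(1:28, n, replace = TRUE)
    # ISOdate() pastes the components into strings and parses them with
    # strptime(), which is where the time goes on vectors this large
    system.time(ISOdate(y, m, d))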