2020 Jan 13 · 3 replies · choose(n, k) as n approaches k
This struck me as incorrect:
> choose(3.999999, 4)
[1] 0.9999979
> choose(3.9999999, 4)
[1] 0
> choose(4, 4)
[1] 1
> choose(4.0000001, 4)
[1] 4
> choose(4.000001, 4)
[1] 1.000002
Should base::choose(n, k) check whether n is within machine precision of k and return 1?
Thanks,
Erik
***
sessionInfo()
R version 3.6.0 beta (2019-04-15 r76395)
Platform: x86_64-apple-darwin15.6.0
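
A minimal R-level sketch of the check Erik is asking about (not a proposed patch to the C code); the relative 1e-7 tolerance is my assumption, chosen to mirror the near-integer test discussed later in the thread:

``` r
## Sketch only: wrap base::choose() and return 1 when n and k agree to within
## a relative tolerance.  The 1e-7 tolerance is an assumption, not R's API.
choose_checked <- function(n, k, tol = 1e-7) {
  if (abs(n - k) <= tol * max(1, abs(n))) return(1)
  choose(n, k)
}

choose_checked(3.9999999, 4)  # 1, instead of the 0 shown above
choose_checked(3.999999,  4)  # ~0.9999979, unchanged (difference is 1e-6 > tol)
```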
2020 Jan 14 · 2 replies · [R] choose(n, k) as n approaches k
> On 14 Jan 2020, at 16:21 , Duncan Murdoch <murdoch.duncan at gmail.com> wrote:
>
> On 14/01/2020 10:07 a.m., peter dalgaard wrote:
>> Yep, that looks wrong (probably want to continue discussion over on R-devel)
>> I think the culprit is here (in src/nmath/choose.c)
>> if (k < k_small_max) {
>> int j;
>> if(n-k < k
2020 Jan 14 · 1 reply · [R] choose(n, k) as n approaches k
Yep, that looks wrong (probably want to continue discussion over on R-devel)
I think the culprit is here (in src/nmath/choose.c)
if (k < k_small_max) {
    int j;
    if(n-k < k && n >= 0 && R_IS_INT(n)) k = n-k; /* <- Symmetry */
    if (k < 0) return 0.;
    if (k == 0) return 1.;
    /* else: k >= 1 */
if n is a near-integer, then k
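
For readers who do not want to dig into the C (the excerpt above is cut off by the search view), here is a rough R transcription of the branch Peter quotes, just to illustrate the failure mode. The is_int() helper and its 1e-7 relative tolerance are my reading of R's R_IS_INT macro, not an exact copy of src/nmath/choose.c:

``` r
## Rough R transcription of the quoted branch, to show the failure mode.
## is_int() approximates R_IS_INT's near-integer test (assumed ~1e-7 tolerance).
is_int <- function(x, tol = 1e-7) abs(x - round(x)) <= tol * max(1, abs(x))

choose_sketch <- function(n, k) {
  k <- round(k)
  if (n - k < k && n >= 0 && is_int(n)) k <- n - k   # the "symmetry" step
  if (k < 0) return(0)   # n just below an integer: k is ~ -1e-7, so we return 0
  if (k == 0) return(1)
  r <- n
  j <- 2
  while (j <= k) { r <- r * (n - j + 1) / j; j <- j + 1 }
  if (is_int(n)) round(r) else r
}

choose_sketch(3.9999999, 4)  # 0, the surprising result from the original post
choose_sketch(4.0000001, 4)  # 4, because the loop never runs and r gets rounded
```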
2020 Jan 14 · 4 replies · [R] choose(n, k) as n approaches k
OK, I see what you mean. But in those cases, we don't get the catastrophic failures from the
if (k < 0) return 0.;
if (k == 0) return 1.;
/* else: k >= 1 */
part, because at that point k is sure to be integer, possibly after rounding.
It is when n-k is approximately but not exactly zero and we should return 1, that we either return 0 (negative case) or n
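
One R-level workaround in the spirit of what Peter describes (not the patch that went into R): snap n to the nearest integer before calling choose() whenever it is numerically an integer, so that n - k is exactly zero rather than a stray +/- 1e-7. The tolerance is again an assumption:

``` r
## Workaround sketch (assumed 1e-7 relative tolerance), not the actual fix in R.
choose_rounded <- function(n, k, tol = 1e-7) {
  if (abs(n - round(n)) <= tol * max(1, abs(n))) n <- round(n)
  choose(n, k)
}

choose_rounded(3.9999999, 4)  # 1
choose_rounded(4.0000001, 4)  # 1
```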
2020 Jan 14 · 0 replies · [R] choose(n, k) as n approaches k
On 14/01/2020 10:07 a.m., peter dalgaard wrote:
> Yep, that looks wrong (probably want to continue discussion over on R-devel)
>
> I think the culprit is here (in src/nmath/choose.c)
>
> if (k < k_small_max) {
> int j;
> if(n-k < k && n >= 0 && R_IS_INT(n)) k = n-k; /* <- Symmetry */
> if (k < 0) return 0.;
2020 Jan 14 · 0 replies · [R] choose(n, k) as n approaches k
On 14/01/2020 10:50 a.m., peter dalgaard wrote:
>
>
>> On 14 Jan 2020, at 16:21 , Duncan Murdoch <murdoch.duncan at gmail.com> wrote:
>>
>> On 14/01/2020 10:07 a.m., peter dalgaard wrote:
>>> Yep, that looks wrong (probably want to continue discussion over on R-devel)
>>> I think the culprit is here (in src/nmath/choose.c)
>>> if (k
2020 Jan 15 · 1 reply · [R] choose(n, k) as n approaches k
That crossed my mind too, but presumably someone designed choose() to handle the near-integer cases specially. Otherwise, we already have beta() -- you just need to remember what the connection is ;-).
I would expect that it has to do with the binomial and negative binomial distributions, but I can't offhand picture a calculation that leads to integer k, n plus/minus a tiny numerical error
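
The connection Peter alludes to is the standard identity choose(n, k) = 1 / ((n + 1) * beta(n - k + 1, k + 1)); a quick check in R (my addition, not part of the quoted message):

``` r
n <- 5; k <- 2
choose(n, k)                             # 10
1 / ((n + 1) * beta(n - k + 1, k + 1))   # 10, the same value via beta()
```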
2020 Jan 14 · 0 replies · [R] choose(n, k) as n approaches k
At the risk of throwing oil on the fire: if we are talking about fractional values of choose(), doesn't it make sense to look to the gamma function for the correct analytic continuation? In particular, k < 0 may not imply the function should evaluate to zero until we get k <= -1.
Example:
``` r
choose(5, 4)
#> [1] 5
gchoose <- function(n, k) {
  gamma(n+1)/(gamma(n+1-k) * gamma(k+1))
}
```
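
To tie this back to the original report, the gamma form varies smoothly through n = k instead of jumping to 0 (a quick check I added, not part of the quoted message):

``` r
gchoose(3.9999999, 4)  # very close to 1
gchoose(4.0000001, 4)  # very close to 1
gchoose(4, 4)          # exactly 1
```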