similar to: round() and signif() do not check argument names when a single argument is given

Displaying 20 results from an estimated 20000 matches similar to: "round() and signif() do not check argument names when a single argument is given"

2020 May 22
0
round() and signif() do not check argument names when a single argument is given
Hi, I was told to send this to the -devel list instead of posting to bugzilla. When round or signif is called with a single named argument, R does not check the name and runs the function with that named argument directly as the first argument, using the default of 0 (or 6 in the case of signif) for the second argument. Not checking the argument name is at odds with how all other primitive functions
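A minimal sketch of the reported behaviour (assuming an R version from around the time of this post; releases where this has since been changed may reject the call):

    ## Reported: with a single *named* argument the name is not checked;
    ## the value is silently treated as the first argument 'x', and the
    ## default second argument (0 for round, 6 for signif) is used.
    round(digits = 2.567)    # reported to behave like round(2.567)     -> 3
    signif(digits = 2.567)   # reported to behave like signif(2.567, 6) -> 2.567
    ## The report contrasts this with other primitives, which reject a
    ## mismatched argument name with an error.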
2009 Dec 22
1
Sweave: font problems with Signif. codes lines
[Environment: Win XP, MiKTeX 2.7, R 2.9.2] In an Sweave document, I'm displaying the results of car:::Anova() tests that look like this in the generated .tex file: \begin{Soutput} Type III MANOVA Tests: Pillai test statistic Df test stat approx F num Df den Df Pr(>F) (Intercept) 1 0.86 90.38 4 60 <2e-16 *** --- Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
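The broken glyphs around the stars are typically R's directional (fancy) quotes failing to typeset in the LaTeX font. A commonly suggested workaround, offered here as an assumption rather than the thread's own fix, is to have R emit plain ASCII quotes before the chunk that prints the table:

    ## Assumption: the unreadable characters are Unicode directional quotes
    ## that the LaTeX font cannot render; plain quotes sidestep the problem.
    options(useFancyQuotes = FALSE)
    ## then re-run the Sweave chunk that prints the car:::Anova() output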
2006 Apr 11
2
concat results from db query
I've got a bunch of records coming back from a database like so: ID WORDS 1. banana apple orange 2. apple pear 3. banana orange I want to take the records in the WORDS field and concatenate them into one large array so I can play around with it. How do I do this? -- Posted via http://www.ruby-forum.com/.
2016 Sep 21
2
Undocumented 'use.names' argument to c()
'c' has an undocumented 'use.names' argument. I'm not sure if this is a documentation or implementation bug. > c(a = 1) a 1 > c(a = 1, use.names = F) [1] 1 Karl
2016 Sep 23
2
Undocumented 'use.names' argument to c()
In S-PLUS 3.4 help on 'c' (http://www.uni-muenster.de/ZIV.BennoSueselbeck/s-html/helpfiles/c.html), there is no 'use.names' argument. Because 'c' is a generic function, I don't think that changing formal arguments is good. In R devel r71344, 'use.names' is not an argument of functions 'c.Date', 'c.POSIXct' and 'c.difftime'. Could
2016 Sep 23
0
Undocumented 'use.names' argument to c()
In Splus c() and unlist() called the same C code, but with a different 'sys_index' code (the last argument to .Internal) and c() did not consider an argument named 'use.names' special. > c function(..., recursive = F) .Internal(c(..., recursive = recursive), "S_unlist", TRUE, 1) > unlist function(data, recursive = T, use.names = T) .Internal(unlist(data, recursive
2016 Sep 23
0
Undocumented 'use.names' argument to c()
I'd expect that a lot of the performance overhead could be eliminated by simply improving the underlying code. IMHO, we should ignore it in deciding the API that we want here. On Fri, Sep 23, 2016 at 10:54 AM, Henrik Bengtsson <henrik.bengtsson at gmail.com> wrote: > I'd vote for it to stay. It could of course suprise someone who'd > expect c(list(a=1), b=2, use.names =
2015 Mar 04
3
Domain Member Server (wheezy) - Unable to edit permissions of share without usermapping - shall I add to Wiki?
Hello again Rowland, list! Sorry for the delayed response, and top posting. To recap: I'd like to complete the member server wiki so that ACLs can be set from windows without taking undocumented steps. The three ways I've found to do this are: 1) map root to administrator. (LPH VanBelle's script uses this option.) 2) chmod 0775 then chgrp "<DOMAIN>\Domain Admins"
2016 Sep 23
2
Undocumented 'use.names' argument to c()
I'd vote for it to stay. It could of course surprise someone who'd expect c(list(a=1), b=2, use.names = FALSE) to generate list(a=1, b=2, use.names=FALSE). On the upside is the performance gain from using use.names=FALSE. Below benchmarks show that the combining of the names attributes themselves takes ~20-25 times longer than the combining of the integers themselves. Also, at no
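A rough way to see the overhead attributed to names handling, without relying on the undocumented argument itself (the vector size and repetition count below are illustrative assumptions, not the original poster's benchmark):

    ## Compare combining named vs. unnamed vectors of the same length.
    x_named   <- setNames(seq_len(1e6), sprintf("n%07d", seq_len(1e6)))
    x_unnamed <- unname(x_named)
    system.time(for (i in 1:20) c(x_named,   x_named))    # also combines the names
    system.time(for (i in 1:20) c(x_unnamed, x_unnamed))  # integers only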
2016 Sep 25
1
Undocumented 'use.names' argument to c()
From comments in http://stackoverflow.com/questions/24815572/why-does-function-c-accept-an-undocumented-argument/24815653 : The code of c() and unlist() was formerly shared but has long since been separated. On July 30, 1998, do_c was split into do_c and do_unlist. With the implementation of 'c.Date' in R devel r71350, an argument named 'use.names' is
2003 Feb 06
1
signif {base}: changes to scientific notation
PROBLEM `signif' changes to scientific notation at different levels depending on the number of significant digits in the input. This can generate tables where figures change ``irregularly'' from normal to scientific notation. PROPOSAL The change to scientific notation should be made only if the figure in scientific notation - with potentially as
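An illustration of the irregular switch being described; the output shown is what base R's default printing gives with options(scipen = 0), where the shorter of the two representations is chosen:

    signif(123456, 2)   # 120000   (fixed notation: "120000" is shorter than "1.2e+05")
    signif(123456, 1)   # 1e+05    (scientific:     "1e+05" is shorter than "100000")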
2005 Apr 21
1
printCoefmat(signif.legend =FALSE) (PR#7802)
printCoefmat(signif.legend =FALSE) does not work properly. The option "signif.legend = FALSE" is ignored as shown in the example below. cmat <- cbind(rnorm(3, 10), sqrt(rchisq(3, 12))) cmat <- cbind(cmat, cmat[,1]/cmat[,2]) cmat <- cbind(cmat, 2*pnorm(-cmat[,3])) colnames(cmat) <- c("Estimate", "Std.Err", "Z value", "Pr(>z)") #
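A self-contained sketch of the reported call; the snippet is truncated, so the final line is reconstructed from the report's description rather than copied from it:

    ## Reconstruction of PR#7802's example: signif.legend = FALSE was
    ## reported to be ignored, so the legend still printed.
    cmat <- cbind(rnorm(3, 10), sqrt(rchisq(3, 12)))
    cmat <- cbind(cmat, cmat[, 1] / cmat[, 2])
    cmat <- cbind(cmat, 2 * pnorm(-cmat[, 3]))
    colnames(cmat) <- c("Estimate", "Std.Err", "Z value", "Pr(>z)")
    printCoefmat(cmat, signif.stars = TRUE, signif.legend = FALSE)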
1997 May 27
1
R-alpha: signif( small , d) gives NA
signif(.) is a <primitive> function. Unfortunately, I couldn't even find WHERE in the source signif(.) is defined. Here are the symptoms: xmin <- .Machine $ double.xmin signif(xmin,3) #--> NA umach <- unlist(.Machine)[paste("double.x", c("min","max"), sep='')] for(dig in 1:10) {cat("dig=",dig,": ");
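The reported symptom, written out as a runnable snippet (this was R-alpha in 1997; a current R is expected to return the value itself rather than NA):

    xmin <- .Machine$double.xmin   # smallest positive normalised double
    signif(xmin, 3)                # reported to give NA in R-alpha (1997)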
2013 Oct 02
0
For numeric x, as.character(x) doesn't always match signif(x, 15)
I saw something like this. > x <- 5180000000000003 > print(x, digits=20) [1] 5180000000000003 > as.character(x) [1] "5.18e+15" I thought it was because, when x is numeric, as.character(x) represents x rounded to 15 significant digits. > print(signif(x, 15), digits=20) [1] 5180000000000000.0000 > as.numeric(as.character(x)) == signif(x, 15) [1] TRUE The documentation
2005 Jan 19
2
signif() generic
Dear list, I'm trying to write a class for Gaussian error propagation of measured values and their (estimated) errors, > setClass("sec", representation(val="numeric", err="numeric")) I've already successfully implemented basic arithmetics using mostly the "Arith" group generics. But I'm running into trouble when trying to get signif() to
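One way to get signif() (and round()) to dispatch on such a class is a method for the "Math2" group generic, which covers both functions. The error-propagation rule used below (leave the err slot untouched) is an assumption for illustration only, not taken from the original post:

    setClass("sec", representation(val = "numeric", err = "numeric"))

    ## "Math2" is the S4 group generic covering round(x, digits) and signif(x, digits).
    setMethod("Math2", "sec", function(x, digits) {
      new("sec", val = callGeneric(x@val, digits), err = x@err)
    })

    z <- new("sec", val = 1.23456, err = 0.0321)
    signif(z, 3)   # dispatches to the Math2 method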
2014 Jan 06
1
Signif. codes
My question is about the "Signif. codes", the output I get when I run matcoef = cbind(fit$par, se.coef, tval, 2*(1-pnorm(abs(tval)))) dimnames(matcoef) = list(names(tval), c("Estimate","Std.Error","t value","pr(>|t|)")) cat("\nCoefficient(s):\n") printCoefmat(matcoef, digits=4, signif.stars = TRUE) Coefficient(s): Estimate
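The stars and the legend come from symnum(); the cut points below are the ones printCoefmat() uses, shown here as a standalone call (the p-values are made up for illustration):

    p <- c(0.0004, 0.03, 0.2)
    symnum(p, corr = FALSE, na = FALSE,
           cutpoints = c(0, 0.001, 0.01, 0.05, 0.1, 1),
           symbols   = c("***", "**", "*", ".", " "))
    ## i.e. '***' p <= 0.001, '**' p <= 0.01, '*' p <= 0.05, '.' p <= 0.1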
2019 Mar 27
1
default for 'signif.stars'
Dear R-Devel, As I am sure many of you know, a special issue of The American Statistician just came out, and its theme is the [mis]use of P values and the many common ways in which they are abused. The lead editorial in that issue mentions the 2014 ASA guidelines on P values, and goes one step further by recommending that the words "statistically significant" and related simplistic
2019 Mar 28
0
default for 'signif.stars'
Hi Martin, I take your point - but I'd argue that significance stars are a clumsy solution to the very real problem that you outline, and their inclusion as a default sends a signal about their appropriateness that I would prefer R not to endorse. My preference (to the extent that it matters) would be to see the significance stars be an option but not a default one, and the addition of
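For what it's worth, the stars are already controllable per session via an option; whether that counts as "an option but not a default" is exactly the point being debated here:

    ## Turn significance stars off for the current session (or in a .Rprofile).
    options(show.signif.stars = FALSE)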
2019 Mar 28
1
default for 'signif.stars'
I read through the editorial. This is one of the most mega-ultra-super-biased articles I've ever read. e.g. The authors encourage Bayesian methods, and literally encourage subjective approaches. However, there's only one reference to robust methods and one reference to nonparametric methods, both of which are labelled as purely exploratory methods, which I regard as extremely
2010 May 20
2
Trailing zeros missing from signif function?
Hello. In my opinion the function signif(1.4, digits=3) should give 1.40 but actually gives 1.4. Is there a magic way to add the trailing digits back (and convert to character at the same time)? Regards, Paul.
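signif() returns a number, and a numeric value carries no trailing zeros; keeping them means formatting the result as character. One common way to do that (a suggestion, not taken from the original thread):

    ## format = "fg" gives fixed notation with 'digits' significant digits;
    ## flag = "#" keeps trailing zeros.
    formatC(signif(1.4, digits = 3), digits = 3, format = "fg", flag = "#")
    ## expected: "1.40"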