similar to: Trailing zero's missing from signif function ?

Displaying 20 results from an estimated 10000 matches similar to: "Trailing zero's missing from signif function ?"

2003 Mar 14
1
Formatting significant digits with trailing zeros
I need a function like signif(), but one that returns the rounded values as character strings, formatted with trailing zeros where appropriate. If anyone has one, I would sure appreciate a copy. Thanks -Don Details: signif() rounds a number to a specified number of significant digits, for example: > x <- c(2.503,2.477,0.1204) > signif(x[1],3) [1] 2.5 > signif(x[2],3) [1] 2.48
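One way to get this (a minimal sketch, not necessarily what the poster settled on) is formatC() with format = "fg" and flag = "#", which keeps trailing zeros on significant-digit rounding:

x <- c(2.503, 2.477, 0.1204)
formatC(signif(x, 3), digits = 3, format = "fg", flag = "#")
# [1] "2.50"  "2.48"  "0.120"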
2009 May 20
2
round function seems to produce maximum 2 decimals
I am trying to use round() to force R to display a specific number of decimals, but it seems to display <=2 decimals no matter what I specify in the digits argument. As an alternative I tried signif(), but it also produces unexpected results. See example code and results below. format() works, but then the result no longer is numeric. Am I missing something simple? I am using R 2.9.0 on Windows
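The usual explanation in threads like this is that round() changes the value, while the number of printed digits is governed by options(digits) and by formatting functions; a small illustration, assuming the default options(digits = 7):

x <- 1234.56789
round(x, 4)         # value is 1234.5679, but the default 7 significant digits print it as 1234.568
sprintf("%.4f", x)  # "1234.5679" -- a character string with exactly 4 decimals
options(digits = 10)
round(x, 4)         # 1234.5679, now that more digits are printed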
2009 Nov 10
1
2 significant digits
Hi, How to represent a rounded number ending with 0 with 2 significant digits? If I have, for example, 0.8031 and I use signif or round with digits = 2, I'll get 0.8. If I use format, I get character type (even if I pass a number as parameter) and if I convert with as.numeric, I'll lose one significant digit (0): > format(13.7, nsmall = 2) [1] "13.70" > as.numeric( format(13.7,
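Since a numeric value cannot remember a trailing zero, the zero has to be added at formatting time; a minimal sketch:

x <- signif(0.8031, 2)  # 0.8 as a numeric
format(x, nsmall = 2)   # "0.80" -- character, which is unavoidable if the zero must be shown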
2008 Jun 02
6
significant digits (PR#9682)
I came to report this same bug and found it already in the trash, but I slightly disagree with that assessment. If it's not a bug, then perhaps it's a feature request. Comments at the end. On Mon, May 14, 2007, Duncan Murdoch wrote: >>On 13/05/2007 8:46 PM, scott.wilkinson at csiro.au wrote: >> >> In the example below round() does not report to the specified number of
1999 Feb 26
1
Re: trailing zeroes
> Date: Fri, 26 Feb 1999 09:59:39 +0000 > From: Bendix Carstensen <bxc at svs.dk> > When you require 2 digits you expect to find 5.96 printed Correction, _you_ expect! Very few computer programs do that. You cannot `require' two digits by options(digits=2): ?options says digits: controls the number of digits to print when printing numeric values. It is a
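To make the distinction concrete (my own illustration, not from the original exchange):

options(digits = 2)
print(5.9637)            # 6 -- 'digits' counts significant digits, not decimal places
sprintf("%.2f", 5.9637)  # "5.96" -- a fixed number of decimals comes from formatting, not from options()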
2007 May 14
1
round(#, digits=x) unreliable for x=2 (PR#9682)
Full_Name: Scott Wilkinson Version: 2.3.1 OS: WinXP Pro Submission from: (NULL) (140.253.203.4) In the example below round() does not report to the specified number of digits when the last digit to be reported is zero: Compare behaviour for 0.897575 and 0.946251. Ditto for signif(). The number of sigfigs is ambiguous unless the reader knows this behaviour. Is this a bug or intended behaviour? Is
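The behaviour is easy to reproduce; the rounded value itself is as requested, it is only the printed representation that drops the trailing zero:

round(0.897575, 2)                   # 0.9
round(0.946251, 2)                   # 0.95
sprintf("%.2f", round(0.897575, 2))  # "0.90" -- restoring the zero is a display step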
2006 Feb 14
1
weird behavior of nsmall in format
From the help page of format, nsmall should control the number of digits. > format(0.123456789, nsmall = 10) [1] "0.1234567890" > format(0.123456789, nsmall = 1) [1] "0.1234568" > format(0.123456789, nsmall = 2) [1] "0.1234568" > format(0.123456789, nsmall = 8) [1] "0.12345679" It adds zeros fine but for
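The catch is that nsmall only sets a minimum number of decimals; total precision is still capped by the digits argument (7 significant digits by default). A short illustration:

format(0.123456789, nsmall = 2)  # "0.1234568" -- the 7-significant-digit default still applies
format(0.123456789, digits = 9)  # "0.123456789" -- ask for more significant digits explicitly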
2001 Sep 25
2
glm.nb, anova.negbin
Dear R-colleagues, I'm getting an error message (Error in round) when summarising a glm.nb model, and when using anova.negbin (in R 1.3.1 for windows): > m.nb <- glm.nb(tax ~ areal) > m.bn Call: glm.nb(formula = tax ~ areal, init.theta = 5.08829537115498, link = log) Coefficients: (Intercept) areal 3.03146 0.03182 Degrees of Freedom: 283 Total (i.e. Null); 282
2005 Apr 21
1
printCoefmat(signif.legend =FALSE) (PR#7802)
printCoefmat(signif.legend =FALSE) does not work properly. The option "signif.legend = FALSE" is ignored as shown in the example below. cmat <- cbind(rnorm(3, 10), sqrt(rchisq(3, 12))) cmat <- cbind(cmat, cmat[,1]/cmat[,2]) cmat <- cbind(cmat, 2*pnorm(-cmat[,3])) colnames(cmat) <- c("Estimate", "Std.Err", "Z value", "Pr(>z)") #
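For context, the truncated example presumably builds up to a call along these lines (a reconstruction, not the poster's exact code), where the complaint is that the legend line is printed anyway:

printCoefmat(cmat, signif.stars = TRUE, signif.legend = FALSE)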
2005 Jan 19
2
signif() generic
Dear list, I'm trying to write a class for Gaussian error propagation of measured values and their (estimated) errors, > setClass("sec", representation(val="numeric", err="numeric")) I've already successfully implemented basic arithmetic using mostly the "Arith" group generics. But I'm running into trouble when trying to get signif() to
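For what it's worth, signif() and round() dispatch through the "Math2" S4 group generic, so a sketch of one possible method for the class above (not the original poster's code) is:

setMethod("Math2", "sec", function(x, digits) {
  ## callGeneric() re-dispatches signif()/round() onto the numeric slots
  new("sec",
      val = callGeneric(x@val, digits),
      err = callGeneric(x@err, digits))
})
signif(new("sec", val = 2.503, err = 0.012), 2)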
2003 Feb 06
1
signif {base}: changes to scientific notation
PROBLEM `signif' changes to scientific notation at different levels depending on the number of significant digits in the input. This can generate tables where figures change ``irregularly'' from normal to scientific notation. PROPOSAL The change to scientific notation should be made only if the figure in scientific notation - with potentially as
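The switch happens when the value is printed, not inside signif() itself, and can be steered with the scipen option or per call via format(); a small illustration (the exact cut-over depends on the widths of the two representations):

x <- signif(0.000012345, 3)
print(x)                       # 1.23e-05 with the defaults
format(x, scientific = FALSE)  # "0.0000123"
options(scipen = 10)           # penalise scientific notation globally
print(x)                       # 0.0000123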
2019 Mar 27
1
default for 'signif.stars'
Dear R-Devel, As I am sure many of you know, a special issue of The American Statistician just came out, and its theme is the [mis]use of P values and the many common ways in which they are abused. The lead editorial in that issue mentions the 2014 ASA guidelines on P values, and goes one step further, by now recommending that the words "statistically significant" and related simplistic
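For readers who want this behaviour today without waiting on a change of default, the stars can already be disabled per session:

options(show.signif.stars = FALSE)  # summary() methods and printCoefmat() then omit the stars and the legend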
1997 May 27
1
R-alpha: signif( small , d) gives NA
signif(.) is a <primitive> function. Unfortunately, I couldn't even find WHERE in the source signif(.) is defined. Here are the symptoms: xmin <- .Machine $ double.xmin signif(xmin,3) #--> NA umach <- unlist(.Machine)[paste("double.x", c("min","max"), sep='')] for(dig in 1:10) {cat("dig=",dig,": ");
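The truncated loop presumably continued along these lines (a reconstruction, not the original message); on a current R, signif() at the small end of the double range no longer returns NA:

umach <- unlist(.Machine)[paste("double.x", c("min", "max"), sep = "")]
for (dig in 1:10) {
  cat("dig=", dig, ": ", format(signif(umach, dig)), "\n")
}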
2005 May 22
3
comparison operator, decimals, and signif()
Hi, I recently spent quite a bit of time troubleshooting a function that I had written, only to discover that the problem I was having was with the comparison operator. I assumed that the following would return TRUE: > testMean <- 82.8 + 0.1 > testMean [1] 82.9 > testMean == 82.9 [1] FALSE Apparently this has to do with decimal places. Look: > newTest <- 82.0 > newTest [1]
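This is the usual floating-point comparison issue (R FAQ 7.31): both sides are binary approximations that differ in the last few bits, so compare with a tolerance instead:

testMean <- 82.8 + 0.1
testMean == 82.9                   # FALSE
isTRUE(all.equal(testMean, 82.9))  # TRUE -- all.equal() allows a small numerical tolerance
abs(testMean - 82.9) < 1e-8        # another common idiom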
2010 Jul 14
1
Add Significance Codes to Data Frame
I was hoping that there might be some way to attach significance codes like the ones from summary.lm to a data frame. Does anyone know how to do something like that? Here is the function I'd like to add that functionality to: add1.coef <- function(model,scope,test="F",p.value=1,order.by.p=FALSE) { num <- length(model$coefficients) add <- add1(model,scope,test=test) sub <-
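A sketch of the general idea (not the poster's add1()-specific code): symnum() is what printCoefmat() uses to map p-values to stars, and its output can be bound onto any data frame:

p <- c(0.0004, 0.012, 0.034, 0.07, 0.4)
stars <- symnum(p, corr = FALSE, na = FALSE,
                cutpoints = c(0, 0.001, 0.01, 0.05, 0.1, 1),
                symbols = c("***", "**", "*", ".", " "))
data.frame(p.value = p, signif = as.character(stars))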
2009 Dec 21
3
Signif. codes
My question is about the "Signif. codes" and the p-value, specifically, the output when I run summary(nameofregression.lm) So you get this little key: Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1 And on a regression I ran, next to the intercept data, I get '***' Coefficients: > > Estimate Std. Error t value Pr(>|t|) > >
2009 Dec 22
1
Sweave: font problems with Signif. codes lines
[Environment: Win Xp, Miktex 2.7, R 2.9.2] In an Sweave document, I'm displaying the results of car:::Anova() tests, that look like this in the generated .tex file: \begin{Soutput} Type III MANOVA Tests: Pillai test statistic Df test stat approx F num Df den Df Pr(>F) (Intercept) 1 0.86 90.38 4 60 <2e-16 *** --- Signif. codes: 0 ?***? 0.001 ?**? 0.01 ?*? 0.05 ?.? 0.1 ? ? 1
2019 Mar 28
1
default for 'signif.stars'
I read through the editorial. This is one of the most mega-ultra-super-biased articles I've ever read. e.g. The authors encourage Bayesian methods, and literally encourage subjective approaches. However, there's only one reference to robust methods and one reference to nonparametric methods, both of which are labelled as purely exploratory methods, which I regard as extremely
2009 Jan 31
1
display p-values and significance levels
Hi there, I got a piece of code for the Iris data which displays correlation coefficients for each Iris species in the lower panel (color coded). I would now like to add e.g. a "*" to show the significance of each correlation next to the correlation coefficient. Furthermore I would like to make a t.test between the species "setosa" and "versicolor" for
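One way to sketch the first part (the panel name and the star rule below are my own, not from the quoted code, and this version does not yet colour by species):

panel.cor <- function(x, y, ...) {
  usr <- par("usr"); on.exit(par(usr = usr))   # restore the coordinate system afterwards
  par(usr = c(0, 1, 0, 1))
  ct   <- cor.test(x, y)
  star <- if (ct$p.value < 0.001) "***" else if (ct$p.value < 0.01) "**" else
          if (ct$p.value < 0.05) "*" else ""
  text(0.5, 0.5, paste0(format(ct$estimate, digits = 2), star))
}
pairs(iris[, 1:4], upper.panel = panel.cor)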
2005 Aug 21
2
bizarre signif stars in Sweave latex
OK. I give up. I'll ask a stupid question. How do I get the $!#@*$ signif stars line printed by summaries to not look extremely bizarre in the latex produced by Sweave? For example, see p. 7 of http://www.stat.umn.edu/geyer/aster/library/aster/doc/tutor.pdf I can see what the problem is. R emits non-ASCII characters (as it is supposed to do), Sweave puts them in the tex file, and
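A workaround that later became common advice (offered as an assumption about this particular setup, since it postdates the R version in the post) is to switch off the fancy directional quotes before printing -- I believe the legend's quotes come from sQuote(), so the output should then be pure ASCII -- or to declare a matching input encoding (e.g. \usepackage[utf8]{inputenc}) in the LaTeX preamble:

options(useFancyQuotes = FALSE)  # sQuote()/dQuote() fall back to plain ' quotes
summary(fit)                     # 'fit' is a placeholder for the fitted model, not an object from the tutorial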