
Displaying 20 results from an estimated 1000 matches similar to: "Bug: floating point bug in nclass.FD can cause hist() to crash"

2017 May 18
0
Bug: floating point bug in nclass.FD can cause hist() to crash
I just got the same error message with
> sessionInfo()
R version 3.4.0 (2017-04-21)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS Sierra 10.12.4
Matrix products: default
BLAS: /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
LAPACK: /Library/Frameworks/R.framework/Versions/3.4/Resources/lib/libRlapack.dylib
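A note on the bug itself: nclass.FD() picks the bin width from the interquartile range, so data whose IQR is tiny relative to its range can imply an astronomically large bin count that hist() then fails on. A minimal sketch with made-up data (not the poster's), showing why this happens on an affected version such as R 3.4.0:

    ## Freedman-Diaconis bin width is 2 * IQR(x) / length(x)^(1/3), so a tiny
    ## IQR combined with a wide range implies an enormous number of bins.
    x <- c(seq(1, 1 + 1e-9, length.out = 100), 1e9)  # IQR ~ 5e-10, range ~ 1e9
    nclass.FD(x)   # astronomically large on R 3.4.0 (later versions guard against this)
    ## hist(x, breaks = "FD") then fails while trying to build that many breaks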
2002 Apr 09
0
couldn't find function "nclass.fd"
Dear list, I get the following message while computing truehist in R 1.4.1 on Redhat Linux 7.1:
> truehist(lsk$Pox, nbins = "FD", prob = TRUE, xlab = "Pox [mmol/kg]")
Error in switch(casefold(nbins), scott = nclass.scott(data), "freedman-diaconis" = , :
  couldn't find function "nclass.fd"
Maybe the "nclass.fd" should be
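As the truncated last sentence suggests, the lower-cased nclass.fd in that version of truehist() apparently should have been nclass.FD. A workaround sketch, assuming MASS::truehist() and with x standing in for lsk$Pox: compute the Freedman-Diaconis bin count yourself and pass the number instead of the string.

    library(MASS)
    nb <- nclass.FD(x)   # Freedman-Diaconis bin count, computed directly (note the capitals)
    truehist(x, nbins = nb, prob = TRUE, xlab = "Pox [mmol/kg]")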
2008 Oct 20
3
The evaluation of optional function arguments
Dear R-helpers, I've got two functions; callTimes() calls times(), passing it an optional argument (bar) by name (bar=harry). times() then believes it has been passed a name rather than a value, but I want the value, not the name. Worse, if I evaluate the name, it is evaluated in the environment times() was defined in, not where it is called. How can I call times(), defining its optional
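Without the original functions it is hard to be precise, but the usual pattern for this problem is sketched below: if a function captures an argument with substitute() and the value is wanted, evaluate the captured expression in the caller's frame rather than in the function's defining environment. The names times, callTimes, bar and harry follow the post; the bodies are invented for illustration.

    times <- function(bar) {
      # substitute(bar) yields the unevaluated name 'harry'; evaluating it in
      # parent.frame() looks it up where times() was called, not where it was defined
      val <- eval(substitute(bar), envir = parent.frame())
      val * 2
    }
    callTimes <- function() {
      harry <- 21
      times(bar = harry)
    }
    callTimes()   # 42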
2008 May 19
2
How hist() decides breaks?
Hi Folks, I'd like to know how hist() decides how many cells to use when it ignores my "suggestion" to use say 'hist(...,breaks=50)'. More specifically, I have the results of 10000 simulations, each returning an 8-vector, therefore 8 variables each with 10000 values. Some of these 8 have somewhat skew distributions. Say one of these 8 variables is X. I ask for H <-
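For reference, breaks = 50 is only a suggestion: when given a single number, hist() hands it to pretty(), which picks "nice" cutpoints and may return a different count. Passing an explicit vector of breakpoints forces the exact number of cells. A sketch, with X standing in for one of the simulated variables:

    brks <- seq(min(X), max(X), length.out = 51)   # 51 breakpoints -> exactly 50 cells
    H <- hist(X, breaks = brks)
    length(H$counts)                               # 50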
2007 Dec 21
1
using apply to loop
Hi, I am running the following loop, but it takes hours to run as n is big. Is there any way "apply" can be used? Thanks.
### Start
nclass <- dim(data)[[2]] - 1
z <- matrix(0, ncol = nclass, nrow = nclass)
n <- dim(data)[[1]]
x <- c(1:nclass)
# loop starts
for(loop in 1:n) {
  r <- data[loop, 1:nclass]
  classified <- x[r == max(r)]
2007 Dec 21
1
using apply to loop [SEC=UNCLASSIFIED]
Hi Louis, You could try this:
# find the index of the maximum value in each row of _data_,
# disregarding the last column
classified <- apply(data[,-(nclass+1)], 1, which.max)
## or, if the maximum may be repeated:
classified <- apply(data[,-(nclass+1)], 1, FUN = function(x) which(x == max(x)))
# the variable _truth_ is just the last column of _data_
truth <- data[,nclass + 1]
table
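A self-contained version of the vectorised replacement suggested in this reply, using a small made-up matrix in place of the poster's data (last column assumed to hold the true class):

    set.seed(1)
    nclass <- 3
    scores <- matrix(runif(5 * nclass), ncol = nclass)          # one score per class
    data   <- cbind(scores, truth = sample(1:nclass, 5, TRUE))
    classified <- apply(data[, -(nclass + 1)], 1, which.max)    # column of the row maximum
    table(classified, truth = data[, nclass + 1])               # predictions vs truth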
2012 Oct 07
2
[PATCH] drm/nouveau: fix error handling in core/core object creation functions
Signed-off-by: Marcin Slusarz <marcin.slusarz at gmail.com> --- This patch relies on "drm/nouveau: remove >1 sclass support from nouveau_parent_create_". There are *many* *more* code paths without proper error handling - I counted at least 106 in 41 functions. If someone would like to do a bit of janitorial work I marked those code paths and uploaded "patch" here:
2012 Oct 07
1
[PATCH] drm/nouveau: remove >1 sclass support from nouveau_parent_create_
It's unused (only one codepath passes sclass at all and it's always one), broken (overwrites the same field, leaking previous one) and confusing. Signed-off-by: Marcin Slusarz <marcin.slusarz at gmail.com> ---
drivers/gpu/drm/nouveau/core/core/client.c | 2 +-
drivers/gpu/drm/nouveau/core/core/parent.c | 3 +--
1999 Nov 23
1
postscript colors
Is color specification like this available in R for setting postscript colors?
> hue <- c(0, seq(from = 0, by = 1/(nclass), length = nclass))
> sat <- c(0, rep(1.0, nclass))
> bri <- c(0, rep(1.0, nclass))
> zcolors <- cbind(hue, sat, bri)
> ps.options(setcolor=ps.setcolor.hsb, colors=zcolors)
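The ps.options(setcolor = ..., colors = ...) interface shown here dates from very old R. In current R, a rough equivalent is simply to build the palette with hsv() and pass the colours to the plotting call on a postscript device; the sketch below uses a placeholder nclass and plot.

    nclass <- 5
    zcolors <- c(hsv(0, 0, 0),                                   # black for the first class
                 hsv(h = seq(0, by = 1/nclass, length.out = nclass), s = 1, v = 1))
    postscript("map.ps")
    barplot(rep(1, nclass + 1), col = zcolors)                   # placeholder plot using the palette
    dev.off()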
2013 Mar 26
1
randomLCA with error for me
Please can someone explain to me how to use randomLCA in R for an analysis. I tried using it and had this error (copied below) which indicated my patterns must consist of 0 or 1. I assume I am doing something wrong. Please help. > library(lattice) > library(boot) Attaching package: ‘boot’ The following object(s) are masked from ‘package:lattice’: melanoma > library(utils) >
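The error means that randomLCA() expects the pattern matrix to contain only 0/1 responses. A hedged sketch of the usual preparation step, with mydata standing in for the poster's data and a simple "greater than zero" recode that may or may not be appropriate for their variables:

    library(randomLCA)
    patterns <- as.data.frame(lapply(mydata, function(x) as.integer(x > 0)))  # force 0/1 coding
    fit <- randomLCA(patterns, nclass = 2)
    fit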
2012 Oct 17
2
loop of quartile groups
Greetings R users, My goal is to generate quartile groups of each variable in my data set. I would like each experiment to have its designated group added as a subsequent column. I can accomplish this individually with the following code:
brks <- with(data_variables, cut2(var2, g=4))
# I don't want the actual numbers, I need a numbered group
data$test1=factor(brks,
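A sketch of the looped version, assuming Hmisc::cut2() as in the snippet and a data frame data_variables whose columns should each get a numbered quartile-group companion column (the "_q4" suffix is invented):

    library(Hmisc)
    grp <- lapply(data_variables, function(v) as.integer(cut2(v, g = 4)))  # group number 1-4
    names(grp) <- paste0(names(data_variables), "_q4")
    data_variables <- cbind(data_variables, as.data.frame(grp))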
2009 May 27
2
problem with centos upgrade
Dear All, I have a CentOS 5 server running my mail and DNS, and it has been working fine, but when I try to do a yum upgrade or yum update it gives me lots of Perl errors and terminates. The OS is CentOS 5 (final). Part of the errors reported:
---------------------------------
file /usr/lib/perl5/5.8.8/Math/BigFloat.pm from install of perl-5.8.8-18.el5_3.1 conflicts with file from package perl-Math-BigInt-1.86-1
2012 Oct 25
5
system is computationally singular: reciprocal condition number
Hi folks, I know this is a fairly common question, and I am really disappointed that I could not find a solution. I am trying to calculate Mahalanobis distances in a data frame where I have several hundred groups and several hundred variables. Whatever I do, however I subset it, I get the "system is computationally singular: reciprocal condition number" error. I know what it means
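One common way around the error, sketched under the assumption that the goal is plain Mahalanobis distances on a numeric matrix: replace solve(cov(X)) with a Moore-Penrose pseudoinverse (MASS::ginv), which tolerates the collinear or constant columns that make the covariance matrix singular. mydata is a placeholder for the poster's subset.

    library(MASS)
    X      <- as.matrix(mydata)   # numeric columns only (assumed)
    ctr    <- colMeans(X)
    covinv <- ginv(cov(X))        # pseudoinverse instead of solve(cov(X))
    d2     <- mahalanobis(X, center = ctr, cov = covinv, inverted = TRUE)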
2010 Oct 26
2
Forcing results from lm into dataframe
Hi, I need some help getting results from multiple linear models into a dataframe. Let me explain the problem. I have a dataframe with ejection fraction results measured over a number of quartiles and grouped by base_study. My dataframe (800 different base_studies) looks like
> afvtprelvefs
basestudy    quartile   ef     ef_std   entropy
CBP0908020   1          21.6   0.53     3.27
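A sketch of the usual split/lapply pattern, using the column names visible in the printed data frame (ef regressed on quartile within each basestudy, which may not be the exact model the poster wants):

    fits <- lapply(split(afvtprelvefs, afvtprelvefs$basestudy),
                   function(d) lm(ef ~ quartile, data = d))
    out  <- data.frame(basestudy = names(fits),
                       intercept = sapply(fits, function(f) coef(f)[[1]]),
                       slope     = sapply(fits, function(f) coef(f)[["quartile"]]))
    head(out)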
2008 Jun 13
2
Quartile regression question
I have data that looks like
lake,loglength,logweight
1,2.369215857,1.929418926
1,2.426511261,2.230448921
1,2.434568904,2.298853076
1,2.437750563,2.298853076
1,2.442479769,2.230448921
1,2.445604203,2.356025857
...
102,2.722633923,3.310268367
102,2.781755375,3.502153893
102,2.836324116,3.683407299
102,2.802773725,3.583312152
102,2.790285164,3.546419267
102,2.806179974,3.599118565
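If "quartile regression" here means quantile regression of logweight on loglength, the quantreg package is the standard tool; a sketch, with fish standing in for the data shown:

    library(quantreg)
    fit <- rq(logweight ~ loglength, tau = c(0.25, 0.5, 0.75), data = fish)  # fit the three quartiles
    coef(fit)   # one column of coefficients per tau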
2009 Sep 22
5
use of class variable in r as in Proc means of sas
Hi everyone, I need to calculate quartile values of a variable grouped by another variable, much as with the CLASS statement of PROC MEANS in SAS. The aggregate function seems limited to things like the median and mean, I think. Could you please help me compute the other quantile values (5, 10, 25, 75, 90) the same way as the median using aggregate. Thanks in advance.
data:
zip    price
60000  567000
60001  478654
60004  485647
60001
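With the zip/price layout shown, aggregate() can in fact return several quantiles at once, because FUN may return a vector; a sketch, with mydata standing in for the poster's data frame:

    aggregate(price ~ zip, data = mydata,
              FUN = quantile, probs = c(0.05, 0.10, 0.25, 0.75, 0.90))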
2010 Jan 22
2
Quartiles and Inter-Quartile Range
Why am I getting a wrong result for quartiles? Here is my code:
> cbiomass = c(910, 1058, 929, 1103, 1056, 1022, 1255, 1121, 1111, 1192, 1074, 1415)
> summary(cbiomass)
> IQR(cbiomass)
The result R gives me is, for the summary:
 Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
  910    1048    1088    1104    1139    1415
and for IQR: 91.25
The true Q1 is 1039
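The discrepancy comes from the quartile definition rather than a bug: summary() and IQR() use quantile() type 7 (linear interpolation), while the hand-computed 1039 is the median of the lower half of the data (Tukey's lower hinge), which fivenum() and quantile(..., type = 2) both report:

    cbiomass <- c(910, 1058, 929, 1103, 1056, 1022, 1255, 1121, 1111, 1192, 1074, 1415)
    quantile(cbiomass, 0.25)             # 1047.5 (type 7, which summary() prints as 1048)
    quantile(cbiomass, 0.25, type = 2)   # 1039   (averages the 3rd and 4th order statistics)
    fivenum(cbiomass)[2]                 # 1039, the lower hinge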
2004 Mar 26
1
color.ramp in maptools
Dear list members, I am trying to use the maptools library to display geographical data. At the moment I have some trouble understanding how the "auxvar" variable is supposed to be used in the plot.Map function. I am using R Version 1.8.1 (2003-11-21) on Linux. Looking at the plot.Map function itself, I see that it calls a color.ramp function (I am reporting only the relevant
2003 Oct 28
4
random number generation
Hi everyone, I am trying to generate a normally distributed random variable with the following descriptive statistics: min=1, max=99, variance=125, mean=38.32, 1st quartile=38, median=40, 3rd quartile=40, skewness=-0.274. I know that "rnorm" will allow me to simulate random numbers with mean 38.32 and SD=11.18 (sqrt(125)). But I need to have the above mentioned descriptive
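rnorm() can only match the mean and SD; hitting the skewness and the bounds as well needs a different family or a transformation. One possible sketch uses the skew-normal distribution from the sn package; the shape parameters below are illustrative guesses, not values fitted to the quoted statistics.

    library(sn)
    set.seed(1)
    x <- rsn(10000, xi = 38.32, omega = sqrt(125), alpha = -2)  # negatively skewed draw
    x <- pmin(pmax(x, 1), 99)                                   # clamp to the stated min/max
    c(mean = mean(x), sd = sd(x), skew = mean(scale(x)^3))      # check the achieved moments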
2007 Oct 09
3
Summary vs fivenum results for Q3
I've just started using R and am still a neophyte, but I found the following curious result. I'm using the current version of R (2.5.1 (2007-06-27)). Why are the results for the third quartile different in the output from the summary and fivenum commands? For the following data set 457 514 530 530 538 560 687 745 745 778 786 790 792
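The short answer (the quoted data set is cut off here, so a tiny illustrative vector is used instead): summary() computes quartiles with quantile() type 7, whereas fivenum() reports Tukey's hinges (medians of the lower and upper halves), and the two definitions can disagree for small samples.

    x <- c(1, 2, 3, 4, 5, 6)
    quantile(x, 0.75)   # 4.75 (type 7 interpolation, what summary() uses)
    fivenum(x)[4]       # 5    (upper hinge: median of the upper half)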