search for: 0.000001

Displaying 20 results from an estimated 63 matches for "0.000001".

2001 Aug 21
4
looking for a smarter way
I have two problems where I've come up with some code that will do the analysis that I want, but it looks pretty clumsy. In the first case, I calculate the variance on five different columns for each of 14 clusters and get them into one matrix. I get the job done, but I would have thought that it could be done in one or two lines, not six, and be generalized so that it didn't matter how
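A minimal sketch (not the poster's code) of how the per-cluster column variances could be collapsed into a single call; the data frame dat, its cluster column, and the v1..v5 column names are all hypothetical stand-ins:

set.seed(1)
dat <- data.frame(cluster = rep(1:14, each = 10),
                  matrix(rnorm(700), ncol = 5,
                         dimnames = list(NULL, paste0("v", 1:5))))
# one call: variance of every measurement column within every cluster
vars <- aggregate(. ~ cluster, data = dat, FUN = var)
vars   # 14 rows (one per cluster), one variance column per variable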
2013 Apr 24
1
Floating point precision causing undesirable behaviour when printing as.POSIXlt times with microseconds?
Dear list, When using as.POSIXlt with times measured down to microseconds, the default format.POSIXlt seems to cause some possibly undesirable behaviour: According to the code in format.POSIXlt the maximum accuracy of printing fractional seconds is 1 microsecond, but if I do: options( digits.secs = 6 ) as.POSIXlt( 1.000002 , tz="", origin="1970-01-01") as.POSIXlt( 1.999998 ,
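A small sketch that should reproduce the behaviour being described (exact printed output may vary by R version):

options(digits.secs = 6)
# fractional seconds can appear truncated rather than rounded when printed,
# because values such as 1.000002 are not exactly representable in binary floating point
as.POSIXlt(1.000002, tz = "", origin = "1970-01-01")
as.POSIXlt(1.999998, tz = "", origin = "1970-01-01")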
2008 Sep 02
2
nls.control()
All - I have data: TL age 388 4 418 4 438 4 428 5 539 10 432 4 444 7 421 4 438 4 419 4 463 6 423 4 ... [truncated] and I'm trying to fit a simple Von Bertalanffy growth curve with program: #Creates a Von Bertalanffy growth model VonB=nls(TL~Linf*(1-exp(-k*(age-t0))), data=box5.4, start=list(Linf=1000, k=0.1, t0=0.1), trace=TRUE) #Scatterplot of the data plot(TL~age, data=box5.4,
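A self-contained sketch of the same Von Bertalanffy fit, using only the rows shown in the snippet (the full box5.4 data set is truncated above, so convergence is not guaranteed; nls.control's warnOnly keeps a failed fit from aborting):

# the length-at-age rows visible in the post, standing in for box5.4
box5.4 <- data.frame(age = c(4, 4, 4, 5, 10, 4, 7, 4, 4, 4, 6, 4),
                     TL  = c(388, 418, 438, 428, 539, 432, 444, 421, 438, 419, 463, 423))
VonB <- nls(TL ~ Linf * (1 - exp(-k * (age - t0))),
            data = box5.4,
            start = list(Linf = 1000, k = 0.1, t0 = 0.1),
            control = nls.control(maxiter = 200, warnOnly = TRUE),
            trace = TRUE)
plot(TL ~ age, data = box5.4)
lines(sort(box5.4$age), predict(VonB)[order(box5.4$age)])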
2012 Sep 12
2
Deadlock in btrfs-cleaner, related to snapshot deletion
Hello, (this is a recap of yesterday's discussion on BTRFS IRC, also to save relevant pastes before pastebins expire) I have my /home on btrfs; a cronjob makes one snapshot every 30 minutes; these snapshots are kept for 24-48 hours, then deleted in batches. This is a 16K Leaf/Node BTRFS on top of mdadm RAID1. As system uptime approached 2 weeks, I started noticing that the free space
2007 May 09
3
Increasing precision of rgenoud solutions
Dear All I am using rgenoud to solve the following maximization problem: myfunc <- function(x) { x1 <- x[1] x2 <- x[2] if (x1^2+x2^2 > 1) return(-9999999) else x1+x2 } genoud(myfunc, nvars=2, Domains=rbind(c(0,1),c(0,1)),max=TRUE,boundary.enforcement=2,solution.tolerance=0.000001) How can one increase the precision of the solution $par [1] 0.7072442 0.7069694 ? I
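One possible approach, sketched below: tighten genoud's solution.tolerance and then polish its answer with a local optimiser on the same objective. The genoud call mirrors the post; the optim polishing step is an assumption, not part of rgenoud:

library(rgenoud)
myfunc <- function(x) {
  x1 <- x[1]; x2 <- x[2]
  if (x1^2 + x2^2 > 1) return(-9999999) else x1 + x2
}
g <- genoud(myfunc, nvars = 2, Domains = rbind(c(0, 1), c(0, 1)),
            max = TRUE, boundary.enforcement = 2,
            solution.tolerance = 1e-10)
# polish the genoud solution with a local optimiser
# (fnscale = -1 makes optim maximise)
polished <- optim(g$par, myfunc, method = "Nelder-Mead",
                  control = list(fnscale = -1, reltol = 1e-12))
polished$par   # should be close to c(1, 1) / sqrt(2), i.e. about 0.7071068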
2010 Nov 18
9
Interesting problem with write data.
Hi, Recently I created a btrfs filesystem to use, and I ran into a slowness problem. While trying to diagnose it, I found this: 1. dd if=/dev/zero of=test count=1024 bs=1MB This is fast, at about 25MB/s, with reasonable iowait. 2. dd if=/dev/zero of=test count=1 bs=1GB This is pretty slow, at about 1.5MB/s, and 90%+ iowait, constantly. May I know why it works like this? Thanks.
2007 Aug 21
2
Variable c and function c
I have found the error in my script, which was semi-automatically translated from another person's MATLAB code. The error is that c was assigned a value inside a function. That is, the function body contained the following instructions: c<-nw*czr d<-nw*cz rFren<-0.5*(abs((cz-c)/(cz+c))^2+abs((d-czr)/(d+czr))^2) firstguess<-c( 0,0,0,3,0.5, 0 , 0 , 0.000001) I have
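A minimal illustration of the masking problem being described, with made-up values for nw, czr and cz:

# hypothetical values standing in for the script's inputs
nw <- 2; czr <- 1.5; cz <- 0.8
c <- nw * czr          # creates a numeric variable named 'c' (here, 3)
d <- nw * cz
rFren <- 0.5 * (abs((cz - c)/(cz + c))^2 + abs((d - czr)/(d + czr))^2)
# Calling c(...) still finds the base function, because R looks for a
# *function* named c when c appears in call position:
firstguess <- c(0, 0, 0, 3, 0.5, 0, 0, 0.000001)
# ...but any later use of 'c' as a value picks up the number 3 instead,
# which is the kind of bug described above; renaming the variable avoids it
speed <- nw * czr
rm(c)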
2011 May 17
0
Help fit 5 nonlinear models. - Plant growth curves
Hi!! Can anyone help me? I have problems getting the following data to converge with the 5 nonlinear models I evaluated. First, I send my data (totalsinatipicos), which I am trying to fit with the nonlinear models. Next, I have the following script, where I refer to the data as totalsinatipicos. I made each nonlinear model self-starting. ###Library library(NRAIA) ###Data d<-totalsinatipicos
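Not the poster's five models or data, but a small self-contained example of fitting one self-starting growth curve with nls; SSlogis and the simulated day/size data are stand-ins:

# hypothetical plant growth data
set.seed(2)
day  <- rep(seq(10, 120, by = 10), each = 3)
size <- 100 / (1 + exp((60 - day) / 15)) + rnorm(length(day), sd = 3)
growth <- data.frame(day, size)
# a selfStart model supplies its own starting values, so no start list is needed
fit <- nls(size ~ SSlogis(day, Asym, xmid, scal), data = growth)
summary(fit)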
2013 Jun 10
1
btrfs-cleaner Blocked on xfstests 068
I'm running into a problem with the btrfs-cleaner thread becoming blocked on xfstests 068. The test locks up indefinitely without completing (normally it finishes in about 45 seconds on my test box). I've replicated the issue on 3.10.0_rc5 and the for-linus branch of 3.9.0. I ran a git bisect on the 3.9.0 for-linus branch, and tracked my issue to the following commit: commit
2009 Sep 23
1
Maximum Likelihood Est. regarding the degree of freedom of a multivariate skew-t copula
Hello, I have a big problem in calculating the maximum likelihood estimator for the degrees of freedom of a multivariate skew-t copula. First of all I would like to describe what this is all about, so that you can understand my problem: I have 2 time series with more than 3000 entries each. I would like to calculate a multivariate skew-t copula that fits these time series. Notice:
2010 Sep 15
1
optim with BFGS--what may lead to this, a strange thing happened
Dear R users, In a self-written function for calculating maximum likelihood probability (please check the function code at the bottom of this message), one value, wden, suddenly jumps to zero. Detailed info as follows: w[11]=2.14 lnw =2.37 2.90 3.76 ... regw =1.96 1.77 1.82 .... wden=0.182 0.178 0.179... w[11]=2.14 lnw=2.37 2.90 3.76 ... regw =1.96 1.77 1.82 .... wden=0.182
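One common reason a density contribution such as wden collapses to zero is floating-point underflow far in the tail; working on the log scale avoids it. A generic sketch, not the poster's likelihood function:

x <- 40                                  # an observation far in the tail
dnorm(x, mean = 0, sd = 1)               # underflows to 0
dnorm(x, mean = 0, sd = 1, log = TRUE)   # stays finite, about -800.9
# so a log-likelihood is better accumulated as
# sum(dnorm(x, mu, sigma, log = TRUE)) rather than log(prod(dnorm(...)))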
2009 Mar 30
0
Problem in S4 object displaying from within a Java application using JRI
I am using the JRI (Java R Interface) library in order to call R from within my Java application. But since "rmu1" and "rmu2" (see the following code) are objects of type S4, once I run the application the value null is returned for both of them. In this regard, I would appreciate it if anyone can tell me how I am to display and/or convert these objects to Java
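On the R side, one possible workaround (a sketch, not JRI-specific) is to flatten the S4 object into a named list of its slots before handing it back to Java; the Est class and rmu1 object here are hypothetical:

# hypothetical S4 class standing in for whatever produced rmu1/rmu2
setClass("Est", representation(mu = "numeric", sigma = "numeric"))
rmu1 <- new("Est", mu = 1.2, sigma = 0.4)
# convert the S4 object to a plain named list, which is easier to pass across the interface
s4_to_list <- function(obj) {
  slots <- slotNames(class(obj))
  setNames(lapply(slots, function(s) slot(obj, s)), slots)
}
s4_to_list(rmu1)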
2010 Oct 20
2
number format, writing 1e-5 instead of 0.00001
Hello, I've used read.table to read a file that contains numbers such as 0.00001. When I write them back with write.table, those numbers appear as 1e-5. How can I keep the old format? Thanks. -- View this message in context: http://r.789695.n4.nabble.com/number-format-writing-1e-5-instead-of-0-00001-tp3003831p3003831.html Sent from the R help mailing list archive at Nabble.com.
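A small sketch of two ways to keep fixed notation on the way out; the data frame and output file names are made up:

x <- data.frame(p = c(0.00001, 0.000001))
# option 1: penalise scientific notation globally while writing
old <- options(scipen = 100)
write.table(x, "out1.txt", row.names = FALSE, quote = FALSE)
options(old)
# option 2: convert to character with fixed notation before writing
write.table(format(x, scientific = FALSE), "out2.txt",
            row.names = FALSE, quote = FALSE)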
2012 Apr 24
0
mvpart versus SPSS
I have a question relating to mvpart, which I hope you can answer. We recently conducted a study using TBR. In our first study, we used "regular" TBR in SPSS to model 1 dependent variable. Note we have a relatively small data-set of 100 cases. In SPSS, we used a minimum change of improvement smaller than 0.000001 as a stopping rule. Also, we chose the 1SE "rule", set the
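In rpart, which mvpart extends, the analogue of that stopping rule is the complexity parameter cp, and the 1-SE rule is applied to the cross-validation table. A rough sketch with made-up data (mvpart itself has been archived, so plain rpart stands in here):

library(rpart)
set.seed(3)
d <- data.frame(y = rnorm(100), x1 = rnorm(100), x2 = rnorm(100))
fit <- rpart(y ~ x1 + x2, data = d,
             control = rpart.control(cp = 0.000001, xval = 10))
printcp(fit)   # cross-validated error table used for the 1-SE rule
# prune back to the smallest tree within 1 SE of the minimum xerror
cp.tab <- fit$cptable
thresh <- min(cp.tab[, "xerror"]) + cp.tab[which.min(cp.tab[, "xerror"]), "xstd"]
best.cp <- cp.tab[which(cp.tab[, "xerror"] <= thresh)[1], "CP"]
pruned <- prune(fit, cp = best.cp)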
2013 Mar 15
1
Spearman rank correlation
Hi, If I get a p-value less than 0.05, does that mean there is a significant relation between the 2 ranked lists? Sometimes I get a low correlation such as 0.3 or even 0.2 and the p-value is very low, such as 0.000001; does that mean it is significant also? And would that be interpreted as a significant low positive correlation or a significant moderate positive correlation? Also, can R calculate the
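For reference, the test is a one-liner in R; a quick sketch on simulated data showing how a weak correlation can still carry a tiny p-value when n is large (significance only says the correlation is unlikely to be exactly zero, not that it is strong):

set.seed(4)
n <- 2000
x <- rnorm(n)
y <- 0.25 * x + rnorm(n)            # weak but real association
cor.test(x, y, method = "spearman") # rho roughly 0.2-0.3, p-value essentially 0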
2012 Jul 31
2
Btrfs Intermittent ENOSPC Issues
I've been working on running down intermittent ENOSPC issues. I can only seem to replicate ENOSPC errors when running zlib compression. However, I have been seeing similar ENOSPC errors to a lesser extent when playing with the LZ4HC patches. I apologize for not following up on this sooner, but I had drifted away from using zlib, and didn't notice there was still an issue. My
2007 Sep 29
1
Shapiro-Wilk W value interpretation
Hello, I have tested a distribution for normality using the Shapiro-Wilk statistic. The result of this is the following: Shapiro-Wilk normality test data: mydata W = 0.9989, p-value = 0.8791 I know that the p-value > 0.05 (for my purposes) means that the data IS normally distributed, but what I am not sure about is the W value: what values tell me that the data is normally
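For completeness, the call that produces output of that form, with simulated data in place of mydata; W close to 1 means the sample looks normal, but it is the p-value, not W itself, that carries the test decision:

set.seed(5)
mydata <- rnorm(200)
shapiro.test(mydata)
# W near 1 and p-value > 0.05: no evidence against normality
# a noticeably smaller W, with a small p-value, indicates departure from normality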
2014 Feb 23
1
Random Count Generation with rnbinom
The documentation states: An alternative parametrization (often used in ecology) is by the mean 'mu', and 'size', the dispersion parameter. However, this fails: > rnbinom(10, mu = 100, size = 0) [1] NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN Warning message: In rnbinom(10, mu = 100, size = 0) : NAs produced For dispersion set to 0, it should work like drawing from a Poisson distribution.
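In R's mu/size parametrisation the variance is mu + mu^2/size, so the Poisson limit corresponds to size -> Inf rather than size = 0, which is why size = 0 yields NaN. A quick check, using a large size as an approximation:

set.seed(6)
x <- rnbinom(1e5, mu = 100, size = 1e6)  # very large size: close to Poisson(100)
y <- rpois(1e5, lambda = 100)
c(mean(x), var(x))   # both close to 100, like the Poisson draws below
c(mean(y), var(y))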
2009 Jul 20
2
HELP: BRUGS/WinBUGS/RBUGS Response is a combination of random variables
Hi, Does anyone know whether the BUGS language allows a combination of variables as the response, such as Y[i] <- a*X1[i]+b*X2[i] Y[i] ~ dnorm(c,d) It doesn't seem to work in my model. The problem is between the two ######. The error message is > modelCheck("BayesBioMarker.BUGS") model is syntactically correct > modelData(paste("BUGS_data.txt",sep="")) data
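A node cannot be given both a deterministic definition (<-) and a stochastic one (~) in BUGS; the usual workaround is to give the linear combination its own name and use it as the mean. A sketch of the model text as it could be written out from R; the names mu, a, b, tau, N and the priors are assumptions, not the poster's model:

model.txt <- "
model {
  for (i in 1:N) {
    mu[i] <- a * X1[i] + b * X2[i]
    Y[i] ~ dnorm(mu[i], tau)
  }
  a ~ dnorm(0, 0.000001)
  b ~ dnorm(0, 0.000001)
  tau ~ dgamma(0.001, 0.001)
}"
writeLines(model.txt, "BayesBioMarker.BUGS")
# then, as in the post: modelCheck("BayesBioMarker.BUGS")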
2010 Jun 04
2
Help with iteration using while loop
Hello everyone, I am trying to use a while loop to iterate a function until convergence. But I am having a problem when I try to use a fixed number of iterations. Say I want to use a maximum of 150 iterations. If the value doesn't converge within the maximum number of iterations, a warning of non-convergence should be shown. Currently I don't have a non-convergence problem, so I think my code works fine. But in future I may
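A generic sketch of the pattern being described; the update function f and the tolerance are placeholders, not the poster's code:

max.iter <- 150
tol <- 1e-8
f <- function(x) 0.5 * (x + 2 / x)   # placeholder update: converges to sqrt(2)
x.old <- 1
iter <- 0
converged <- FALSE
while (iter < max.iter) {
  x.new <- f(x.old)
  iter <- iter + 1
  if (abs(x.new - x.old) < tol) { converged <- TRUE; break }
  x.old <- x.new
}
if (!converged) warning("did not converge within ", max.iter, " iterations")
x.new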