similar to: How to call R routines in C++

Displaying 20 results from an estimated 5000 matches similar to: "How to call R routines in C++"

2011 Feb 19
3
Kolmogorov-Smirnov test
Is the Kolmogorov-Smirnov test valid on both continuous and discrete data? I don't think so, and the example below helped me understand why. A suggestion on testing the discrete data would be appreciated. Thanks, a <- rnorm(1000, 10, 1);a # normal distribution a b <- rnorm(1000, 12, 1.5);b # normal distribution b c <- rnorm(1000, 8, 1);c # normal distribution c d <- rnorm(1000,
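For the follow-up question about what to use on discrete data, a minimal base-R sketch (simulated data, not from the thread): a chi-squared goodness-of-fit test on the observed counts.
x <- rpois(1000, 3)                          # simulated discrete sample
obs <- table(factor(x, levels = 0:max(x)))   # observed counts for 0..max(x)
p.exp <- dpois(0:max(x), lambda = mean(x))   # note: estimating lambda from the data makes the nominal df slightly optimistic
chisq.test(obs, p = p.exp / sum(p.exp))      # expected proportions must sum to 1 over the observed range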
2011 Apr 27
3
Kolmogorov-Smirnov test
Hi, I have a problem with a Kolmogorov-Smirnov test fit. I am trying to fit a distribution to my data. Actually I created two tests: - # First Kolmogorov-Smirnov test fit - # Second Kolmogorov-Smirnov test fit see below. These two tests return different results and I don't know which is correct. Which result is correct? The first test returns the lower D = 0.0234 and the lower p-value = 0.00304. The lower 'D'
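As an illustration of what D measures when comparing candidate fits, a hedged sketch with simulated data (not the poster's): the smaller D is, the smaller the maximal gap between the empirical and the candidate CDF.
x <- rgamma(200, shape = 2, rate = 1)                       # simulated data
fit.gam <- ks.test(x, "pgamma", shape = 2, rate = 1)        # candidate distribution 1
fit.ln  <- ks.test(x, "plnorm", meanlog = 0, sdlog = 1)     # candidate distribution 2
c(gamma = unname(fit.gam$statistic), lognormal = unname(fit.ln$statistic))   # smaller D = closer fit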
2006 Feb 03
2
Problems with ks.test
Hi everybody, while performing ks.test for a standard exponential distribution on samples of size 2500, generated anew each time, I had this strange behaviour: >data<-rexp(2500,0.4) >ks.test(data,"pexp",0.4) One-sample Kolmogorov-Smirnov test data: data D = 0.0147, p-value = 0.6549 alternative hypothesis: two.sided >data<-rexp(2500,0.4)
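A minimal sketch (my own simulation, not the poster's output) of why p-values that vary across freshly generated samples are expected: under a true null hypothesis the KS p-value is approximately uniform on [0, 1], so occasional small values are normal.
pvals <- replicate(1000, ks.test(rexp(2500, 0.4), "pexp", 0.4)$p.value)
hist(pvals)           # roughly flat under the null
mean(pvals < 0.05)    # close to the nominal 0.05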
2011 Jul 29
1
How to interpret Kolmogorov-Smirnov stats
Hi, Interpretation problem! What I did was use: >fit1 <- fitdist(vectNorm,"beta") Warning messages: 1: In dbeta(x, shape1, shape2, log) : NaNs produced 2: In dbeta(x, shape1, shape2, log) : NaNs produced 3: In dbeta(x, shape1, shape2, log) : NaNs produced 4: In dbeta(x, shape1, shape2, log) : NaNs produced 5: In dbeta(x, shape1, shape2, log) : NaNs produced 6: In
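dbeta() is only defined on (0, 1), so NaN warnings from fitdist(..., "beta") usually mean the data fall outside the unit interval. A hedged sketch of one common fix (assumes the fitdistrplus package; rnorm data stand in for the poster's vectNorm):
library(fitdistrplus)                                     # not part of base R
x <- rnorm(200, 10, 2)                                    # stand-in for vectNorm
eps <- 1e-6
x01 <- (x - min(x) + eps) / (diff(range(x)) + 2 * eps)    # rescale into the open interval (0, 1)
fit1 <- fitdist(x01, "beta")                              # no NaN warnings once the support matches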
2009 Apr 29
2
Kolmogorov-Smirnov test
I have a distribution function and an empirical distribution function. How do I perform the Kolmogorov-Smirnov test in R? Let's call the empirical distribution function Fn on [0,1] and the distribution function F on [0,1]. ks.test( ) thanks for the help
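A minimal sketch of the call being asked for: ks.test() accepts either an actual cumulative distribution function or the name of one, so the theoretical F can be passed directly (punif is used here as a stand-in for F on [0, 1]).
x <- runif(200)             # sample whose ECDF plays the role of Fn
ks.test(x, punif)           # pass the CDF itself ...
ks.test(x, "punif", 0, 1)   # ... or its name plus parameters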
2011 Jan 26
1
How to calculate the p-value for the Kolmogorov-Smirnov test statistic?
Although I saw this issue being discussed many times before, I still did not find the answer to: why can R not calculate p-values for data with ties (i.e., a sample with two or more identical values)? Can anyone elaborate on how R calculates the p-values for the Kolmogorov-Smirnov test statistic? I can understand the theoretical problem that continuous distributions do
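The exact p-value in ks.test() is derived under a continuous distribution, where ties occur with probability zero; a quick way to see whether that assumption is violated (hedged sketch, simulated data):
x <- round(rnorm(100), 1)   # rounding introduces ties
any(duplicated(x))          # TRUE: the continuity assumption is broken
ks.test(x, "pnorm")         # still runs, but warns because ties should not be present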
2001 Jul 01
1
(PR#1007) ks.test doesn't compute correct empirical
On Sun, 1 Jul 2001 mcdowella@mcdowella.demon.co.uk wrote: > Full_Name: Andrew Grant McDowell > Version: R 1.1.1 (but source in 1.3.0 looks fishy as well) > OS: Windows 2K Professional (Consumer) > Submission from: (NULL) (194.222.243.209) Please upgrade: we've found a number of Win2k bugs and worked around them since then, let alone the bug fixes and improvements in R .... >
2007 Oct 03
3
P-value
Hi, why don't you try ks.test(VeriSeti1, VeriSeti2)$p.value All the best Jenny >How can I print only the p-value of the Kolmogorov-Smirnov test? > > >> ks.test(VeriSeti1, VeriSeti2) > > Two-sample Kolmogorov-Smirnov test > >data: VeriSeti1 and VeriSeti2 >D = 0.5, p-value = 0.4413 >alternative hypothesis: two-sided > > >This expression
2009 Oct 12
1
Kolmogorov-Smirnov test
Hi r-users, I would like to use the Kolmogorov-Smirnov test, but in my observed data (xobs) there are ties. I got the warning message. My question is: can I do something about it? ks.test(xobs, xsyn) Two-sample Kolmogorov-Smirnov test data: xobs and xsyn D = 0.0502, p-value = 0.924 alternative hypothesis: two-sided Warning message: In ks.test(xobs, xsyn) : cannot compute correct
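One pragmatic and approximate workaround that is sometimes suggested: break the ties with a tiny jitter and check that the conclusion is stable across reruns. A sketch with simulated stand-ins for xobs and xsyn, not an exact fix:
xobs <- round(rnorm(200), 1)   # stand-ins with ties
xsyn <- round(rnorm(200), 1)
ks.test(jitter(xobs, amount = 1e-6), jitter(xsyn, amount = 1e-6))   # rerun a few times to check stability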
2005 Mar 18
1
Problem with ks.test p-value
Hello, While doing tests of normality under R and SAS, in order to prove the efficiency of R to my company, I noticed that the Anderson-Darling, Cramer-von Mises and Shapiro-Wilk test results are about the same under the two environments, but the Kolmogorov-Smirnov p-value really is different. Here is what I do: > ks.test(w,pnorm,mean(w),sd(w)) One-sample Kolmogorov-Smirnov test data: w D
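Part of the discrepancy is likely that plugging mean(w) and sd(w) into ks.test() ignores the fact that they were estimated from w, which makes the reported p-value too large; SAS applies a Lilliefors-type correction for this. A parametric-bootstrap sketch of a calibrated p-value (simulated w, my own illustration):
w <- rnorm(100, 5, 2)                                      # stand-in for the poster's w
d.obs <- ks.test(w, "pnorm", mean(w), sd(w))$statistic
d.boot <- replicate(2000, {
  z <- rnorm(length(w), mean(w), sd(w))
  ks.test(z, "pnorm", mean(z), sd(z))$statistic
})
mean(d.boot >= d.obs)                                      # bootstrap (Lilliefors-style) p-value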
2011 Oct 06
2
KS test and theoretical distribution
> x <- runif(100) > y <- runif(100) > ks.test(x,y) Two-sample Kolmogorov-Smirnov test data: x and y D = 0.11, p-value = 0.5806 alternative hypothesis: two-sided ok I expected that, but: > ks.test(runif(100), "runif") One-sample Kolmogorov-Smirnov test data: runif(100) D = 0.9106, p-value < 2.2e-16 alternative hypothesis: two-sided How
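The second argument of the one-sample test must be a cumulative distribution function, so "runif" (the random-number generator) should be "punif"; a minimal corrected call:
ks.test(runif(100), "punif")   # compares against the uniform CDF; the p-value now behaves as expected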
2002 Jul 01
1
modified Kolmogorov-Smirnov
I'm trying to use the modified Kolmogorov-Smirnov test with a Normal whose parameters I don't know. Somebody told me about the 'lilifor' (Lilliefors) function in R, but I just can't find it. Does anybody know how I can test with the modified Kolmogorov-Smirnov test? Why use just any relational database when you can use PostgreSQL?
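A sketch of what is probably being looked for, assuming the nortest add-on package is available: its lillie.test() implements the Lilliefors (Kolmogorov-Smirnov) normality test with estimated mean and sd.
library(nortest)        # not part of base R
x <- rnorm(50, 10, 2)   # simulated example data
lillie.test(x)          # Lilliefors-corrected KS test for normality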
2004 Sep 09
1
kolmogorov-smirnov for discrete ordinal scale data
Hi, I was wondering whether there is an implementation of the Kolmogorov-Smirnov goodness of fit test for discrete, ordinal scale data in R - I've only managed to find the test for continuous data. Thanks! Gila
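One option, offered as an assumption about an add-on package: the dgof package provides a ks.test() that accepts a discrete null distribution given as a step function or ecdf. A minimal sketch with simulated ordinal data:
library(dgof)                            # not part of base R; masks stats::ks.test
x <- sample(1:5, 50, replace = TRUE)     # simulated ordinal responses on a 1-5 scale
ks.test(x, ecdf(1:5))                    # test against a discrete uniform on 1..5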
2007 Nov 06
2
Kolmogorov-Smirnov test
I am trying to determine whether two samples are identical or not. I'm aware that one can use the Kolmogorov-Smirnov test to compare empirical distributions, but since my samples have ties I'm not sure if I'm getting the right p-values for the comparison. Can the Kolmogorov-Smirnov test be adjusted for the case when ties exist, and are there any functions that already
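When ties make the standard p-value questionable, one distribution-free alternative is to recompute the p-value by permutation: the KS statistic is still well defined, and shuffling observations between the two groups gives its null distribution for these particular data. A sketch (my own, not from the thread):
ks.perm <- function(x, y, B = 2000) {
  d.obs <- suppressWarnings(ks.test(x, y)$statistic)
  pooled <- c(x, y)
  n <- length(x)
  d.star <- replicate(B, {
    idx <- sample(length(pooled), n)
    suppressWarnings(ks.test(pooled[idx], pooled[-idx])$statistic)
  })
  mean(d.star >= d.obs)   # permutation p-value
}
ks.perm(round(rnorm(50), 1), round(rnorm(50, 0.3), 1))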
2001 Jul 02
2
Shapiro-Wilk test
Hi, does the Shapiro-Wilk test in R-1.3.0 work correctly? Maybe it does, but can anybody tell me why the following sample doesn't give "W = 1" and "p-value = 1": R> x<-1:9/10;x [1] 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 R> shapiro.test(qnorm(x)) Shapiro-Wilk normality test data: qnorm(x) W = 0.9925, p-value = 0.9986 I can't imagine a sample being
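The likely reason W falls short of 1, offered as a sketch: shapiro.test() compares the ordered sample with the expected normal order statistics for that sample size, and qnorm(1:9/10) gives evenly spaced quantiles, which only approximate those order statistics; the approximation improves as n grows.
shapiro.test(qnorm(1:9 / 10))        # W a little below 1, as reported
shapiro.test(qnorm(ppoints(500)))    # with more points, W moves much closer to 1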
2007 Feb 23
4
using "integrate" in a function definition
Dear list members, I'm quite new to R, and though I tried to find the answer to my probably very basic question through the available resources (website, mailing list archives, docs, google), I've not found it. If I try to use the "integrate" function from within my own functions, my functions seem to misbehave in some contexts. The following example is a bit silly, but
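The usual cause of this kind of misbehaviour, stated as a guess about the truncated example: integrate() returns a list, not a number, and it is not vectorised over its limits, so a wrapper needs $value and Vectorize(). A minimal sketch:
cdf <- Vectorize(function(q) integrate(dnorm, -Inf, q)$value)   # cumulative normal via numerical integration
cdf(c(-1, 0, 1))        # close to pnorm(c(-1, 0, 1))
integrate(cdf, 0, 1)    # the wrapped function can itself be passed to integrate()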
2006 May 26
2
multiple comparisons of time series data
I am interested in a statistical comparison of multiple (5) time series generated from modeling software (Hydrologic Simulation Program Fortran). The model output simulates daily bacteria concentration in a stream. The multiple time series are a result of varying our representation of the stream within the model. Our main question is: Do the different methods used to represent a
2010 Jun 22
1
k-sample Kolmogorov-Smirnov test?
Hello, I am curious if anyone has had any success finding an R version of a k-sample Kolmogorov-Smirnov test. Most of the references that I have been able to find on this are fairly old, and I am wondering if this type of analysis has fallen out of favour. If so, how do people tend to compare distributions when they have more than two? Is it reasonable to pursue an adjusted p-value method? That is,
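The adjusted-p-value route mentioned in the question can be sketched in base R with pairwise two-sample tests plus p.adjust() (simulated samples; Holm chosen here as one reasonable correction):
samples <- list(a = rnorm(100), b = rnorm(100, 0.2), c = rnorm(100, 0.5))
pairs <- combn(names(samples), 2)                         # all pairs of sample names
p.raw <- apply(pairs, 2, function(ij) ks.test(samples[[ij[1]]], samples[[ij[2]]])$p.value)
data.frame(pair = apply(pairs, 2, paste, collapse = " vs "),
           p.holm = p.adjust(p.raw, method = "holm"))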
2010 Aug 05
1
Kolmogorov-Smirnov test, which one to use?
Hi, I have two sets of data, observed data and generated data. The generated data are obtained from the model whose parameters were estimated from the observed data. So I'm not sure which to use: either the one-sample test ks.test(x+2, "pgamma", 3, 2) # two-sided, exact or the two-sample test ks.test(x, x2, alternative="l") If I use the one-sample test I need to
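A short sketch of the two candidate calls with simulated stand-ins; the caveat either way is that, because the model parameters were estimated from the observed data, the reported p-values will be somewhat optimistic (a parametric bootstrap, as sketched earlier in this list, is one way to calibrate them).
xobs <- rgamma(200, shape = 3, rate = 2)            # stand-in for the observed data
xgen <- rgamma(200, shape = 3, rate = 2)            # stand-in for the model-generated data
ks.test(xobs, "pgamma", shape = 3, rate = 2)        # one-sample: data vs the fitted CDF
ks.test(xobs, xgen)                                 # two-sample: data vs simulated data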
2003 May 15
2
Kolmogorov-Smirnov
Hello, I have a rather simple question: can I find somewhere in R the significance values for a Kolmogorov distribution? (I know the degrees of freedom and I already have the maximum deviation.) ks.test is not really doing what I want. All I need are the values, like one can get the values for a chi-squared distribution with 'qchisq(0.05, 375)'. tnx, Kurt.
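The tail probabilities being asked for can be sketched directly from the asymptotic Kolmogorov distribution, P(sqrt(n) * D_n <= x) = 1 - 2 * sum_{k>=1} (-1)^(k-1) exp(-2 k^2 x^2), where n is the sample size (what the poster calls degrees of freedom), and inverted with uniroot() to get a critical value. My own sketch, base R only, valid for large n:
pkolm <- function(x, k.max = 100) {                     # asymptotic CDF of sqrt(n) * D_n
  k <- 1:k.max
  1 - 2 * sum((-1)^(k - 1) * exp(-2 * k^2 * x^2))
}
qkolm <- function(alpha, n) {                           # upper critical value for D_n at level alpha
  uniroot(function(x) pkolm(x) - (1 - alpha), c(0.2, 3))$root / sqrt(n)
}
qkolm(0.05, 375)                                        # e.g. the 5% critical value of D_n for n = 375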