Displaying 20 results from an estimated 600 matches similar to: "difference between sm.density() and kde2d()"
2008 Oct 19
0
Kullback Leibler Divergence
Hi there,
I'm trying to find the KL divergence measure between a prior and its
posterior distribution, and I'm using the KLdiv method in the flexmix
package. Please see the example below:
require(flexmix)
x <- seq(-4, 4, length = 100)
d1 <- dnorm(x, 0, 1)
d2 <- dunif(x, -3, 3)
y <- cbind(d1, d2)
kl <- KLdiv(y)
But now say,
x1 <- seq(-5, 5, length = 100)
d3 <- dunif(x1, -3, 3)
y1 <- cbind(d1, d3)
kl1 <- KLdiv(y1)
Notice
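One point worth noting (a sketch, not part of the original post): the matrix method of KLdiv compares the columns of a single matrix of density values, so both densities presumably need to be evaluated on the same grid; the second call mixes d1 (evaluated on x) with d3 (evaluated on x1), which changes what is being compared. A rough numerical cross-check of a directed KL divergence on one common grid, with the uniform taken first so the density ratio stays finite:
x  <- seq(-4, 4, length.out = 1000)   # one common grid for both densities
dx <- diff(x[1:2])
p  <- dunif(x, -3, 3)                 # "prior"
q  <- dnorm(x, 0, 1)                  # "posterior"; positive everywhere, so p/q is finite
keep <- p > 0
sum(p[keep] * log(p[keep] / q[keep])) * dx   # directed KL(p || q) by numerical integration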
2008 Sep 29
2
density estimate
Hi,
I have a vector of random variables and I'm estimating the density using the
"bkde" function in the KernSmooth package. The output contains two vectors
(x and y), and the R documentation calls y the density estimates, but my
y-values are not exact density estimates (since these are numbers larger
than 1)! What is y here? Is it possible to get the true estimated density at
each
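Two things may help here (a sketch with toy data in place of the original vector): a kernel density estimate can legitimately exceed 1 when the data are tightly concentrated, as long as it integrates to 1, and the estimate can be read off at arbitrary points by interpolating the returned grid:
library(KernSmooth)
set.seed(1)
v   <- rnorm(500, sd = 0.01)                  # tightly concentrated toy data
fit <- bkde(v, bandwidth = dpik(v))           # fit$y are density values, not probabilities
max(fit$y)                                    # well above 1 for data on this scale
sum(fit$y) * diff(fit$x[1:2])                 # yet the estimate integrates to about 1
approx(fit$x, fit$y, xout = c(-0.01, 0, 0.01))$y   # density estimate at chosen points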
2010 Jul 09
1
KLdiv produces NA. Why?
I am trying to calculate a Kullback-Leibler divergence from two
vectors of integers but get NA as a result when trying to calculate
the measure. Why?
x <- cbind(stuff$X, morestuff$X)
x[1:5,]
[,1] [,2]
[1,] 293 938
[2,] 293 942
[3,] 297 949
[4,] 290 956
[5,] 294 959
KLdiv(x)
[,1] [,2]
[1,] 0 NA
[2,] NA 0
Best,
Ralf
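One likely explanation (an assumption about the intent, since stuff$X is not shown): the matrix method of KLdiv works on density values, whereas the columns here are raw integer measurements. A sketch that first estimates a density for each sample on a shared grid, with made-up data standing in for stuff$X and morestuff$X:
library(flexmix)
set.seed(1)
a <- rpois(200, 300)                              # stand-in for stuff$X
b <- rpois(200, 950)                              # stand-in for morestuff$X
lo <- min(a, b); hi <- max(a, b)
da <- density(a, from = lo, to = hi, n = 512)$y   # densities on one common grid
db <- density(b, from = lo, to = hi, n = 512)$y
KLdiv(cbind(da, db))                              # 2 x 2 matrix of pairwise divergences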
2010 Jul 15
1
Repeated analysis over groups / Splitting by group variable
I am performing some analysis over a large data frame and would like
to conduct repeated analyses over grouped subsets. How can I do
that?
Here is some example code for clarification:
require("flexmix") # for Kullback-Leibler divergence
n <- 23
groups <- c(1,2,3)
mydata <- data.frame(
  sequence = c(1:n),
  data1 = c(rnorm(n)),
  data2 = c(rnorm(n)),
  group = rep(sample(groups, n,
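For the split-by-group part of the question, a minimal sketch of the usual split()/lapply() pattern, with colMeans standing in for whatever per-group analysis is actually needed:
set.seed(1)
n <- 23
mydata <- data.frame(sequence = 1:n,
                     data1    = rnorm(n),
                     data2    = rnorm(n),
                     group    = sample(c(1, 2, 3), n, replace = TRUE))
analysis_fun <- function(d) colMeans(d[, c("data1", "data2")])  # placeholder analysis
results <- lapply(split(mydata, mydata$group), analysis_fun)    # one result per group
results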
2005 May 06
1
distance between distributions
Hi,
This is more of a general stat question. I am looking for an easily
computable measure of the distance between two empirical distributions.
Say I have two samples x and y drawn from X and Y. I want to compute a
statistic rho(x, y) which is zero if X = Y and grows as X and Y become
less similar.
The Kullback-Leibler distance is the most "official" choice; however, it
needs estimation of
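One easily computable option that fits this description (a sketch, not a recommendation from the thread) is the two-sample Kolmogorov-Smirnov statistic, which is 0 when the empirical distributions coincide and grows toward 1 as they separate:
set.seed(1)
x <- rnorm(200)                 # sample from X
y <- rnorm(200, mean = 1)       # sample from Y
ks.test(x, y)$statistic         # sup-distance between the two empirical CDFs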
2008 Jan 10
1
Entropy/KL-Distance question
Dear R-Users,
I have the CDF of a discrete probability distribution. I now observe a
change in this CDF at one point. I would like to find a new CDF such that
it has the shortest Kullback-Leibler Distance to the original CDF and
respects my new observation. Is there an existing package in R which will
let me do this ?
Google searches based on entropy revealed nothing.
Kind regards,
Tolga
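If the new observation amounts to fixing the CDF at a single support point, the pmf with the smallest KL divergence to the original simply rescales the original probabilities on each side of that point (a standard minimum-discrimination-information result; the sketch below assumes that reading of the problem):
# p: original pmf over the support; k: index at which the new CDF must equal c
min_kl_cdf <- function(p, k, c) {
  q <- p
  q[1:k]    <- p[1:k]    * c       / sum(p[1:k])       # rescale mass up to point k
  q[-(1:k)] <- p[-(1:k)] * (1 - c) / sum(p[-(1:k)])    # rescale the remaining mass
  cumsum(q)                                            # the adjusted CDF
}
p <- dbinom(0:10, 10, 0.4)        # toy original pmf
min_kl_cdf(p, k = 5, c = 0.75)    # force the CDF to 0.75 at the fifth support point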
2008 Sep 09
2
densities with overlapping area of 0.35
Hi,
I would like to generate two normal densities such that the overlapping area
between them is 0.35. Is there any code/package available in R to do that?
Regards,
Lavan
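For two unit-variance normals N(0, 1) and N(d, 1), the overlapping area works out to 2 * pnorm(-d / 2), so the required mean separation can be solved for directly (a sketch under that equal-variance assumption):
d <- -2 * qnorm(0.35 / 2)              # separation giving an overlap of 0.35 (about 1.87)
x <- seq(-4, d + 4, length.out = 500)
plot(x, dnorm(x, 0, 1), type = "l", ylab = "Density")
lines(x, dnorm(x, d, 1), lty = 2)      # second density, shifted by d
2 * pnorm(-d / 2)                      # check: overlapping area equals 0.35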
2007 Aug 10
1
kde2d error message
Hello!
I am trying to do a smooth with the kde2d function, and I'm getting an error
message about NAs. Does anyone have any suggestions? Does this function
not do well with NAs in general?
fit <- kde2d(X, Y, n=100,lims=c(range(X),range(Y)))
Error in if (from == to || length.out < 2) by <- 1 :
missing value where TRUE/FALSE needed
Thanks in advance!!
Jen
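kde2d stops on missing values, most likely because range(X) returns NA and the internal seq() call then fails, so one common workaround is to drop incomplete pairs first (a sketch with toy data in place of the original X and Y):
library(MASS)
set.seed(1)
X <- c(rnorm(100), NA)                 # toy data containing a missing value
Y <- c(rnorm(100), 2.5)
ok  <- complete.cases(X, Y)            # keep only pairs with both coordinates present
fit <- kde2d(X[ok], Y[ok], n = 100, lims = c(range(X[ok]), range(Y[ok])))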
2009 Dec 02
2
Joint density kde2d works improperly?
Dear all,
Please, look at the following code:
attach(geyser)
f1 <- kde2d(duration, waiting, n = 5)
a <- 0
for (i in 1:5) {
  for (j in 1:5) {
    a <- a + f1$z[i, j]
  }
}
As far as I understood from the help page, kde2d returns a matrix whose elements
are values of the joint probability mass function Pr(X = x, Y = y); therefore the
sum of its elements should be 1.
That is not the case in my check.
Where is
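A quick check suggests kde2d returns density values rather than point probabilities, so it is the numerical integral (the sum times the grid cell area) that should be close to 1 (a sketch using a finer grid):
library(MASS)
f1 <- kde2d(geyser$duration, geyser$waiting, n = 50)
dx <- diff(f1$x[1:2])          # grid spacing in x
dy <- diff(f1$y[1:2])          # grid spacing in y
sum(f1$z) * dx * dy            # approximate integral of the density, close to 1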
2011 Nov 24
2
Question on density values obtained from kde2d() from package MASS
Hello,
I am a little bit confused by the density values obtained from the function kde2d() from the package MASS, because they are not in the interval [0, 1] as I would expect them to be. Here is an example:
x <- c(0.0036,0.0088,0.0042,0.0022,-0.0013,0.0007,0.0028,-0.0028,0.0019,0.0026,-0.0029,-0.0081,-0.0024,0.0090,0.0088,0.0038,0.0022,0.0068,0.0089,-0.0015,-0.0062,0.0066)
y <-
2007 Jun 07
0
How to get the number of modes using kde2d
Hi,
The introduction of Silverman's paper describes how to find a mode for
one-dimensional data, based on the software at
http://www.stanford.edu/~kasparr/software/silverman.r;
for two-dimensional data I use kde2d to smooth it out first, and then I get a
matrix of densities over all of X (one dimension) crossed with Y (the other
dimension).
I sort X and Y first before I pass the values to kde2d(x, y, c(hx, hy)),
the
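One rough way to count modes from the kde2d output (a heuristic sketch, not Silverman's test itself) is to count interior grid cells that are larger than all eight of their neighbours:
library(MASS)
set.seed(1)
x <- c(rnorm(200), rnorm(200, 4))      # toy data with two clusters
y <- c(rnorm(200), rnorm(200, 4))
f <- kde2d(x, y, n = 50)
z <- f$z
modes <- 0
for (i in 2:(nrow(z) - 1)) {
  for (j in 2:(ncol(z) - 1)) {
    if (z[i, j] == max(z[(i - 1):(i + 1), (j - 1):(j + 1)])) modes <- modes + 1
  }
}
modes                                  # number of local maxima found (2 here)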
2003 Mar 02
0
gss_0.8-2
A new version of gss, version 0.8-2, is on CRAN now. Numerous new
functionalities have been added since my last r-announce post.
An ssanova1 suite has been added since version 0.7-4. It implements
low-dimensional approximations of the smoothing spline ANOVA models
of the ssanova suite. ssanova1 scales much better than ssanova with
large sample sizes.
A gssanova1 suite is added for non
2007 Jan 25
0
distribution overlap - how to quantify?
Dear R-Users,
my objective is to measure the overlap/divergence of two probability
density functions, p1(x) and p2(x). One could apply the chi-square test
or determine the potential mixture components and then compare the
respective means and sigmas. But I was rather looking for a simple
measure of similarity.
Therefore, I used the concept of 'intrinsic discrepancy' which is
defined as:
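Assuming the usual definition of the intrinsic discrepancy as the minimum of the two directed Kullback-Leibler divergences, a small numerical sketch for two densities evaluated on a common grid:
x  <- seq(-6, 6, length.out = 2000)
dx <- diff(x[1:2])
p1 <- dnorm(x, 0, 1)
p2 <- dnorm(x, 1, 1.5)
kl <- function(p, q) sum(p * log(p / q)) * dx   # directed KL; assumes q > 0 on the grid
min(kl(p1, p2), kl(p2, p1))                     # intrinsic discrepancy between p1 and p2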
2006 Sep 28
0
AIC in R
Dear R users,
According to Brockwell & Davis (1991, Section 9.3, p. 304), the penalty term for
computing the AIC criterion is "p+q+1" in the context of a zero-mean
ARMA(p,q) time series model. They arrived at this criterion (with this
particular penalty term) by estimating the Kullback-Leibler discrepancy index.
In practice, the user usually chooses the model whose estimated index is
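A quick check of that penalty with arima() (assuming a zero-mean ARMA(p, q) fit; the extra +1 accounts for the innovation variance):
set.seed(1)
x   <- arima.sim(list(ar = 0.5, ma = 0.3), n = 500)
fit <- arima(x, order = c(1, 0, 1), include.mean = FALSE)   # zero-mean ARMA(1,1)
c(AIC(fit), -2 * fit$loglik + 2 * (1 + 1 + 1))              # the two values should agree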
2005 May 06
0
FW: distance between distributions
Sorry, forgot to send this to the list originally.
-----Original Message-----
From: Mike Waters [mailto:dr.mike at ntlworld.com]
Sent: 06 May 2005 18:40
To: 'Campbell'
Subject: RE: [R] distance between distributions
-----Original Message-----
From: r-help-bounces at stat.math.ethz.ch
[mailto:r-help-bounces at stat.math.ethz.ch] On Behalf Of Campbell
Sent: 06 May 2005 11:19
To:
2005 Jan 14
1
kde2d and borders
Hello,
I want to use kde2d to visualize data on a sphere given in spherical
coordinates. Now the problem is that "phi == 2*pi = 0", so in principle
I have to connect (in a graphical view) the left and right borders of my
plot (and the bottom and top). Does anyone have an idea how to do that?
Thanks,
Manuel
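One workaround sometimes used for a periodic coordinate (a sketch, not a built-in feature of kde2d): replicate the data shifted by +/- 2*pi in phi before smoothing and keep only the original window; the same trick can be applied to the other coordinate if needed:
library(MASS)
set.seed(1)
phi   <- runif(500, 0, 2 * pi)                 # toy azimuthal angles
theta <- runif(500, 0, pi)                     # toy polar angles
phi3   <- c(phi - 2 * pi, phi, phi + 2 * pi)   # wrapped copies across the phi seam
theta3 <- rep(theta, 3)
fit <- kde2d(phi3, theta3, n = 200, lims = c(0, 2 * pi, 0, pi))
fit$z <- 3 * fit$z                             # rescale, since the data were triplicated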
2006 Jan 19
2
function kde2d
Good evening,
I am Marta Colombo, a student at Milan's Politecnico.
Thank you very much for your kindness; this mailing list is really useful.
I am using the function kde2d for two-dimensional kernel density estimation and I'd like to know something more about this kind of density estimator. In particular, I'd like to know: what bandwidth is used?
Thank you in advance for your attention
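According to ?kde2d, the default bandwidth is the normal reference rule bandwidth.nrd applied separately to each coordinate; it can be inspected or overridden explicitly (a short sketch):
library(MASS)
set.seed(1)
x <- rnorm(200); y <- rnorm(200)
c(bandwidth.nrd(x), bandwidth.nrd(y))                       # the defaults kde2d would use
f <- kde2d(x, y, h = c(bandwidth.nrd(x), bandwidth.nrd(y)), n = 100)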
2008 Oct 03
1
Point of intersection
Hi,
Say I have a normal density X ~ N(0,1) and a line y = 0.01x + 0.07. The
following code generates the plot:
x <- seq(-10, 10, length = 100)
p1 <- dnorm(x, 0, 1)
plot(x, p1, type = 'n', ylab = "Density", main = "Overlap Measure", xaxt = "n", yaxt = "n")
points(x, p1, type = 'l')
abline(0.07, 0.01)
You can see that the curves intersect at 3 points. My question is
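A sketch for locating those intersection points numerically (assuming that is where the question was heading): look for sign changes of the difference between the density and the line on a grid, then refine each interval with uniroot():
f    <- function(x) dnorm(x, 0, 1) - (0.01 * x + 0.07)  # density minus line
grid <- seq(-10, 10, length.out = 2001)
sgn  <- sign(f(grid))
idx  <- which(diff(sgn) != 0)                            # grid intervals containing a root
roots <- sapply(idx, function(i) uniroot(f, c(grid[i], grid[i + 1]))$root)
roots                                                    # the three intersection points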
2006 Jul 15
0
Validate using boolean values
The problem: how to do the validation. I have not written it
correctly!
Given two tables, sizes and prices,
with a relation: has_many / belongs_to.
A Price.unit_price for a given Size.meassure must be
marked 'true' as the standard unit_price for that measure.
There will be many other unit_prices for a given measure; those must be
marked 'false'.
table