search for: kullback

Displaying 20 results from an estimated 24 matches for "kullback".

2008 Jan 24
2
Windows Vista password dialog keeps coming up
...d users and it doesn't work. The "images" and "bernardy_images" mounts work fine. No authentication problems. smb.conf content: # Samba config file created using SWAT # from 192.168.1.103 (192.168.1.103) # Date: 2008/01/24 15:29:49 [global] netbios aliases = bernardy, kullback, cva wins support = Yes valid users = im_user, bernardy [bernardy_images] comment = Bernardy images for viewing path = /data/images/image_viewing/bernardy username = bernardy valid users = bernardy, im_user, Robert read list = bernardy, Robert write list = Robert, im_user read only = No...
2008 Jan 25
0
VPN and NetBIOS aliases
...server's main NetBIOS name but I can't use any of the aliases. Whilst inside the network the aliases work fine. Any ideas on how to get around this? # Samba config file created using SWAT # from 192.168.1.200 (192.168.1.200) # Date: 2008/01/24 20:06:30 [global] netbios aliases = bernardy, kullback, cva client NTLMv2 auth = Yes client lanman auth = No client plaintext auth = No wins support = Yes valid users = im_user, bernardy, cva, kullback [bernardy_images] comment = Bernardy images for viewing path = /data/images/image_viewing/bernardy username = bernardy valid users = bernardy,...
2008 Oct 19
0
Kullback Leibler Divergence
...=KLdiv(y1) Notice that kl1 and kl are not the same. The documentation for the package doesn't say whether the densities should be evaluated at the same points. Which one is correct? Do I need to evaluate them at the same points or not? Thanks, Lavan -- View this message in context: http://www.nabble.com/Kullback-Leibler-Divergence-tp20056359p20056359.html Sent from the R help mailing list archive at Nabble.com.
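A minimal sketch of the second reading, assuming flexmix's matrix method treats each column as density values evaluated at one common set of points (my interpretation, not confirmed by the package documentation; the data are simulated stand-ins):

# Evaluate both densities on the same grid before calling KLdiv.
library(flexmix)

set.seed(1)
x1 <- rnorm(500)              # sample from the first distribution
x2 <- rnorm(500, mean = 1)    # sample from the second distribution

r  <- range(c(x1, x2))
d1 <- density(x1, from = r[1], to = r[2], n = 512)$y
d2 <- density(x2, from = r[1], to = r[2], n = 512)$y

KLdiv(cbind(d1, d2))          # 2 x 2 matrix of pairwise divergences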
2010 Aug 04
0
Kullback–Leibler divergence question (flexmix::KLdiv) Urgent!
Hi all, x <- cbind(rnorm(500),rnorm(500)) KLdiv(x, eps=1e-4) KLdiv(x, eps=1e-5) KLdiv(x, eps=1e-6) KLdiv(x, eps=1e-7) KLdiv(x, eps=1e-8) KLdiv(x, eps=1e-9) KLdiv(x, eps=1e-10) ... KLdiv(x, eps=1e-100) ... KLdiv(x, eps=1e-1000) When calling flexmix::KLdiv with the code above, the results keep increasing the smaller I make the accuracy parameter 'eps', until finally reaching
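A toy illustration of one plausible mechanism, assuming values below 'eps' get floored at 'eps' before the log-ratio is taken (this mimics, rather than reproduces, flexmix's internals):

# Wherever one column is essentially zero, the log-ratio grows roughly
# like -log(eps), so the estimate keeps climbing as eps shrinks.
kl_floored <- function(p, q, eps) {
  p <- pmax(p, eps); q <- pmax(q, eps)
  p <- p / sum(p);   q <- q / sum(q)     # renormalise after flooring
  sum(p * log(p / q))
}

p <- c(0.5, 0.5, 0)      # third cell has no mass under p
q <- c(0,   0.5, 0.5)    # first cell has no mass under q
sapply(c(1e-4, 1e-8, 1e-16), function(e) kl_floored(p, q, e))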
2003 Mar 05
8
How to draw several plots in one figure?
Hey, I want to draw several plots sequentially, but need to make them display in one figure. How can I achieve this? Thanks. Fred
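One common base-graphics answer: split the device into panels with par(mfrow = ...) and draw the plots in sequence (layout() offers more control over panel sizes). A minimal sketch:

old <- par(mfrow = c(2, 2))   # 2 rows x 2 columns of panels
plot(rnorm(50), main = "panel 1")
hist(rnorm(100), main = "panel 2")
boxplot(rnorm(30), main = "panel 3")
plot(density(rnorm(200)), main = "panel 4")
par(old)                      # restore the previous layout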
2010 Jul 09
1
KLdiv produces NA. Why?
I am trying to calculate a Kullback-Leibler divergence from two vectors of integers but get NA as a result when trying to calculate the measure. Why? x <- cbind(stuff$X, morestuff$X) x[1:5,] [,1] [,2] [1,] 293 938 [2,] 293 942 [3,] 297 949 [4,] 290 956 [5,] 294 959 KLdiv(x) [,1] [,2] [1,] 0 NA [2,]...
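A sketch of one possible fix, assuming the NA comes from feeding raw observations where the matrix method expects density values on a shared grid (a guess, not a confirmed diagnosis; 'a' and 'b' below are simulated stand-ins for stuff$X and morestuff$X):

# Convert the raw samples to densities on a common grid first.
library(flexmix)

a <- rpois(100, 300)
b <- rpois(100, 950)

r  <- range(c(a, b))
da <- density(a, from = r[1], to = r[2], n = 256)$y
db <- density(b, from = r[1], to = r[2], n = 256)$y

KLdiv(cbind(da, db))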
2010 Jul 15
1
Repeated analysis over groups / Splitting by group variable
I am performing some analysis over a large data frame and would like to run the same analysis repeatedly over grouped subsets. How can I do that? Here is some example code for clarification: require("flexmix") # for Kullback-Leibler divergence n <- 23 groups <- c(1,2,3) mydata <- data.frame( sequence=c(1:n), data1=c(rnorm(n)), data2=c(rnorm(n)), group=rep(sample(groups, n, replace=TRUE)) ) # Part 1: full stats (works fine) dataOnly <- cbind(mydata$data1, mydata$data2, mydata$group) KLdiv(dataOnly) # #...
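One standard pattern for this, sketched against the 'mydata' frame from the post: split the data frame by the grouping variable and apply the analysis to each piece (by() or the plyr/dplyr families work just as well).

library(flexmix)   # for KLdiv

perGroup <- lapply(split(mydata, mydata$group), function(d) {
  KLdiv(cbind(d$data1, d$data2))
})
perGroup            # one divergence matrix per group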
2005 May 06
1
distance between distributions
...i, This is more of a general stat question. I am looking for an easily computable measure of the distance between two empirical distributions. Say I have two samples x and y drawn from X and Y. I want to compute a statistic rho(x,y) which is zero if X = Y and grows as X and Y become less similar. The Kullback-Leibler distance is the most "official" choice; however, it needs an estimate of the density. Estimating the density requires one to choose a family of distributions to fit, or to use some sort of non-parametric estimate. I have no intuition whether the resulting KL distanc...
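One density-free alternative (an option, not what the poster settled on): the two-sample Kolmogorov-Smirnov statistic, which is zero when the empirical CDFs coincide and grows as they separate. A minimal sketch:

set.seed(1)
x <- rnorm(200)
y <- rnorm(200, mean = 0.5)

ks <- ks.test(x, y)
ks$statistic    # sup-norm distance between the two empirical CDFs
ks$p.value      # and a test of X = Y comes for free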
2008 Jan 10
1
Entropy/KL-Distance question
Dear R-Users, I have the CDF of a discrete probability distribution. I now observe a change in this CDF at one point. I would like to find a new CDF such that it has the shortest Kullback-Leibler distance to the original CDF and respects my new observation. Is there an existing package in R which will let me do this? Google searches based on entropy revealed nothing. Kind regards, Tolga
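No dedicated package comes to mind, but for a single constraint of the form Q(X <= a) = c the minimiser of KL(Q || P) has a simple closed form: rescale the original probabilities separately on each side of a so the constraint holds. A small sketch of that idea, assuming the KL(Q || P) direction (the function name and the toy pmf are made up for illustration):

# Rescale p on {x <= a} and {x > a} so the new CDF hits c at a.
project_cdf <- function(p, x, a, c) {
  left <- x <= a
  q <- p
  q[left]  <- p[left]  * c       / sum(p[left])
  q[!left] <- p[!left] * (1 - c) / sum(p[!left])
  q
}

x <- 1:6
p <- rep(1/6, 6)                        # a fair die
q <- project_cdf(p, x, a = 2, c = 0.5)  # force P(X <= 2) = 0.5
cumsum(q)                               # new CDF respects the observation
sum(q * log(q / p))                     # its KL divergence from the original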
2008 May 09
2
Vista and "System error 53 has occurred"
...sers = bernardy, im_user read list = bernardy, im_user write list = im_user force user = bernardy read only = No [homes] comment = Home Directories invalid users = root, admin, bin, daemon, sys, adm, uucp, nuucp, smmsp, listen, gdm, webservd, rpollard, mysql valid users = im_user, bernardy, kullback, cva [images] comment = Image administrator access point path = /data/images username = im_user valid users = im_user read only = No [cva_images] comment = CVA image viewing directory path = /data/images/image_viewing/cva username = cva valid users = cva, im_user, Robert read list = im_...
2008 Sep 29
2
density estimate
Hi, I have a vector of random variables and I'm estimating the density using the "bkde" function in the KernSmooth package. The output contains two vectors (x and y); the R documentation calls y the density estimate, but my y-values don't look like exact density estimates (since some are larger than 1)! What is y here? Is it possible to get the true estimated density at each
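Density values above 1 are not an error for a continuous variable: it is the area under the curve that should be close to 1, not the individual heights. A quick check along those lines (simulated data chosen so the heights exceed 1):

library(KernSmooth)

x <- rnorm(1000, sd = 0.1)     # small sd => tall, narrow density
est <- bkde(x)
max(est$y)                     # typically well above 1 here
# trapezoid-rule integral over the grid should be close to 1:
sum(diff(est$x) * (head(est$y, -1) + tail(est$y, -1)) / 2)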
2007 Aug 03
3
question about logistic models (AIC)
An embedded text with an unspecified character set was scrubbed from the message ... Name: not available Url: https://stat.ethz.ch/pipermail/r-help/attachments/20070803/79b6292b/attachment.pl
2010 Jun 23
4
Comparing distributions
I am trying to do something in R and would appreciate a push in the right direction. I hope some of you experts can help. I have two distributions, each obtained from about 10000 datapoints (non-normal with a multi-modal shape when eye-balling the densities, but other than that I know little about them). When plotting the two distributions together I can see that
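As a first push: overlaying the two density estimates and drawing a QQ-plot are both one-liners in base R and show where the samples differ before any formal statistic is chosen. A sketch with simulated stand-ins for the two samples:

a <- rnorm(10000)
b <- c(rnorm(5000), rnorm(5000, mean = 3))   # deliberately bimodal

plot(density(a), main = "Density overlay", xlim = range(c(a, b)))
lines(density(b), lty = 2)

qqplot(a, b, main = "QQ-plot of the two samples")
abline(0, 1, lty = 3)   # departures from this line show where they differ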
2003 Mar 02
0
gss_0.8-2
...azard as "bivariate" smooth functions of time and covariates through penalized full likelihood. It only takes static covariates but accommodates interactions between time and covariates, going beyond the proportional hazard models. Utilities are provided for the calculation of a certain Kullback-Leibler projection of cross-validated fits to "reduced model" spaces, for the "testing" of model terms. Projection code is provide for ssanova1, gssanova1, ssden, and sshzd fits. Further details are to be found in the documentations and the examples therein. As always, featur...
2007 Jan 25
0
distribution overlap - how to quantify?
...\delta(p_{1},p_{2}) = \min\left\{ \int_{\chi}p_{1}(x)\log\frac{p_{1}(x)}{p_{2}(x)}\,dx,\ \int_{\chi}p_{2}(x)\log\frac{p_{2}(x)}{p_{1}(x)}\,dx \right\} The smaller the delta, the more similar the distributions (0 when identical). I implemented this in 'R' using an adaptation of the Kullback-Leibler divergence. The function works and I get the expected results. The question is how to interpret them. Obviously a delta of 0.5 reflects more similarity than a delta of 2.5. But how much more? Is there some kind of a statistical test for such an index (other than a simulation based eva...
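For reference, a grid-based sketch of that delta (the smaller of the two directed divergences), written from the formula above rather than taken from the poster's code; both densities are evaluated on one shared grid and a small floor avoids log(0):

min_kl <- function(x1, x2, n = 512, eps = 1e-10) {
  r  <- range(c(x1, x2))
  p1 <- density(x1, from = r[1], to = r[2], n = n)
  p2 <- density(x2, from = r[1], to = r[2], n = n)
  dx <- diff(p1$x)[1]
  y1 <- pmax(p1$y, eps); y2 <- pmax(p2$y, eps)
  kl12 <- sum(y1 * log(y1 / y2)) * dx   # rectangle-rule integral
  kl21 <- sum(y2 * log(y2 / y1)) * dx
  min(kl12, kl21)
}

min_kl(rnorm(1000), rnorm(1000, mean = 1))   # similar shapes, small delta
min_kl(rnorm(1000), rexp(1000))              # less similar, larger delta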
2008 Oct 04
0
difference between sm.density() and kde2d()
Dear R users, I used the sm.density() function in the sm package and kde2d() in the MASS package to estimate a bivariate density. Then I calculated the Kullback-Leibler divergence measure between a distribution and each of the estimated densities, but the answers are different. Is there any difference between the kde2d and sm.density estimates? If there is a difference, which is the better estimate? Thank you, lavan -- View this message in conte...
2009 Jul 03
1
is AIC always 100% in evaluating a model?
Hello, generally speaking it seems clear when an independent variable can be ruled out; on the other hand, with AIC in R's bbmle, if one finds a better AIC value for a model without a given independent variable than for the same model with it, can we say that the independent variable is not likely to be significant (in the ordinary sense!)? That is, having made a lot of
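AIC ranks models rather than testing coefficients, so "not significant" is not quite the right reading; an AIC drop of roughly 2 or more when the variable is removed is the usual informal yardstick. A base-R illustration of the comparison (glm() instead of bbmle::mle2, purely to keep the sketch self-contained):

set.seed(42)
n  <- 200
x1 <- rnorm(n); x2 <- rnorm(n)
y  <- rbinom(n, 1, plogis(0.8 * x1))          # x2 has no real effect

m_full    <- glm(y ~ x1 + x2, family = binomial)
m_reduced <- glm(y ~ x1,      family = binomial)

AIC(m_full, m_reduced)   # lower is preferred; treat the difference as a
                         # rule of thumb, not a p-value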
2006 Sep 28
0
AIC in R
Dear R users, According to Brockwell & Davis (1991, Section 9.3, p.304), the penalty term for computing the AIC criterion is "p+q+1" in the context of a zero-mean ARMA(p,q) time series model. They arrived at this criterion (with this particular penalty term) by estimating the Kullback-Leibler discrepancy index. In practice, the user usually chooses the model whose estimated index is minimal. Consequently, it seems that the theory and the interpretation are only available in the case of a zero-mean ARMA model, at least in the time series context. Concerning R, it seems that the...
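A quick numerical check of that penalty, assuming (as I understand it) that stats::arima counts the innovation variance as the extra "+1" parameter when the mean is excluded:

set.seed(7)
x   <- arima.sim(model = list(ar = 0.5, ma = 0.3), n = 500)
fit <- arima(x, order = c(1, 0, 1), include.mean = FALSE)

p <- 1; q <- 1
AIC(fit)
-2 * as.numeric(logLik(fit)) + 2 * (p + q + 1)   # should match AIC(fit)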
2005 May 06
0
FW: distance between distributions
...i, This is more of a general stat question. I am looking for an easily computable measure of the distance between two empirical distributions. Say I have two samples x and y drawn from X and Y. I want to compute a statistic rho(x,y) which is zero if X = Y and grows as X and Y become less similar. The Kullback-Leibler distance is the most "official" choice; however, it needs an estimate of the density. Estimating the density requires one to choose a family of distributions to fit, or to use some sort of non-parametric estimate. I have no intuition whether the resulting KL distanc...