similar to: deviance vs entropy

Displaying 20 results from an estimated 7000 matches similar to: "deviance vs entropy"

2004 Feb 03
1
Stereo mode settings
Hi everyone, I'd like a bit of information about the stereo mode settings in psych_44.h of libvorbis. Specifically, about the adj_stereo structure: typedef struct { int pre[PACKETBLOBS]; int post[PACKETBLOBS]; float kHz[PACKETBLOBS]; float lowpasskHz[PACKETBLOBS]; } adj_stereo; What I am attempting to do is bring lossless stereo coupling down to the lower quality levels in
2003 Oct 31
1
constrained nonlinear optimisation in R?
Hello. I have searched the archives but have not found anything. I need to solve a constrained optimisation problem for a nonlinear function (“maximum entropy formalism”). Specifically, Optimise: -1*SUM(p_i*log(p_i)) for a vector p_i of probabilities, conditional on a series of constraints of the form: SUM(T_i*p_i)=k_i for given values of T_i and k_i (these are constraints on
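A minimal sketch of one way to attack this in base R, not the poster's setup: the probability-simplex constraint is handled by a softmax reparameterisation and a single linear constraint SUM(T_i*p_i)=k_i by a quadratic penalty; T1, k1 and the penalty weight are illustrative values.
    T1 <- c(1, 2, 3, 4)                        # hypothetical T_i
    k1 <- 2.5                                  # hypothetical k
    obj <- function(theta) {
      p <- exp(theta) / sum(exp(theta))        # softmax: p_i > 0, sum(p) = 1
      entropy <- -sum(p * log(p))
      penalty <- 1e4 * (sum(T1 * p) - k1)^2    # soft equality constraint
      -entropy + penalty                       # optim() minimises
    }
    fit <- optim(rep(0, length(T1)), obj, method = "BFGS")
    p_hat <- exp(fit$par) / sum(exp(fit$par))  # constrained max-entropy solution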
2006 Apr 23
1
lme: null deviance, deviance due to the random effects, residual deviance
A maybe trivial and stupid question: In the case of a lm or glm fit, it is quite informative (to me) to have a look at the null deviance and the residual deviance of a model. This is generally provided in the print method or the summary, eg: Null Deviance: 658.8 Residual Deviance: 507.3 and (a bit simple-minded) I like to think that the proportion of deviance 'explained' by the
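A minimal sketch of that proportion for a glm, using a built-in dataset rather than the poster's model:
    fit <- glm(am ~ wt + hp, data = mtcars, family = binomial)
    1 - fit$deviance / fit$null.deviance   # share of the null deviance 'explained'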
2001 Dec 10
2
distributions w. skewness & kurtosis
Is there some reasonable way to generate random data from a distribution that has some degree of skewness and/or kurtosis, but would otherwise be normal? thanks,
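One possible answer, sketched with the 'sn' (skew-normal) package; the package choice and parameter values are assumptions, not from the thread.
    library(sn)
    x <- rsn(n = 1000, xi = 0, omega = 1, alpha = 5)   # alpha > 0 adds right skew to a normal base
    hist(x, breaks = 40)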
2011 Nov 10
1
Sum of the deviance explained by each term in a gam model does not equal the deviance explained by the full model.
Dear R users, I read your methods of extracting the variance explained by each predictor in different places. My question is: using the method you suggested, the sum of the deviance explained by all terms is not equal to the deviance explained by the full model. Could you tell me what causes this problem? > set.seed(0) > n <- 400 > x1 <- runif(n, 0, 1) > ## to see problem
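A sketch of the general phenomenon with mgcv on simulated, deliberately correlated predictors (not the poster's script or the method from the earlier thread): per-term deviances need not add up to the full model's.
    library(mgcv)
    set.seed(0)
    n  <- 400
    x1 <- runif(n)
    x2 <- x1 + rnorm(n, sd = 0.1)              # correlated with x1 on purpose
    y  <- sin(2 * pi * x1) + rnorm(n, sd = 0.3)
    full  <- gam(y ~ s(x1) + s(x2))
    only1 <- gam(y ~ s(x1))
    only2 <- gam(y ~ s(x2))
    summary(only1)$dev.expl + summary(only2)$dev.expl   # sum of single-term fits
    summary(full)$dev.expl                              # typically not the same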
2006 May 08
2
On the speed of apply and alternatives?
Dear all, I have to handle a large matrix (1000 x 10001) where the last column holds a value that all the preceding values in the same row have to be compared to. I have written the following code: # generate a (1000 x 10001) matrix, testm # generate statistics matrix 1000 x 4: qnt <- c(0.01, 0.05) cmp_fun <- function(x) { LAST <- length(x) smpls <- x[1:(LAST-1)] real
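A sketch of one vectorised alternative to apply() for the row-wise comparison described above; the matrix size and the counting rule are illustrative, not the poster's.
    m   <- matrix(rnorm(1000 * 1001), nrow = 1000)   # smaller than the 1000 x 10001 case, same idea
    ref <- m[, ncol(m)]                              # last column: one reference value per row
    hits <- rowSums(m[, -ncol(m)] > ref)             # ref recycles down each column, so every row
                                                     # is compared to its own last value, no loop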
2002 Jul 12
2
Crosstabs in R
Before I reinvent the wheel, I have need for a relatively straightforward crosstabulation (2 x n) function. I know that R has table(), ftable(), xtabs(), and summary(xtabs()), but none of these produce a fully "tricked-out" cross-tabulation with marginal totals, expected cell frequencies, and an array of statistics about the contingency table. Is there a more complete (something
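One candidate answer, sketched with gmodels::CrossTable(); the package choice and the example data are assumptions, not from the thread.
    library(gmodels)
    # SPSS/SAS-style 2 x n table with marginal totals, expected counts and chi-square output
    CrossTable(mtcars$am, mtcars$cyl, expected = TRUE, chisq = TRUE)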
2010 Jul 30
2
svydesign syntax and deviance!
An embedded text encoded in an unknown character set was scrubbed... Name: not available URL: <https://stat.ethz.ch/pipermail/r-help/attachments/20100731/ac3b9e43/attachment.pl>
2009 Nov 10
1
Calculating the percentage of explained deviance in lmer
Dear all, I am trying to calculate some measure of the amount of variability in the response variable that is explained by a model fitted in lmer: m1 <- lmer(response_var ~ Condition + (1|Site/Area/Transect), family="binomial"). I've seen from the literature that the percentage of explained deviance is a common measure. How can I calculate it? Thanks a lot for your help, I hope this
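A sketch of the 1 - dev(model)/dev(null) idea for a binomial mixed model with lme4 (glmer in current versions); the data, names and single random effect are illustrative, and this is only one of several proposed pseudo-R^2 measures for mixed models.
    library(lme4)
    set.seed(1)
    d <- data.frame(Site      = factor(rep(1:20, each = 25)),
                    Condition = rep(c("A", "B"), 250))
    d$y <- rbinom(500, 1, ifelse(d$Condition == "B", 0.7, 0.4))
    m1 <- glmer(y ~ Condition + (1 | Site), data = d, family = binomial)
    m0 <- glmer(y ~ 1         + (1 | Site), data = d, family = binomial)
    1 - deviance(m1) / deviance(m0)          # rough 'explained deviance'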
2006 Mar 20
1
discrete entropy is not rotation invariant?
Hello, suppose one is forming a probability p(x,y), where the x,y axes are somewhat accidental and rotation is possible. I'm thinking about whether the discrete entropy H(x,y) should change if the probability is rotated in the x,y plane. My current conclusion is that it _does_ change, at least if the entropy is estimated via bins. As a simple example, suppose the probability mass is
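A numerical sketch of the observation: binned 2-D entropy of the same elongated point cloud before and after a 45-degree rotation; the bin width and sample are arbitrary choices, not from the post.
    set.seed(1)
    x <- rnorm(1e5, sd = 3)
    y <- rnorm(1e5, sd = 0.3)
    R <- matrix(c(cos(pi/4), sin(pi/4), -sin(pi/4), cos(pi/4)), 2, 2)
    xy_rot <- cbind(x, y) %*% R
    binned_H <- function(a, b, w = 1) {
      p <- table(floor(a / w), floor(b / w)) / length(a)   # unit-width axis-aligned bins
      -sum(p * log(p))                                     # only occupied cells appear in p
    }
    binned_H(x, y)
    binned_H(xy_rot[, 1], xy_rot[, 2])   # generally differs from the line above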
2012 Jan 13
1
deviance and variance - GAM models
Hi all, This is pretty basic but I am not an expert and I couldn't find anything in the forum or my statistics book about it. I was reading a paper and the authors were using both "explained deviance" and "explained variance" as synonyms. They were describing a GAM regression. Is that right? I performed an analysis in R to take a look at the output of GAM regression and I
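A sketch with mgcv on toy data: summary.gam reports both quantities, which are close in spirit for a Gaussian GAM but are not the same number; the data are illustrative.
    library(mgcv)
    set.seed(1)
    x <- runif(300)
    y <- sin(2 * pi * x) + rnorm(300, sd = 0.3)
    fit <- gam(y ~ s(x))
    summary(fit)$dev.expl   # proportion of deviance explained
    summary(fit)$r.sq       # adjusted r-squared (variance-based)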
2017 May 26
3
Low random entropy
I am used to low random entropy on my ARM boards, not on an Intel. On my Lenovo x120e, cat /proc/sys/kernel/random/entropy_avail reports 3190 bits of entropy. On my armv7 with CentOS 7 I would get 130 unless I installed rng-tools, and then I get ~1300. SSH into one and it drops back to 30! for a few minutes. Sigh. Anyway, on my new Zotac nano ad12 with an AMD E-1800 dual core, I am seeing 180.
2013 Apr 03
3
Deviance in Zero inflated models
Dear list, I am running some zero inflated models and would like to know what the deviance of the models is. Unlike a normal GLM, where the deviance is displayed in the summary, all that is displayed in a summary of the zero inflated model is the log likelihood. I hope this isn't a read-the-manual question, and if it is I apologize for wasting your time, but if you could still send me a
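A sketch, assuming the models come from pscl::zeroinfl (the package is not named in the excerpt): a deviance-like quantity can be taken as -2*logLik, keeping in mind that zeroinfl has no saturated-model deviance in the glm sense.
    library(pscl)
    data("bioChemists", package = "pscl")
    zi <- zeroinfl(art ~ fem + ment | 1, data = bioChemists)   # illustrative model
    -2 * as.numeric(logLik(zi))   # deviance-like statistic, useful for model comparison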
2012 Sep 05
1
Sharing entropy across VMs
Short question: is it possible to share entropy across all VMs and how can this be done? I found http://vanheusden.com/entropybroker/ which seems to offer a way, but I cannot get it to work; syslog tells me "stack smashing detected". So if there are ways people are currently using, I'd be interested to hear them. Currently I find that even generating a gpg key on a PV VM takes
2005 Jul 08
1
explained deviance in multinom
Hi: I'm working with multinomial models with library nnet, and I'm trying to get the explained deviance (pseudo R^2) of my models. I am assuming that: pseudo R^2 = 1 - dev(model) / dev(null), where dev(model) is the deviance for the fitted model and dev(null) is the deviance for the null model (with the intercept only). library(nnet) full.model <- multinom(cbind(factor1,
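A self-contained sketch of exactly that pseudo-R^2 with nnet::multinom on toy data (the poster's cbind(...) response is truncated above, so an ordinary factor response is used here):
    library(nnet)
    set.seed(1)
    d <- data.frame(y = factor(sample(letters[1:3], 200, replace = TRUE)),
                    x = rnorm(200))
    full <- multinom(y ~ x, data = d, trace = FALSE)
    null <- multinom(y ~ 1, data = d, trace = FALSE)
    1 - deviance(full) / deviance(null)   # pseudo R^2 as defined above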
2011 Apr 08
1
multinom() residual deviance
Running a binary logit model on the data df <- data.frame(y=sample(letters[1:3], 100, repl=T), x=rnorm(100)) reveals some residual deviance: summary(glm(y ~ ., data=df, family=binomial("logit"))) However, running a multinomial model on that data (multinom, nnet) reveals a residual deviance: summary(multinom(y ~ ., data=df)) On page 203, the MASS book says that "here the
2008 Jul 08
1
calculation of entropy in R???
I want to calculate Shannon entropy, which is H1, H2, H3, ... up to H7. Is there any function or any package in which I can find this entropy directly? If you have any information, please share it and I will be very thankful to you. Regards, MUHAMMAD FAISAL Department of Statistics and Decision Support System, University of
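A minimal base-R sketch of a Shannon entropy function for a discrete sample; CRAN packages such as 'entropy' offer ready-made estimators as well, but this shows the calculation itself.
    shannon_entropy <- function(x, base = 2) {
      p <- table(x) / length(x)          # empirical probabilities of the observed values
      -sum(p * log(p, base = base))      # H in bits when base = 2
    }
    shannon_entropy(c("a", "a", "b", "c"))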
2009 May 27
1
Deviance explained in GAMM, library mgcv
Dear R-users, To obtain the percentage of deviance explained when fitting a gam model using the mgcv library is straightforward: summary(object.gam)$dev.expl, or alternatively, using the deviances (deviance(object.gam)) of the null and the fitted models and then taking 1 minus the quotient of deviances. However, when a gamm (generalized additive mixed model) is fitted, the
2012 Jan 16
1
GAM without intercept reports a huge deviance
Hi all, I constructed a GAM model with a linear term and two smooth terms, all of them statistically significant, but the intercept was not significant. The adjusted r2 of this model is 0.572 and the deviance 65.3. I decided to run the model again without the intercept, so I used the following instruction in R: regression = gam(dependent ~ linear_independent + s(smooth_independent_1)
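A sketch of one likely explanation with mgcv on toy data: the smooth terms carry sum-to-zero centering constraints, so dropping the intercept removes the model's ability to fit the overall mean level and the reported deviance jumps; the data are illustrative, not the poster's.
    library(mgcv)
    set.seed(1)
    x <- runif(200)
    y <- 5 + sin(2 * pi * x) + rnorm(200, sd = 0.2)
    with_int    <- gam(y ~ s(x))
    without_int <- gam(y ~ s(x) - 1)
    deviance(with_int)      # small
    deviance(without_int)   # much larger: the mean level of 5 cannot be fitted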
2000 Jul 28
3
log likelihood and deviance
I'm fitting glm models and the summary gives the deviance of the model. I would like to obtain the log likelihood. How can I do this? Thanks
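A sketch of the direct route: logLik() works on glm fits. For ungrouped binary data the saturated-model log-likelihood is zero, so the residual deviance equals -2*logLik; the example data are illustrative.
    fit <- glm(am ~ wt, data = mtcars, family = binomial)
    logLik(fit)
    -2 * as.numeric(logLik(fit))   # equals deviance(fit) for this 0/1 response
    deviance(fit)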