Displaying 20 results from an estimated 900 matches similar to: "gelman.diag question"
2004 Feb 17
0
A log on Bayesian statistics, stochastic cost frontier, Monte Carlo Markov chains, Bayesian P-values
Dear friends,
Over the past weeks, I have been asking a lot of questions about how to
use R in Bayesian analysis. I am brand new to R, but I am very pleased with
it. I started with WinBUGS, but I found it to be limited software: not bad,
but it has several limitations. By contrast, R allows the analyst to tackle
any problem with a huge set of tools for any kind of analysis. I love R. In
2004 Feb 12
1
How do you create a "MCMC" object?
I have been running a Gibbs sampler to estimate levels of efficiency in the
Louisiana shrimp industry. I created a matrix (samp) where I stored the
results of each iteration for 86 variables. I ran 10,000 iterations, so the
matrix samp is 10,000 x 86. I want to use the Gelman-Rubin test to check for
convergence. To do that, I need at least two chains. If I run a second chain
with different starting
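A minimal sketch of one way to do this with the coda package (the matrices below are simulated stand-ins for the stored Gibbs draws, not the poster's data): wrap each run as an mcmc object, collect the runs in an mcmc.list, and pass that to gelman.diag().
library(coda)
set.seed(1)
samp1 <- matrix(rnorm(10000 * 86), ncol = 86)   # stand-in for the first run's 10,000 x 86 draws
samp2 <- matrix(rnorm(10000 * 86), ncol = 86)   # a second run started from different values
chains <- mcmc.list(mcmc(samp1), mcmc(samp2))   # one mcmc object per chain
gelman.diag(chains, multivariate = FALSE)       # Gelman-Rubin psrf for each of the 86 variables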
2004 Mar 04
1
Gelman-Rubin Convergence test
Dear friends,
I ran the Gelman-Rubin convergence test for an MCMC object I have, and I
got the following result: Multivariate psrf 1.07+0i. What does this mean? I
guess (if I am not mistaken) that I should get a psrf close to 1.00, but what
is 1.07+0i? Is that convergence or something else?
Jorge
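A hedged sketch of where that number comes from: gelman.diag() returns the per-parameter psrf in $psrf and the multivariate psrf in $mpsrf; in 1.07+0i the imaginary part is zero, so the real part (about 1.07) is the value to compare with 1. The toy chains below are invented for illustration only.
library(coda)
set.seed(1)
ch1 <- mcmc(matrix(rnorm(2000), ncol = 2))
ch2 <- mcmc(matrix(rnorm(2000), ncol = 2))
gd <- gelman.diag(mcmc.list(ch1, ch2), multivariate = TRUE)
gd$psrf        # point estimate and upper CI for each parameter
Re(gd$mpsrf)   # multivariate psrf; values near 1 suggest convergence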
2003 Apr 18
1
MCMCpack gelman.plot and gelman.diag
Hi,
A question. When I run gelman.diag and gelman.plot
on mcmc lists obtained from MCMCregress, the results are as follows.
> post.R <- MCMCregress(Size~Age+Status, data = data, burnin = 5000, mcmc = 100000,
+ thin = 10, verbose = FALSE, beta.start = NA, sigma2.start = NA,
+ b0 = 0, B0 = 0, nu = 0.001, delta = 0.001)
> post1.R <- MCMCregress(Size~Age+Status, data
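A hedged sketch of the usual pattern (the formula and data frame are the poster's; the seeds are illustrative): MCMCregress() returns an mcmc object, so two runs started with different seeds can be combined into an mcmc.list before calling gelman.diag() and gelman.plot().
library(MCMCpack)
library(coda)
post.R  <- MCMCregress(Size ~ Age + Status, data = data, burnin = 5000,
                       mcmc = 100000, thin = 10, seed = 1)
post1.R <- MCMCregress(Size ~ Age + Status, data = data, burnin = 5000,
                       mcmc = 100000, thin = 10, seed = 2)
posts <- mcmc.list(post.R, post1.R)   # both chains in one object
gelman.diag(posts)
gelman.plot(posts)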
2011 Mar 17
0
Gelman-Rubin convergence diagnostics via coda package
Dear,
I'm trying to run diagnostics on an MCMC analysis (fitting a log-linear
model to rates data). I'm getting an error message when trying to produce the
Gelman-Rubin shrink factor plot:
>gelman.plot(out)
Error in chol.default(W) :
the leading minor of order 2 is not positive definite
I take it that somewhere, somehow a matrix is singular, but how can
that be remedied?
My code:
library(rjags)
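The listing above is cut off before the model code. One commonly suggested workaround, sketched here assuming out is the mcmc.list returned by coda.samples(): the Cholesky failure typically comes from the multivariate shrink-factor computation (chol of the pooled within-chain covariance W), so the per-parameter diagnostics can still be computed, and constant or near-collinear monitored quantities can be dropped before plotting.
library(coda)
gelman.diag(out, multivariate = FALSE)     # univariate psrf only; skips chol(W)
keep <- apply(as.matrix(out), 2, sd) > 0   # drop monitored nodes that never vary
gelman.plot(out[, varnames(out)[keep]])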
2004 Feb 16
0
How do we obtain Posterior Predictive (Bayesian) P-values in R (asking a second time)
Dear Friends,
According to Gelman et al (2003), "...Bayesian P-values are defined as
the probability that the replicated data could be more extreme than the
observed data, as measured by the test quantity p = Pr[T(y_rep, theta) >=
T(y, theta) | y]..." where p = Bayesian P-value, T = test statistic, y_rep = data
from the replicated experiment, y = data from the original experiment, theta = the
function
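A worked toy example of the quoted definition (everything below is invented for illustration: a normal model with known variance, a flat prior, and a test quantity that depends on y only).
set.seed(1)
y <- rnorm(50, mean = 2, sd = 1)                 # "observed" data
n <- length(y)
theta <- rnorm(1000, mean(y), 1 / sqrt(n))       # posterior draws of the mean
T_obs <- max(y)                                  # test quantity T(y)
T_rep <- sapply(theta, function(th) max(rnorm(n, th, 1)))   # T(y_rep) for each draw
mean(T_rep >= T_obs)                             # Bayesian (posterior predictive) P-value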
2012 Oct 03
0
calculating gelman diagnostic for mice object
I am using -mice- for multiple imputation and would like to use the Gelman
diagnostic in -coda- to assess the convergence of my imputations. However,
gelman.diag requires an mcmc.list as input. van Buuren and
Groothuis-Oudshoorn (2011) recommend running mice step-by-step to assess
convergence (e.g. imp2 <- mice.mids(imp1, maxit = 3, print = FALSE)), but
this creates mids objects. How can I
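A hedged sketch of one route, using the built-in nhanes data rather than the poster's, and assuming the chainMean component of a mids object is a variables x iterations x streams array (as documented for mice): pull out the chain means for one imputed variable and wrap each imputation stream as an mcmc object.
library(mice)
library(coda)
set.seed(1)
imp <- mice(nhanes, m = 5, maxit = 20, printFlag = FALSE)
cm  <- imp$chainMean["bmi", , ]              # iterations x imputation streams for one variable
streams <- lapply(seq_len(ncol(cm)), function(j) mcmc(cm[, j]))
gelman.diag(do.call(mcmc.list, streams))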
2004 Feb 05
5
rgamma question
I was trying to generate random numbers with a gamma distribution. In R the
function is:
rgamma(n, shape, rate = 1, scale = 1/rate). My question is: if
X ~ gamma(alpha, beta) and I want to generate one random number, where do I
plug alpha and beta into rgamma? And what is the meaning and use of rate?
Thanks for your attention,
Jorge
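A hedged note on the parameterization (alpha and beta below are placeholders): rgamma's shape argument is alpha; rate corresponds to beta when gamma(alpha, beta) is written with mean alpha/beta, while scale = 1/rate corresponds to beta when it is written with mean alpha*beta. Pass only one of rate or scale.
alpha <- 2
beta  <- 3
rgamma(1, shape = alpha, rate = beta)    # beta used as a rate; mean alpha/beta
rgamma(1, shape = alpha, scale = beta)   # beta used as a scale; mean alpha*beta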
2010 May 28
3
Gelman 2006 half-Cauchy distribution
Hi,
I am trying to recreate the right-hand graph on page 524 of Gelman's 2006
paper "Prior distributions for variance parameters in hierarchical
models" in Bayesian Analysis, 3, 515-533. I am only interested, however,
in recreating the portion of the graph showing the overlaid prior density
for the half-Cauchy with scale 25, not the posterior distribution.
However, when I try:
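A minimal sketch of just the prior-density overlay, taking the half-Cauchy(0, 25) density on [0, Inf) to be twice the Cauchy density folded at zero:
curve(2 * dcauchy(x, location = 0, scale = 25), from = 0, to = 200,
      xlab = "sigma", ylab = "prior density")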
2011 Feb 24
2
MCMCpack combining chains
Dear all, since MCMClogit does not allow the specification of several chains, I have run my model 3 times with different random-number seeds and differently dispersed multivariate normal priors.
For example:
res1 = MCMClogit(y~x,b0=0,B0=0.001,data=mydat, burnin=500, mcmc=5500, seed=1234, thin=5)
res2 = MCMClogit(y~x,b0=1,B0=0.01,data=mydat, burnin=500, mcmc=5500, seed=5678, thin=5)
res3 =
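Assuming res3 is the third run set up analogously to the two shown, a short sketch of combining the runs by hand with coda (for the Gelman-Rubin diagnostic the runs would normally share the same prior and differ only in seeds and starting values, so that all chains target the same posterior):
library(coda)
res.all <- mcmc.list(res1, res2, res3)   # MCMClogit() returns mcmc objects
gelman.diag(res.all)
gelman.plot(res.all)
summary(res.all)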
2007 Feb 11
2
problem with Matrix package
I decided to update my packages and then had a problem with loading the
Matrix package
http://cran.at.r-project.org/bin/windows/contrib/2.4/Matrix_0.9975-9.zip
This is what happened when I tried to load it in:
> library("Matrix")
Error in importIntoEnv(impenv, impnames, ns, impvars) :
object 'Logic' is not exported by 'namespace:methods'
Error:
2006 Feb 01
1
student-t regression in R?
Is there a quick way to fit Student-t regressions (that is, a regression
with t-distributed errors, ideally with the degrees-of-freedom parameter
estimated from the data)? I can do it easily enough in Bugs, or I can
program the log-likelihood in R and optimize using optim(), but an R
version (if it's already been written by somebody) would be convenient,
especially for teaching purposes.
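A minimal sketch of the optim() route mentioned above (the function name and the simulated data are invented): maximize the scaled-t log-likelihood over the coefficients, log sigma, and log degrees of freedom.
treg_nll <- function(par, y, X) {
  k     <- ncol(X)
  beta  <- par[1:k]
  sigma <- exp(par[k + 1])     # log scale keeps sigma positive
  nu    <- exp(par[k + 2])     # log scale keeps the df positive
  -sum(dt((y - X %*% beta) / sigma, df = nu, log = TRUE) - log(sigma))
}
set.seed(1)
n <- 200
X <- cbind(1, rnorm(n))
y <- drop(X %*% c(1, 2)) + 2 * rt(n, df = 4)
fit <- optim(c(0, 0, 0, log(4)), treg_nll, y = y, X = X,
             method = "BFGS", hessian = TRUE)
fit$par[1:2]        # coefficient estimates
exp(fit$par[3:4])   # estimated sigma and degrees of freedom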
2006 May 02
2
evaluation of expressions
Hi, all. I'm trying to automate some regression operations in R but am
confused about how to evaluate expressions that are expressed as
character strings. For example:
y <- ifelse (rnorm(10)>0, 1, 0)
sex <- rnorm(10)
age <- rnorm(10)
test <- as.data.frame (cbind (y, sex, age))
# this works fine:
glm (y ~ sex + I(age^2), data=test, family=binomial(link="logit"),
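A hedged sketch of two common ways to do this, continuing the example above: build the formula from a character string with as.formula(), or evaluate a whole call held in a string with eval(parse(text = ...)).
rhs <- "sex + I(age^2)"                 # in practice assembled programmatically
f   <- as.formula(paste("y ~", rhs))
glm(f, data = test, family = binomial(link = "logit"))
# the same thing via a parsed string
eval(parse(text = paste("glm(y ~", rhs,
                        ", data = test, family = binomial(link = 'logit'))")))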
2006 Jan 08
1
lmer with nested/nonnested groupings?
I'm trying to figure out how to use lmer to fit models with factors that
have some nesting and some non-nested groupings. For example, in this
paper:
http://www.stat.columbia.edu/~gelman/research/published/parkgelmanbafumi.pdf
we have a logistic regression of survey respondents' political
preferences (1=Republican, 0=Democrat), regressing on sex, ethnicity,
state (51 states within 5
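A hedged sketch with simulated placeholder data (in current lme4 the binomial model goes through glmer()): non-nested grouping factors each get their own (1 | .) term, and state-within-region is expressed by including both factors.
library(lme4)
set.seed(1)
n <- 2000
d <- data.frame(sex       = rbinom(n, 1, 0.5),
                ethnicity = factor(sample(4, n, replace = TRUE)),
                state     = factor(sample(51, n, replace = TRUE)))
d$region  <- factor((as.integer(d$state) - 1) %% 5 + 1)   # toy nesting of states in 5 regions
state_eff <- rnorm(51, 0, 0.5)                            # invented state-level effects
d$rep <- rbinom(n, 1, plogis(-0.2 + 0.3 * d$sex + state_eff[as.integer(d$state)]))
fit <- glmer(rep ~ sex + (1 | ethnicity) + (1 | state) + (1 | region),
             data = d, family = binomial)
summary(fit)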
2006 Jan 10
1
another question about lmer, this time involving coef()
I'm having another problem with lmer(), this time something simpler (I
think) involving the coef() function for a model with varying
coefficients. Here's the R code. It's a simple model with 2
observations per group and 10 groups:
# set up the predictors
n.groups <- 10
n.reps <- 2
n <- n.groups*n.reps
group.id <- rep (1:n.groups, each=n.reps)
# simulate the varying
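A hedged completion of that kind of simulation (the numbers are invented), showing what coef() returns for a varying-intercept fit:
library(lme4)
set.seed(1)
n.groups <- 10
n.reps   <- 2
n        <- n.groups * n.reps
group.id <- rep(1:n.groups, each = n.reps)
a <- rnorm(n.groups, 0, 2)                 # true varying intercepts
y <- a[group.id] + rnorm(n, 0, 1)
M <- lmer(y ~ 1 + (1 | group.id))
coef(M)$group.id                           # one estimated intercept per group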
2006 Feb 10
1
mcmcsamp shortening variable names; how can I turn this feature off?
I have written a function called mcsamp() that is a wrapper that runs
mcmcsamp() and automatically monitors convergence and structures the
inferences into vectors and arrays as appropriate.
But I have run into a very small problem, which is that mcmcsamp()
shortens the variable names. For example:
> set.seed (1)
> group <- rep (1:5,10)
> a <- rnorm (5,-3,3)
> y <-
2006 Jan 28
1
yet another lmer question
I've been trying to keep up with lmer, and now I have a couple of
questions about the latest version of Matrix (0.995-4). I fit 2 very
similar models, and the results are severely rounded in one case and
not rounded at all in the other.
> y <- 1:10
> group <- rep (c(1,2), c(5,5))
> M1 <- lmer (y ~ 1 + (1 | group))
> coef(M1)
$group
(Intercept)
1 3.1
2
2008 Dec 20
2
Problems installing lme4 on Ubuntu
While I'm not an R expert, I have used R on Windows XP. Now I've moved
to Ubuntu (Intrepid), and I'm trying to configure R to work with Gelman
and Hill's _Data Analysis Using Regression and Multilevel/Hierarchical
Models_. So far, it's not working.
I start by following the instructions for installing arm and BRugs at
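For the R side of that setup, a minimal sketch (assuming the Ubuntu system already has what it needs to compile packages from source, e.g. the r-base-dev system package):
install.packages(c("arm", "lme4"), dependencies = TRUE)   # run inside R, pulls from CRAN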
2006 Jun 20
1
Bayesian logistic regression?
Hi all.
Are there any R functions around that do quick logistic regression with
a Gaussian prior distribution on the coefficients? I just want
posterior mode, not MCMC. (I'm using it as a step within an iterative
imputation algorithm.) This isn't hard to do: each step of a glm
iteration simply linearizes the derivative of the log-likelihood, and,
at this point, essentially no
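A minimal sketch of the posterior-mode idea with optim() (simulated data and an arbitrary prior scale, not the routine the post alludes to): maximize the binomial log-likelihood plus independent Gaussian log-priors on the coefficients.
set.seed(1)
n <- 100
x <- rnorm(n)
y <- rbinom(n, 1, plogis(0.5 + x))
X <- cbind(1, x)
neg_log_post <- function(beta, prior_sd = 2.5) {
  -(sum(dbinom(y, 1, plogis(drop(X %*% beta)), log = TRUE)) +
      sum(dnorm(beta, 0, prior_sd, log = TRUE)))
}
fit <- optim(c(0, 0), neg_log_post, method = "BFGS", hessian = TRUE)
fit$par                         # posterior mode of the coefficients
sqrt(diag(solve(fit$hessian)))  # normal-approximation standard errors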
2006 May 01
3
pulling items out of a lm() call
I want to write a function to standardize regression predictors, which
will require me to do some character-string manipulation to parse the
variables in a call to lm() or glm().
For example, consider the call
lm (y ~ female + I(age^2) + female:black + (age + education)*female).
I want to be able to parse this to pick out the input variables
("female", "age",