Displaying 20 results from an estimated 500 matches similar to: "Problems installing lme4 on Ubuntu"
2008 Jan 31
1
R2WinBUGS is broken
Dear R-users,
I am trying to use the following code to reproduce results from Prof.
Gelman's book, but I get the error listed below with the R2WinBUGS version
(the OpenBUGS version works fine). I am using R-2.6.1 on Windows XP, and all
the R packages are the most current versions. schools.bug can be found at
http://www.stat.columbia.edu/~gelman/bugsR/runningbugs.html . Can anyone
help me to figure out what's
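For reference, a minimal sketch (not the poster's code) of the usual R2WinBUGS call for the eight-schools example; the parameter names assume the schools.bug model from the URL above, and the WinBUGS install path and iteration counts are assumptions:
library(R2WinBUGS)
J <- 8
y <- c(28, 8, -3, 7, -1, 1, 18, 12)          # school effect estimates
sigma.y <- c(15, 10, 16, 11, 9, 11, 10, 18)  # their standard errors
data <- list("J", "y", "sigma.y")
inits <- function() list(theta = rnorm(J, 0, 100), mu.theta = rnorm(1, 0, 100),
                         sigma.theta = runif(1, 0, 100))
parameters <- c("theta", "mu.theta", "sigma.theta")
schools.sim <- bugs(data, inits, parameters, model.file = "schools.bug",
                    n.chains = 3, n.iter = 1000,
                    bugs.directory = "c:/Program Files/WinBUGS14/")  # assumed path
print(schools.sim)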
2005 May 16
1
A question about bugs.R: functions for running WinBUGS from R
Dear R users,
I've found bugs.R, the functions for running WinBUGS from R that were
written by Dr. Andrew Gelman, a professor at Columbia University.
bugs.R would be very useful for me, and I think many of you know it
as well. I followed the instructions on Dr. Gelman's web page to install all
of the documents that bugs.R needs, but when I try to run the schools example
posted on the web in
2006 Jan 16
3
Current state of support for BUGS access for Linux users?
Greetings:
I'm going to encourage some students to try Bayesian ideas for
hierarchical models.
I want to run the WinBUGS and R examples in Tony Lancaster's An
Introduction to Modern Bayesian Econometrics. That features MS
Windows and "bugs" from R2WinBUGS.
Today, I want to ask how people are doing this in Linux. I have found
a plethora of possibilities, some of which are not
2007 Aug 20
1
rv package, rvnorm function
In an attempt to learn to use the rv package, I have been working
through the examples in Jouni Kerman and Andrew Gelman's "Using Random
Variables to Manipulate and Summarize Simulations in R" (July 4, 2007).
I am using a Dell Precision 380n computer running Gentoo Linux and R
2.2.1 (the latest available through Gentoo's portage/emerge system).
Everything worked well until I
2009 Aug 17
1
Bayesian data analysis - help with sampler function
I have downloaded the Umacs package (Universal Markov chain sampler) and run the following sample code from Kerman and Gelman.
s <- Sampler(
  J = 8,
  sigma.y = c(15, 10, 16, 11, 9, 11, 10, 18),
  y = c(28, 8, -3, 7, -1, 1, 18, 12),
  theta = Gibbs(theta.update, theta.init),
  V = Gibbs(V.update, mu.init),
  mu = Gibbs(mu.update, mu.init),
  tau = Gibbs(tau.update, tau.init),
2003 Apr 18
1
MCMCpack gelman.plot and gelman.diag
Hi,
A question: when I run gelman.diag and gelman.plot
with mcmc lists obtained from MCMCregress, the results are as follows.
> post.R <- MCMCregress(Size~Age+Status, data = data, burnin = 5000, mcmc = 100000,
+ thin = 10, verbose = FALSE, beta.start = NA, sigma2.start = NA,
+ b0 = 0, B0 = 0, nu = 0.001, delta = 0.001)
> post1.R <- MCMCregress(Size~Age+Status, data
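For reference, a rough sketch (not the poster's code) of how two MCMCregress chains are usually combined for these diagnostics; prior arguments are left at their defaults here, and the second chain's starting values are deliberately shifted so the shrink factor is meaningful:
library(MCMCpack)   # also loads coda
post1 <- MCMCregress(Size ~ Age + Status, data = data, burnin = 5000,
                     mcmc = 100000, thin = 10, seed = 1)
post2 <- MCMCregress(Size ~ Age + Status, data = data, burnin = 5000,
                     mcmc = 100000, thin = 10, seed = 2, beta.start = 10)
chains <- mcmc.list(post1, post2)
gelman.diag(chains)    # potential scale reduction factors; values near 1 suggest convergence
gelman.plot(chains)    # how the shrink factor evolves over iterations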
2006 May 02
2
evaluation of expressions
Hi, all. I'm trying to automate some regression operations in R but am
confused about how to evaluate expressions that are expressed as
character strings. For example:
y <- ifelse (rnorm(10)>0, 1, 0)
sex <- rnorm(10)
age <- rnorm(10)
test <- as.data.frame (cbind (y, sex, age))
# this works fine:
glm (y ~ sex + I(age^2), data=test, family=binomial(link="logit"),
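For reference, a short sketch of two common ways to go from character strings to a fitted model, using the test data frame built above (the predictor strings are just illustrative):
rhs <- c("sex", "I(age^2)")
fmla <- as.formula(paste("y ~", paste(rhs, collapse = " + ")))
fit <- glm(fmla, data = test, family = binomial(link = "logit"))
# or evaluate a complete call held in a single string:
fit2 <- eval(parse(text =
  'glm(y ~ sex + I(age^2), data = test, family = binomial(link = "logit"))'))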
2007 Feb 11
2
problem with Matrix package
I decided to update my packages and then had a problem with loading the
Matrix package
http://cran.at.r-project.org/bin/windows/contrib/2.4/Matrix_0.9975-9.zip
This is what happened when I tried to load it in:
> library("Matrix")
Error in importIntoEnv(impenv, impnames, ns, impvars) :
object 'Logic' is not exported by 'namespace:methods'
Error:
2006 May 01
3
pulling items out of a lm() call
I want to write a function to standardize regression predictors, which
will require me to do some character-string manipulation to parse the
variables in a call to lm() or glm().
For example, consider the call
lm (y ~ female + I(age^2) + female:black + (age + education)*female).
I want to be able to parse this to pick out the input variables
("female", "age",
2011 Feb 24
2
MCMCpack combining chains
Dear all, as MCMClogit does not allow for the specification of several chains, I have run my model 3 times with different random number seeds and differently dispersed multivariate normal priors.
For example:
res1 = MCMClogit(y~x,b0=0,B0=0.001,data=mydat, burnin=500, mcmc=5500, seed=1234, thin=5)
res2 = MCMClogit(y~x,b0=1,B0=0.01,data=mydat, burnin=500, mcmc=5500, seed=5678, thin=5)
res3 =
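For reference, a sketch of combining the three runs with coda so the usual multi-chain diagnostics apply; note that for gelman.diag the chains should ideally differ only in their starting values, not in their priors:
library(coda)
chains <- mcmc.list(res1, res2, res3)
summary(chains)       # pooled posterior summaries
gelman.diag(chains)   # potential scale reduction factors
plot(chains)          # trace and density plots per parameter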
2006 Feb 01
1
student-t regression in R?
Is there a quick way to fit a Student-t regression (that is, a regression
with t-distributed errors, ideally with the degrees-of-freedom parameter
estimated from the data)? I can do it easily enough in Bugs, or I can
program the log-likelihood in R and optimize using optim(), but an R
version (if it's already been written by somebody) would be convenient,
especially for teaching purposes.
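For reference, a sketch of the direct optim() route mentioned, fitting intercept, slope, log-scale, and log-degrees-of-freedom by maximum likelihood (x and y are placeholder data, not from the post):
# negative log-likelihood for y = a + b*x + t-distributed errors
treg.nll <- function(par, x, y) {
  a <- par[1]; b <- par[2]
  sigma <- exp(par[3]); nu <- exp(par[4])
  -sum(dt((y - a - b * x) / sigma, df = nu, log = TRUE) - log(sigma))
}
fit <- optim(c(0, 0, 0, log(5)), treg.nll, x = x, y = y, hessian = TRUE)
exp(fit$par[3:4])   # estimated scale and degrees of freedom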
2006 Jan 10
2
lmer(): nested and non-nested factors in logistic regression
Thanks to some help by Doug Bates (and the updated version of the Matrix
package), I've refined my question about fitting nested and non-nested
factors in lmer(). I can get it to work in linear regression but it
crashes in logistic regression. Here's my example:
# set up the predictors
n.age <- 4
n.edu <- 4
n.rep <- 100
n.state <- 50
n <- n.age*n.edu*n.rep
age.id
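For reference, a rough sketch (not the poster's code) of the kind of non-nested, crossed logistic fit being described, written for the current lme4 interface where binomial models go through glmer(); the simulated effects are assumptions:
library(lme4)
n.age <- 4; n.edu <- 4; n.rep <- 100
n <- n.age * n.edu * n.rep
age.id <- rep(1:n.age, each = n.edu * n.rep)
edu.id <- rep(rep(1:n.edu, each = n.rep), n.age)
a <- rnorm(n.age, 0, 1); b <- rnorm(n.edu, 0, 1)     # assumed group effects
y <- rbinom(n, 1, plogis(a[age.id] + b[edu.id]))
fit <- glmer(y ~ 1 + (1 | age.id) + (1 | edu.id), family = binomial(link = "logit"))
coef(fit)   # per-group intercepts for both crossed factors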
2006 May 20
5
Can lmer() fit a multilevel model embedded in a regression?
I would like to fit a hierarchical regression model from Witte et al.
(1994; see reference below). It's a logistic regression of a health
outcome on quantities of food intake; the linear predictor has the form,
X*beta + W*gamma,
where X is a matrix of consumption of 82 foods (i.e., the rows of X
represent people in the study, the columns represent different foods,
and X_ij is the amount of
2003 Aug 10
3
Support for Bayesian statistics in R
I'm just starting to learn to use R, and although I'm seeing lots of
functions aimed at doing orthodox statistical analyses, I don't see the
same for Bayesian analyses. What support does R have for Bayesian
statistics?
2007 Dec 03
1
difficulties getting coef() to work in some lmer() calls
I'm working with Andrew Gelman on a book project and we're having some
difficulties getting coef() to work in some lmer() calls.
Some versions of the model work and some do not. For example, this works
(in that we can run the model and do coef() from the output):
R2 <- lmer(y2 ~ factor(z.inc) + z.st.inc.full + z.st.rel.full + (1 + factor(
z.inc) | st.num),
2005 May 01
2
eigen() may fail for some symmetric matrices, affects mvrnorm()
Hi all,
Recently our statistics students noticed that their Gibbs samplers were
crashing due to some NaNs in some parameters. The NaNs came from
mvrnorm (Ripley & Venables' MASS package multivariate normal sampling
function) and with some more investigation it turned out that they were
generated by function eigen, the eigenvalue computing function. The
problem did not seem to happen
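For reference, a sketch of the usual workarounds: force eigen()'s symmetric routine, which always returns real, ordered eigenvalues, and lean on mvrnorm()'s tol argument, which accepts eigenvalues that are only slightly negative due to rounding (Sigma here is a stand-in, not the students' matrix):
library(MASS)
A <- matrix(rnorm(9), 3, 3)
Sigma <- crossprod(A)                        # symmetric, positive semi-definite
eigen(Sigma, symmetric = TRUE)$values        # symmetric routine: real, sorted eigenvalues
draws <- mvrnorm(100, mu = rep(0, 3), Sigma = Sigma, tol = 1e-6)
colMeans(draws)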
2006 Jan 10
1
another question about lmer, this time involving coef()
I'm having another problem with lmer(), this time something simpler (I
think) involving the coef() function for a model with varying
coefficients. Here's the R code. It's a simple model with 2
observations per group and 10 groups:
# set up the predictors
n.groups <- 10
n.reps <- 2
n <- n.groups*n.reps
group.id <- rep (1:n.groups, each=n.reps)
# simulate the varying
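For reference, a sketch (with assumed parameter values, not the poster's) of completing this kind of simulation and inspecting the varying intercepts with coef():
n.groups <- 10
n.reps <- 2
n <- n.groups * n.reps
group.id <- rep(1:n.groups, each = n.reps)
a <- rnorm(n.groups, 0, 2)            # assumed group-level intercepts
y <- a[group.id] + rnorm(n, 0, 1)
library(lme4)
M <- lmer(y ~ 1 + (1 | group.id))
coef(M)$group.id                      # estimated intercept for each group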
2008 Oct 01
3
Change color of plot points based on values of a variable
Dear R users:
I have run a logistic regression, used Gelman et al.'s arm package to simulate the parameter estimates of that model, and plotted the probability (using the invlogit() function from the same package) of the dependent variable being 1 given that the value of a particular independent variable is at its mean. The plot has probabilities on the y-axis and the number (1-1000) of the
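For reference, a sketch of the usual base-graphics approach to coloring points by the value of another variable (the object names are placeholders, not the poster's):
x <- seq_len(1000)
p <- plogis(rnorm(1000))                       # stand-in for the simulated probabilities
plot(x, p, col = ifelse(p > 0.5, "red", "blue"), pch = 19,
     xlab = "simulation", ylab = "Pr(y = 1)")  # two-color coding by a threshold
# or map a continuous variable onto a palette:
cols <- heat.colors(10)[cut(p, 10)]
plot(x, p, col = cols, pch = 19)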
2006 Feb 10
1
mcmcsamp shortening variable names; how can I turn this feature off?
I have written a function called mcsamp() that is a wrapper that runs
mcmcsamp() and automatically monitors convergence and structures the
inferences into vectors and arrays as appropriate.
But I have run into a small problem, which is that mcmcsamp()
shortens the variable names. For example:
> set.seed (1)
> group <- rep (1:5,10)
> a <- rnorm (5,-3,3)
> y <-
2006 Jan 28
1
yet another lmer question
I've been trying to keep up with lmer, and now I have a couple of
questions about the latest version of Matrix (0.995-4). I fit 2 very
similar models, and the results are severely rounded in one case and
not rounded at all in the other.
> y <- 1:10
> group <- rep (c(1,2), c(5,5))
> M1 <- lmer (y ~ 1 + (1 | group))
> coef(M1)
$group
(Intercept)
1 3.1
2