similar to: autologistic regression with Gibbs sampler

Displaying 20 results from an estimated 1000 matches similar to: "autologistic regression with Gibbs sampler"

2004 Jul 16
0
for loops in Gibbs sampler
Dear all: I am using R to do multiple imputation for a longitudinal data set. The Gibbs chain basically requires drawing from the posterior distribution of the model parameters, including the random effects. Multiple imputation requires several independent Gibbs chains, so my program structure is like: for (chain in 1:5) { # perform Gibbs sampling... for (row in 1:row.no) { b.row=some function # draw
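A minimal, self-contained sketch of that chain-within-chain structure, using a toy normal model (unknown mean and variance with standard conditionals) as a stand-in for the poster's imputation model:

## Several independent Gibbs chains, each alternating draws from full
## conditionals. The toy model (normal data, flat prior on mu,
## p(sigma2) proportional to 1/sigma2) stands in for the imputation model.
set.seed(1)
y       <- rnorm(50, mean = 2, sd = 1.5)   # toy data
n       <- length(y)
n.chain <- 5
n.iter  <- 1000
chains  <- vector("list", n.chain)
for (chain in 1:n.chain) {
  draws  <- matrix(NA_real_, n.iter, 2, dimnames = list(NULL, c("mu", "sigma2")))
  sigma2 <- var(y)                                        # starting value
  for (iter in 1:n.iter) {
    mu     <- rnorm(1, mean(y), sqrt(sigma2 / n))         # mu | sigma2, y
    sigma2 <- 1 / rgamma(1, n / 2, sum((y - mu)^2) / 2)   # sigma2 | mu, y
    draws[iter, ] <- c(mu, sigma2)
  }
  chains[[chain]] <- draws
}
sapply(chains, colMeans)   # the five chains should agree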
2011 Nov 10
1
Gibbs sampler
I have the following code: gibbs <- function(m, theta = 0.25, lambda = 0.55, n = 1) { alpha <- 1.5; beta <- 1.5; gamma <- 1.5; x <- array(0, c(m + 1, 3)); x[1, 1] <- theta; x[1, 2] <- lambda; x[1, 3] <- n; for (t in 2:(m + 1)) { x[t, 1] <- rbinom(1, x[t - 1, 3], x[t - 1, 1]); x[t, 2] <- rbeta(1, x[t - 1, 1] + alpha, x[t - 1, 3] - x[t - 1, 1] + beta); x[t, 3]
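The snippet is cut off before the update of x[t,3]. Rather than guess the missing third conditional, here is a self-contained sketch of the same alternating rbinom()/rbeta() pattern for the standard two-variable beta-binomial Gibbs sampler:

## Gibbs sampler for the pair x | y ~ Binomial(n, y),
## y | x ~ Beta(x + alpha, n - x + beta), mirroring the alternating
## rbinom()/rbeta() draws in the snippet above.
gibbs_bb <- function(m, n = 15, alpha = 1.5, beta = 1.5) {
  out <- matrix(NA_real_, m + 1, 2, dimnames = list(NULL, c("x", "y")))
  out[1, ] <- c(round(n / 2), 0.5)                 # starting values
  for (t in 2:(m + 1)) {
    out[t, "x"] <- rbinom(1, n, out[t - 1, "y"])
    out[t, "y"] <- rbeta(1, out[t, "x"] + alpha, n - out[t, "x"] + beta)
  }
  out
}
draws <- gibbs_bb(5000)
colMeans(draws[-(1:500), ])   # posterior means after burn-in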
2009 Jan 25
1
Gibbs sampler...did it work?
I am writing a Gibbs sampler. I think it is outputting some of what I want, in that I am getting a vector of several thousand values (but not 10,000) in a txt file at the end. My question is: is the error message (see below) telling me that it can't output 10,000 values (draws) because of a limitation in my memory, file size, shape, etc., or that there is an error in the sampler itself?
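One way to separate the two explanations is to keep the draws in a preallocated vector, write them out once after the loop, and check how many were actually filled; a minimal sketch (rnorm() stands in for one Gibbs draw):

## Preallocate, fill, write once; if fewer than n.draws values are
## non-NA, the sampler stopped early rather than the write failing.
n.draws <- 10000
draws   <- rep(NA_real_, n.draws)
for (i in 1:n.draws) {
  draws[i] <- rnorm(1)                  # stand-in for one Gibbs draw
}
sum(!is.na(draws))                      # how many draws were produced?
write(draws, file = "draws.txt", ncolumns = 1)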
2008 Dec 18
4
autologistic modelling in R
Hi, I have spatially autocorrelated data (with a binary response variable and continuous predictor variables). I believe I need to fit an autologistic model; does anyone know a method for doing this in R? Many thanks, C Bell
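One commonly used approach is to add an autocovariate (a distance-weighted average of neighbouring responses) to an ordinary logistic GLM. A sketch assuming the spdep package, with simulated stand-in data (coords, temp, presence are made-up names):

## Autologistic-style model: autocovariate from spdep::autocov_dist()
## added as an extra predictor in a binomial GLM.
library(spdep)
set.seed(1)
coords   <- cbind(runif(200), runif(200))          # toy point locations
temp     <- rnorm(200)                             # toy predictor
presence <- rbinom(200, 1, plogis(temp))           # toy binary response
## distance-weighted average of neighbouring responses within nbs units
ac  <- autocov_dist(presence, coords, nbs = 0.2, type = "inverse",
                    zero.policy = TRUE)
dat <- data.frame(presence = presence, temp = temp, ac = ac)
fit <- glm(presence ~ temp + ac, family = binomial, data = dat)
summary(fit)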
2006 Jun 26
1
Griddy-Gibbs sampler
Hey everyone, I have read the paper by Ritter and Tanner (1992) on the Griddy-Gibbs sampler and I am trying to implement it in R without much luck. I was wondering if anyone has used this or could point me to any example code. Thanks, Liz
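The core of a griddy-Gibbs update is short to sketch: evaluate the unnormalised full conditional on a grid, normalise it into a discrete approximation, and draw from that (Ritter and Tanner interpolate the inverse CDF; sampling a grid cell and jittering within it is a cruder stand-in):

## One griddy-Gibbs update for a single coordinate: discretise the
## (unnormalised) full conditional on a grid and draw from it.
griddy_draw <- function(cond_dens, lower, upper, n.grid = 200) {
  grid <- seq(lower, upper, length.out = n.grid)
  w    <- cond_dens(grid)            # unnormalised conditional on the grid
  w    <- w / sum(w)                 # normalise to cell probabilities
  cell <- sample(n.grid, 1, prob = w)
  grid[cell] + runif(1, -0.5, 0.5) * (upper - lower) / (n.grid - 1)
}
## toy conditional: normal(1, 0.5) restricted to [0, 3]
set.seed(2)
draws <- replicate(5000, griddy_draw(function(x) dnorm(x, 1, 0.5), 0, 3))
mean(draws); sd(draws)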
2017 Nov 15
1
Autologistic regression in R
Hi, I am new to autologistic regression and R. I have questions about starting a project in which I believe autologistic regression is needed. I have a point layer whose attribute table stores the values of the dependent variable and all the independent variables. I hope to fit an autologistic model to analyze which factors or combinations of factors have effects on the presence/absence of
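A sketch of the data-preparation side, assuming the point layer can be read with sf; the file name (points.shp) and column names (presence, elev, dist_road) are placeholders, and the autocovariate step follows the spdep approach mentioned in the related threads:

## Read a point layer, split coordinates from attributes, and fit a
## logistic model with an autocovariate. All names are placeholders.
library(sf)
library(spdep)
pts    <- st_read("points.shp")
coords <- st_coordinates(pts)          # X/Y matrix
dat    <- st_drop_geometry(pts)        # the attribute table
dat$ac <- autocov_dist(dat$presence, coords, nbs = 1000,
                       type = "inverse", zero.policy = TRUE)
fit <- glm(presence ~ elev + dist_road + ac, family = binomial, data = dat)
summary(fit)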
2006 Sep 30
1
autologistic model? - what package?
Dear all, Could you please advise me on the following? I need to use general(ized) linear models (binomial distribution + logit link function) to describe the preferred environment of each species (each sample is an individual in which I have measured several variables and also recorded the species it belongs to). However, I must account for the spatial autocorrelation between
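Besides the autocovariate-plus-GLM route discussed in the other threads, one sketch of an alternative that models the spatial autocorrelation directly is a binomial GLMM with a spatial correlation structure; the column names below are made up, and the constant grouping factor is the usual workaround for glmmPQL requiring a random term:

## Binomial model with an exponential spatial correlation structure via
## MASS::glmmPQL and nlme::corExp. 'pres', 'temp', 'x', 'y' are
## placeholder columns; 'grp' is a constant dummy grouping factor.
library(MASS)
library(nlme)
set.seed(3)
dat <- data.frame(x = runif(150), y = runif(150), temp = rnorm(150))
dat$pres <- rbinom(150, 1, plogis(dat$temp))
dat$grp  <- factor(1)
fit <- glmmPQL(pres ~ temp, random = ~ 1 | grp,
               family = binomial, data = dat,
               correlation = corExp(form = ~ x + y))
summary(fit)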
2009 Aug 17
1
Bayesian data analysis - help with sampler function
I have downloaded the Umacs (Universal Markov chain sampler) package and submitted the following sample code from Kerman and Gelman: s <- Sampler(J = 8, sigma.y = c(15, 10, 16, 11, 9, 11, 10, 18), y = c(28, 8, -3, 7, -1, 1, 18, 12), theta = Gibbs(theta.update, theta.init), V = Gibbs(V.update, mu.init), mu = Gibbs(mu.update, mu.init), tau = Gibbs(tau.update, tau.init),
2008 Mar 26
0
Naive Gibbs Sampling with Metropolis Steps (pkg: gibbs.met)
Hi R users: This package provides two generic functions for performing Markov chain sampling, in a naive way, for a user-defined target distribution involving only continuous variables. The function "gibbs_met" performs Gibbs sampling, with each 1-dimensional conditional distribution sampled by a Metropolis update using a Gaussian proposal centered at the previous state. The function
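The idea itself is easy to sketch in plain R (this is a generic illustration, not the package's own interface): update each coordinate in turn with a random-walk Metropolis step whose Gaussian proposal is centered at the current value.

## Generic Metropolis-within-Gibbs with Gaussian random-walk proposals.
## log_target() below is a toy unnormalised bivariate normal (rho = 0.8).
met_within_gibbs <- function(log_target, x0, n.iter, sd.prop = 1) {
  d   <- length(x0)
  out <- matrix(NA_real_, n.iter, d)
  x   <- x0
  for (it in 1:n.iter) {
    for (j in 1:d) {
      prop    <- x
      prop[j] <- rnorm(1, x[j], sd.prop)           # Gaussian proposal
      if (log(runif(1)) < log_target(prop) - log_target(x)) x <- prop
    }
    out[it, ] <- x
  }
  out
}
log_target <- function(x)
  -0.5 * (x[1]^2 - 1.6 * x[1] * x[2] + x[2]^2) / (1 - 0.8^2)
draws <- met_within_gibbs(log_target, c(0, 0), 5000)
cor(draws[, 1], draws[, 2])   # should be near 0.8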
2007 Dec 04
1
Metropolis-Hastings within Gibbs coding error
Dear list, after running for a while my program crashes and gives the following error message; can anybody suggest how to deal with this? Error in if (ratio0[i] < log(runif(1))) { : missing value where TRUE/FALSE needed ################### original program ######## p2 <- function (Nsim = 1000){ x <- c(0.301, 0, -0.301, -0.602, -0.903, -1.208, -1.309, -1.807, -2.108, -2.71) # logdose
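That error means ratio0[i] was NA or NaN when the if() condition was evaluated, typically because the log-posterior at a proposed value returned NaN or Inf - Inf. A common guard (a generic sketch, not the poster's code) is to treat a non-finite log acceptance ratio as an automatic rejection:

## Random-walk Metropolis with a guard against NA/NaN log-ratios:
## a non-finite ratio is treated as -Inf, i.e. the proposal is rejected.
logpost <- function(theta) dnorm(theta, log = TRUE)   # toy log posterior
theta_curr   <- 0
logpost_curr <- logpost(theta_curr)
for (it in 1:1000) {
  theta_prop   <- rnorm(1, theta_curr, 1)
  logpost_prop <- logpost(theta_prop)
  log_ratio    <- logpost_prop - logpost_curr
  if (!is.finite(log_ratio)) log_ratio <- -Inf        # NA/NaN => reject
  if (log(runif(1)) < log_ratio) {
    theta_curr   <- theta_prop
    logpost_curr <- logpost_prop
  }
}
theta_curr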
2000 Dec 15
0
Gibbs sampling in GLMMs: Beta testers required
Sort of a warning before I start: This post may be considered to describe a rather amateurish approach to distributing software which may annoy some people, but I sincerely hope it doesn't. I've been working for some years with David Clayton on a project which started life as an S package but has now turned into an R library. It is (now) called GLMMGibbs and estimates the parameters of
2004 Nov 18
1
gibbs sampling for mixture of normals
Hi, I'm looking for a Gibbs sampling algorithm in R for a mixture of K normals, and in particular for the case of bivariate normals. I'd be grateful if anyone could send their own R routine, at least for the univariate case. Thank you in advance, Matteo
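A minimal univariate sketch for two components with known component variance (the data below are simulated; a bivariate or K-component version replaces the univariate mean updates with draws from the corresponding multivariate conditionals):

## Gibbs sampler for a two-component normal mixture (sd = 1 known),
## N(0, tau2) priors on the means, Beta(1,1) prior on the weight, and
## latent allocations z drawn each iteration (data augmentation).
set.seed(4)
y <- c(rnorm(100, -2), rnorm(150, 2))          # simulated data
n <- length(y); n.iter <- 2000; tau2 <- 100
mu <- c(-1, 1); p <- 0.5
out <- matrix(NA_real_, n.iter, 3, dimnames = list(NULL, c("mu1", "mu2", "p")))
for (it in 1:n.iter) {
  ## 1. allocations z | mu, p
  w1 <- p * dnorm(y, mu[1]); w2 <- (1 - p) * dnorm(y, mu[2])
  z  <- rbinom(n, 1, w2 / (w1 + w2)) + 1       # component label 1 or 2
  ## 2. component means mu_k | z, y
  for (k in 1:2) {
    v     <- 1 / (sum(z == k) + 1 / tau2)
    mu[k] <- rnorm(1, v * sum(y[z == k]), sqrt(v))
  }
  ## 3. mixing weight p | z
  p <- rbeta(1, sum(z == 1) + 1, sum(z == 2) + 1)
  out[it, ] <- c(mu, p)
}
colMeans(out[-(1:500), ])   # posterior means after burn-in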
2013 Feb 02
1
repeating autocovariate functions
Hi there, Just wondering why my post was rejected? cheers, Rachel Subject: repeating autocovariate functions From: r-help-owner@r-project.org To: moyble@hotmail.com Date: Sat, 2 Feb 2013 02:56:27 +0100 Message rejected by filter rule match --Forwarded Message Attachment-- Date: Fri, 1 Feb 2013 17:56:14 -0800 From: moyble@hotmail.com To: r-help@r-project.org Subject: repeating autocovariate
2011 Apr 05
1
Gibbs sampling
An embedded and charset-unspecified text was scrubbed... Name: not available URL: <https://stat.ethz.ch/pipermail/r-help/attachments/20110405/06486a8a/attachment.pl>
2010 Mar 16
0
tmvtnorm: version 1.0-2
Dear R users, the tmvtnorm package, the package for the truncated multivariate normal and Student-t distributions, has been updated on CRAN. The major changes in version 1.0-2 (2010-03-04) are: * The package now provides methods for the truncated multivariate Student-t distribution, i.e. random number generation, density and distribution functions such as rtmvt(), dtmvt() and ptmvt(), and
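For example (a sketch based on my reading of the announcement; check ?rtmvt in the installed package for the exact argument names):

## Draws from, and the density of, a truncated multivariate t with
## tmvtnorm; argument names follow the package help and may differ
## slightly between versions.
library(tmvtnorm)
sigma <- matrix(c(1, 0.5,
                  0.5, 1), 2, 2)
lower <- c(-1, -Inf)
upper <- c(Inf, 2)
x <- rtmvt(n = 500, mean = c(0, 0), sigma = sigma, df = 4,
           lower = lower, upper = upper)       # random draws
head(x)
dtmvt(c(0, 0), mean = c(0, 0), sigma = sigma, df = 4,
      lower = lower, upper = upper)            # density at a point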
2005 Jul 19
1
initial points for arms in package HI
Dear R-users, I have a problem choosing initial points for the function arms() in the package HI. I intend to implement a Gibbs sampler and one of my conditional distributions is nonstandard and not log-concave; therefore I'd like to use arms. But there seems to be a strong influence of the initial point y.start. To show the effect I constructed a demonstration example. It is reproducible
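For reference, a minimal arms() call (modelled on the HI documentation; the log-density and support here are toy choices, not the poster's conditional):

## Sample from a standard normal restricted to (-10, 10) with HI::arms.
## Arguments: starting point, log-density (up to a constant), support
## indicator, and number of samples.
library(HI)
draws <- arms(y.start = 0,
              myldens = function(x, ...) -x^2 / 2,
              indFunc = function(x, ...) (x > -10) * (x < 10),
              n.sample = 1000)
mean(draws); sd(draws)   # roughly 0 and 1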
2006 Aug 11
2
about MCMC pack again...
Hello, thank you very much for your previous answers about the C++ code. I am interested in the application of the Gibbs sampler in IRT models, so in the functions MCMCirt1d and MCMCirtkd. I've found the C++ source code, as you suggested, but I cannot find anything about the Gibbs sampler: all the files are for the Metropolis algorithm. Maybe I am not able to read them very well, by the
2008 Nov 01
2
sampling from Laplace-Normal
Hi, I have to draw samples from an asymmetric-Laplace-Normal distribution: f(u|y, x, beta, phi, sigma, tau) \propto exp( - sum( ( abs(lo) + (2*tau-1)*lo )/(2*sigma) ) - 0.5/phi*u^2), where lo = (y - x*beta) and y=(y_1, ..., y_n), x=(x_1, ..., x_n) -- sorry for this huge formula -- A WinBUGS Gibbs sampler and the HI package arms sampler were used with the same initial data for all parameters. I
2012 Aug 30
2
Which BUGS should one use?
Hello all! Some time ago I started to learn and play with Bayesian methods. Many advise using WinBUGS (Bayesian inference Using Gibbs Sampling). However, WinBUGS is discontinued, and development now continues under OpenBUGS. I wasn't lazy, so I installed both and tried them out. In more than 90% of cases they give comparable results, but in a few cases I got substantial differences. Recently, I read a nice