2008 Jun 26
0
geoR : Passing arguments to "optim" when using "likfit"
Mzabalazo Ngwenya wrote:
> Hi everyone !
>
> I am trying to fit a kriging model to a set of data. When I just run
> the "likfit" command I can obtain the results. However, when I try to
> pass additional arguments to the optimization function "optim" I get
> errors. That is, I want to obtain the Hessian matrix (hessian=TRUE).
>
2011 Sep 27
2
Error in optim function.
I'm trying to calculate the maximum likelihood estimate for a binomial
distribution. Here is my code:
y <- c(2, 4, 2, 4, 5, 3)
n <- length(y)
binomial.ll <- function(pi, y, n) {   ## define log-likelihood
  output <- y * log(pi) + (n - y) * log(1 - pi)
  return(output)
}
binomial.mle <- optim(0.01,           ## starting value
                      binomial.ll,
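For illustration, a hedged sketch of one way this is commonly made to work (not the poster's exact model): optim() needs a single scalar and minimises by default, so the function should return the summed *negative* log-likelihood; here each y[i] is treated as the number of successes out of an assumed m = 10 trials (m is my assumption, not stated in the post).
y <- c(2, 4, 2, 4, 5, 3)
m <- 10
binomial.nll <- function(p, y, m) -sum(dbinom(y, size = m, prob = p, log = TRUE))
fit <- optim(par = 0.3, fn = binomial.nll, y = y, m = m,
             method = "L-BFGS-B", lower = 1e-6, upper = 1 - 1e-6,
             hessian = TRUE)
fit$par                        # MLE of the success probability
sqrt(diag(solve(fit$hessian))) # approximate standard error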
2019 Feb 19
1
mle (stats4) crashing due to singular Hessian in covariance matrix calculation
Hi, R developers.
When running mle inside a loop I found some nasty behavior. From time to
time, my model had a degenerate minimum and the loop just crashed. I
tracked it down to the line "vcov <- if (length(coef)) solve(oout$hessian)",
the Hessian being singular.
Note that the minimum reached was good; it just did not make sense to
calculate the covariance matrix as the inverse of a
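A hedged sketch of one way to keep such a loop alive (nll and start are placeholders for the user's negative log-likelihood and starting values): catch the error thrown by mle()'s internal solve(hessian) and fall back to a plain optim() fit without a covariance matrix.
library(stats4)
safe.mle <- function(nll, start) {
  tryCatch(mle(nll, start = start),
           error = function(e) {
             message("mle() failed: ", conditionMessage(e), "; falling back to optim()")
             optim(unlist(start), function(p) do.call(nll, as.list(p)))
           })
}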
2008 Jun 26
0
geoR : Passing arguments to "optim" when using "likfit"
Hi everyone !
I am trying to fit a kriging model to a set of data. When I just run the "likfit" command I can obtain the results. However, when I try to pass additional arguments to the optimization function "optim" I get errors. That is, I want to obtain the Hessian matrix (hessian=TRUE).
Here's a little example (1-D). Can anyone shed some light? Where am I
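For reference, a hedged 1-D sketch of the underlying optim() behaviour being asked about (whether and how likfit() forwards such arguments to optim() is a geoR-specific question not settled here):
set.seed(1)
x   <- rnorm(50, mean = 2)
nll <- function(mu) -sum(dnorm(x, mean = mu, sd = 1, log = TRUE))
fit <- optim(par = 0, fn = nll, method = "BFGS", hessian = TRUE)
fit$par        # estimate of mu
fit$hessian    # observed information; solve(fit$hessian) approximates the variance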
2006 Mar 21
1
Hessian from optim()
Hello!
Looking at how people use optim to get MLEs, I also noticed that one can
use the returned Hessian to get the corresponding standard errors, i.e. something
like
result <- optim(<< snip >>, hessian=T)
result$par # point estimates
vc <- solve(result$hessian) # var-cov matrix
se <- sqrt(diag(vc)) # standard errors
What is the Hessian actually representing here?
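A minimal, self-contained sketch of the pattern above, assuming the objective handed to optim() is a *negative* log-likelihood (so the Hessian at the minimum approximates the observed information matrix):
set.seed(1)
x      <- rnorm(100, mean = 5, sd = 2)
negll  <- function(p) -sum(dnorm(x, mean = p[1], sd = exp(p[2]), log = TRUE))
result <- optim(c(0, 0), negll, method = "BFGS", hessian = TRUE)
result$par                    # point estimates: mean and log(sd)
vc <- solve(result$hessian)   # approximate var-cov matrix
se <- sqrt(diag(vc))          # approximate standard errors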
2007 Aug 13
1
[Fwd: behavior of L-BFGS-B with trivial function triggers bug in stats4::mle]
I sent this in first on 30 July. Now that UseR! is over I'm trying again
(slightly extended version from last time).
With R 2.5.1 or R 2.6.0 (2007-08-04 r42421)
"L-BFGS-B" behaves differently from all of the
other optim() methods, which return the value of the function
when they are given a trivial function (i.e., one with no
variable arguments) to optimize. This is not
a bug in
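A hedged reproduction sketch (the report is against R 2.5.1/2.6.0, so current R may behave differently): compare what each optim() method returns for a constant objective.
f <- function(p) 42    # trivial objective, does not depend on p
sapply(c("Nelder-Mead", "BFGS", "CG", "L-BFGS-B", "SANN"),
       function(m) optim(par = 1, fn = f, method = m)$value)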
2004 Apr 02
0
Hessian in constrOptim
Dear R-users,
In the function constrOptim there is an option to get an approximation
to the Hessian of the surrogate function R at the MLE by declaring
hessian=TRUE in the calls to the function optim. I would like to ask
whether it is advisable to get an approximate Hessian for the function f as
follows:
f''(theta)=R''(theta|theta_k)-B''(theta)
where
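A hedged alternative sketch, assuming the constrained optimum lies in the interior of the feasible region: evaluate a numerical Hessian of f itself (not of the surrogate R) at the constrOptim() solution with optimHess().
f   <- function(x) (x[1] - 1)^2 + (x[2] - 2)^2
fit <- constrOptim(theta = c(0.5, 0.5), f = f, grad = NULL,
                   ui = diag(2), ci = c(0, 0))   # constraints x1 >= 0, x2 >= 0
H   <- optimHess(fit$par, f)                     # numerical Hessian of f at the optimum
H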
2007 Jul 02
2
how to use mle with a defined function
Hi all,
I am trying to use mle() with a self-defined function. Here is my
function:
test <- function(a = 0.1, b = 0.1, c = 0.001, e = 0.2){
# omega is the known covariance matrix, Y is the response vector, X is the
# explanatory matrix
odet = unlist(determinant(omega))[1]
# do cholesky decomposition
C = chol(omega)
# transform data
U = t(C) %*% Y
W = t(C) %*% X
beta = lm(U ~ W)$coef
Z = Y - X %*% beta
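A hedged, generic sketch of the stats4::mle pattern being asked about: minuslogl takes the parameters as separate named arguments, returns the *negative* log-likelihood, and picks the data up from the enclosing environment (here just x; omega, Y and X in the poster's case).
library(stats4)
set.seed(1)
x   <- rnorm(200, mean = 1, sd = 2)
nll <- function(mu = 0, logsigma = 0)
  -sum(dnorm(x, mean = mu, sd = exp(logsigma), log = TRUE))
fit <- mle(nll, start = list(mu = 0, logsigma = 0))
summary(fit)   # estimates and standard errors from the inverse Hessian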
2011 Dec 17
0
time-varying parameters kalman filter estimation problem using FKF package
Dear R users,
I am trying to carry out MLE of the time-varying CAPM using the FKF package.
My approach so far has been to try and adapt the example given in the help
file found using ?fkf which demonstrates the MLE of an ARMA(2,1) model.
When I attempt to run my R code (given below) I get the following error:
Error in fkf(a0 = sp$a0, P0 = sp$P0, dt = sp$dt, ct = sp$ct, Tt = sp$Tt, :
Some of
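A hedged debugging sketch: errors like the one above usually come from mismatched dimensions or non-double storage modes among the fkf() arguments. This helper only prints the dimensions of the system matrices in the poster's list sp (Zt, HHt, GGt and yt are assumed to live there too).
check.fkf.inputs <- function(sp) {
  for (nm in c("a0", "P0", "dt", "ct", "Tt", "Zt", "HHt", "GGt", "yt")) {
    x <- sp[[nm]]
    d <- if (is.null(dim(x))) length(x) else dim(x)
    cat(nm, ": ", paste(d, collapse = " x "), " (", storage.mode(x), ")\n", sep = "")
  }
}
# check.fkf.inputs(sp)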
2007 Apr 09
1
R:Maximum likelihood estimation using BHHH and BFGS
Dear R users,
I am new to R. I would like to find *maximum likelihood estimators for psi
and alpha* based on the following *log likelihood function*, where c is
consumption data comprising 148 entries:
fn<-function(c,psi,alpha)
{
s1<-sum(for(i in 1:n){(c[i]-(psi^(-1/alpha)*(lag(c[i],-1))))^2*
(lag(c[i],-1)^((-2)*(alpha+1))
)});
s2<- sum(for(m in 1:n){log(lag(c[m],-1)^(((2)*alpha)+2))});
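A hedged sketch of the mechanical issues above: for() returns NULL, so sum(for(...)) is always 0, and lag() does not shift a plain numeric vector. Below, the two sums are vectorised with explicit indexing (c[i-1] as the lag); how they enter the full log-likelihood is left as in the original post.
s1s2 <- function(psi, alpha, cons) {   # cons = the consumption series (the post's c)
  cl <- cons[-length(cons)]            # c[i-1]
  cc <- cons[-1]                       # c[i]
  s1 <- sum((cc - psi^(-1/alpha) * cl)^2 * cl^(-2 * (alpha + 1)))
  s2 <- sum(log(cl^(2 * alpha + 2)))
  c(s1 = s1, s2 = s2)
}
# The resulting scalar (negative) log-likelihood can then go to
# optim(..., method = "BFGS", hessian = TRUE); the maxLik package also
# provides maxLik(..., method = "BHHH") for a BHHH optimiser.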
2007 Jul 13
1
Correlation matrix
I have a model with 5 parameters that I am optimising, and the (best)
value of the objective function is negative. I would like to use the
Hessian matrix (from genoud and/or optim functions) to construct the
covariance and correlation matrices.
This is the code that I am using:
est <- out$par # Parameter estimates
H <- out$hessian # Hessian
V <-
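A minimal sketch of the remaining steps, continuing from H above. The sign of the objective's *value* does not matter; what matters is that a negative log-likelihood was minimised (if a log-likelihood was maximised instead, negate the Hessian before inverting).
V  <- solve(H)          # covariance matrix: inverse of the Hessian
se <- sqrt(diag(V))     # standard errors
R  <- cov2cor(V)        # correlation matrix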
2011 May 17
1
Problem with MLE
Hi there,
I am trying to run the following code:
> dcOU<-function(x,t,x0,theta,log=FALSE){
+ Ex<-theta[1]/theta[2]+(x0-theta[1]/theta[2])*exp(-theta[2]*t)
+ Vx<-theta[3]^2*(1-exp(-2*theta[2]*t))/(2*theta[2])
+ dnorm(x,mean=Ex,sd=sqrt(Vx),log=log)
+ }
> OU.lik<-function(theta1,theta2,theta3){
+ n<-length(X)
+ dt<-deltat(X)
+
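A hedged sketch of the standard way a dcOU-based transition-density likelihood is combined with stats4::mle (the truncated body above presumably does something similar; X is the observed series, dcOU as defined above, and the start values below are illustrative only):
OU.lik <- function(theta1, theta2, theta3) {
  n  <- length(X)
  dt <- deltat(X)
  -sum(dcOU(X[2:n], dt, X[1:(n - 1)], c(theta1, theta2, theta3), log = TRUE))
}
library(stats4)
# fit <- mle(OU.lik, start = list(theta1 = 1, theta2 = 0.5, theta3 = 0.2),
#            method = "L-BFGS-B", lower = c(-Inf, 1e-5, 1e-5))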
2011 Nov 30
1
How can I pick a matrix from a function? (Outer Product of Gradients)
Hi all,
I would like to use optim() to estimate the equation using the log-likelihood
function and gradient function which I have written. I am trying to use the OPG
(outer product of gradients) to approximate the Hessian matrix, since the Hessian
matrix is sometimes difficult to calculate. Thus I want to extract the gradient matrix
from the gradient function.
Moreover, could R show the process of calculating the gradient
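A hedged illustration of the OPG idea on a simple normal model: stack the per-observation score vectors into an n x k matrix G, then use t(G) %*% G in place of a Hessian.
set.seed(1)
x        <- rnorm(200, mean = 1, sd = 2)
mu.hat   <- mean(x)
sig2.hat <- mean((x - mu.hat)^2)
G <- cbind((x - mu.hat) / sig2.hat,                                   # d log f / d mu
           (x - mu.hat)^2 / (2 * sig2.hat^2) - 1 / (2 * sig2.hat))    # d log f / d sigma^2
OPG <- t(G) %*% G        # OPG approximation to the information matrix
V   <- solve(OPG)        # covariance estimate without computing a Hessian
sqrt(diag(V))            # standard errors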
2010 Oct 14
1
robust standard errors for panel data - corrigendum
Hello again Max. A correction to my response from yesterday. Things were better than they seemed.
I thought it over, checked Arellano's panel book and Driscoll and Kraay (Rev. Econ. Stud. 1998) and finally realized that vcovSCC does what you want: in fact, despite being designed primarily for dealing with cross-sectional correlation, 'SCC' standard errors are robust to "both
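A minimal sketch of the 'SCC' (Driscoll-Kraay) standard errors being discussed, assuming a plm panel model; pdata, y, x, id and time are placeholder names for the user's data.
library(plm)
library(lmtest)
pm <- plm(y ~ x, data = pdata, index = c("id", "time"), model = "within")
coeftest(pm, vcov = vcovSCC)   # Driscoll-Kraay robust standard errors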
2012 Nov 15
1
hessian fails for box-constrained problems when close to boundary?
Hi
I am trying to recover the Hessian of a problem optimised with
box constraints. The problem is that in some cases my estimates are very
close to the boundary, which makes optim(..., hessian=TRUE) or
optimHess() fail, as they do not respect the box constraints and hence
evaluate the function in the infeasible parameter space.
As a simple example (my problem is more complex though,
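A hedged sketch of one workaround (not necessarily the list's answer): reparameterise a parameter constrained to (0, 1) through the logit, so finite differences can never step outside the feasible region, and take the Hessian on the unconstrained scale (mapping back with the delta method).
nll.box <- function(p) (p - 0.999)^2          # toy objective with optimum near the boundary
nll.unc <- function(z) nll.box(plogis(z))     # z unconstrained, p = plogis(z) in (0, 1)
fit <- optim(qlogis(0.5), nll.unc, method = "BFGS", hessian = TRUE)
plogis(fit$par)   # estimate back on the original (box-constrained) scale
fit$hessian       # Hessian on the unconstrained scale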
2009 Apr 21
4
My surprising experience in trying out REvolution's R
I care a lot about R's speed. So I decided to give REvolution's R
(http://revolution-computing.com/) a try, which bills itself as an
optimized R. Note that I used the free version.
My machine is an Intel Core 2 Duo under Windows XP Professional. The code
I run is at the end of this post.
First, the regular R 1.9. It takes 2 minutes and 6 seconds, CPU usage
50%
Next, REvolution's R.
2006 Jan 05
2
Wald tests and Huberized variances (was: A comment about R:)
On Wed, 4 Jan 2006, Peter Muhlberger wrote:
One comment in advance: please use a more meaningful subject. I would have
missed this mail if a colleague hadn't pointed me to it.
> I'm someone who from time to time comes to R to do applied stats for social
> science research.
[snip]
> I would also prefer not to have to work through a
> couple books on R or S+ to learn how to
2007 May 29
1
Estimate Fisher Information by Hessian from OPTIM
Dear All,
I am trying to find the MLE by using the "optim" function.
Since it is difficult to differentiate some parameters in my objective function, I
would like to use the returned Hessian matrix to yield an estimate of
Fisher's information matrix.
My question: since the Hessian is calculated by numerical differentiation, is
it a reliable estimate? Otherwise I would have to do a lot of work to
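A hedged sanity check (a suggestion, not necessarily the thread's answer): compare optim()'s finite-difference Hessian with numDeriv's Richardson-extrapolated one at the same point.
library(numDeriv)
set.seed(1)
x   <- rexp(200, rate = 2)
nll <- function(lambda) -sum(dexp(x, rate = lambda, log = TRUE))
fit <- optim(1, nll, method = "BFGS", hessian = TRUE)
H.optim    <- fit$hessian
H.numDeriv <- hessian(nll, fit$par)    # numDeriv::hessian
c(optim = 1 / H.optim[1, 1], numDeriv = 1 / H.numDeriv[1, 1])   # variance estimates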
2011 Jun 14
1
Using MLE Method to Estimate Regression Coefficients
Good Afternoon,
I am relatively new to R and have been trying to figure out how to estimate regression coefficients using the MLE method. Some background: I am trying to examine scenarios in which certain estimators might be preferred to others, starting with MLE. I understand that MLE will (should) produce the same results as Ordinary Least Squares if the assumption of normality holds. That
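A minimal sketch of the comparison described above: estimate a linear regression by maximum likelihood with optim() and compare with OLS from lm().
set.seed(1)
x <- rnorm(100)
y <- 1 + 2 * x + rnorm(100)
nll <- function(p)                                    # p = (intercept, slope, log sigma)
  -sum(dnorm(y, mean = p[1] + p[2] * x, sd = exp(p[3]), log = TRUE))
fit <- optim(c(0, 0, 0), nll, method = "BFGS", hessian = TRUE)
rbind(mle = fit$par[1:2], ols = coef(lm(y ~ x)))      # the coefficients should agree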
2011 Jul 20
0
The C function getQ0 returns a non-positive covariance matrix and causes errors in arima()
Hi,
The function makeARIMA(), designed to construct some state space
representation of an ARIMA model, uses a C function called getQ0,
which can be found at the end of arima.c in the R source files (library
stats). getQ0 takes two arguments, phi and theta, and returns the
covariance matrix of the state prediction error at time zero. The
reference for getQ0 (cited by help(arima)) is: