similar to: ML optimization question--unidimensional unfolding scaling

Displaying 20 results from an estimated 5000 matches similar to: "ML optimization question--unidimensional unfolding scaling"

2005 Nov 03
1
ML optimization question--unidimensional unfolding scaling
Alternatively, just type debug(optim) before using it, then step through it by hitting enter repeatedly... When you're done, do undebug(optim). Andy > From: Spencer Graves > Have you looked at the code for "optim"? If you execute "optim", it will list the code. You can copy that into a script file and walk through it line by line to
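A minimal sketch of the debugging workflow described above, with a toy objective function standing in for the real one:

  f <- function(p) sum((p - c(1, 2))^2)   # toy quadratic objective
  debug(optim)                            # flag optim for interactive stepping
  fit <- optim(c(0, 0), f)                # R now stops inside optim; press Enter to step through
  undebug(optim)                          # restore normal behaviour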
2012 May 21
1
simple, unidimensional heat map
I was wondering if someone could point me in the direction of a package that could generate not heatmaps, but something like a unidimensional heat map. I might be mistaken, but it seems like image and heatmap are overkill for such a simple task. For example, if I have a data frame: x<-data.frame(myname=paste("value",1:10,sep=""),a=1:10,b=sample(1:10,10,replace=T)) I'd
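One possible approach (an assumption, not the thread's answer) is to feed a single-column matrix to base image(), which produces exactly one row of coloured cells:

  x <- data.frame(myname = paste("value", 1:10, sep = ""),
                  a = 1:10, b = sample(1:10, 10, replace = TRUE))
  image(matrix(x$b, ncol = 1), col = heat.colors(10),
        axes = FALSE, xlab = "", ylab = "")
  axis(1, at = seq(0, 1, length.out = nrow(x)), labels = x$myname, las = 2)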
2005 Dec 22
2
Testing a linear hypothesis after maximum likelihood
I'd like to be able to test linear hypotheses after setting up and running a model using optim or perhaps nlm. One hypothesis I need to test is that the average of several coefficients is less than zero, so I don't believe I can use the likelihood ratio test. I can't seem to find a provision anywhere for testing linear combinations of coefficients after maximum likelihood. Cheers
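One common way to do this by hand (a sketch, not something from the original thread) is a Wald-type test: ask optim for the Hessian of the negative log-likelihood, invert it to approximate the covariance of the estimates, and test the linear combination with a normal approximation. Here negll and start are hypothetical placeholders for the user's negative log-likelihood and starting values:

  fit <- optim(start, negll, hessian = TRUE)       # minimise the negative log-likelihood
  vcov.hat <- solve(fit$hessian)                   # approximate covariance matrix
  w <- rep(1 / length(fit$par), length(fit$par))   # weights giving the average of the coefficients
  est <- sum(w * fit$par)
  se <- sqrt(drop(t(w) %*% vcov.hat %*% w))
  pnorm(est / se)                                  # approximate p-value for H1: average < 0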
2003 Jul 13
3
How robust is mle in R?
A newbie question: I'm trying to decide whether to run a maximum likelihood estimation in R or Stata and am wondering if the R mle routine is reasonably robust. I'm fairly certain that, with this data, in Stata I would get a lot of complaints about non-concave functions and unproductive steps attempted, but would eventually have a successful ML estimate. I believe that, with the
2004 Oct 12
2
constrained optimization using nlm/optim?
I'm looking for an example of a simple R script that implements constrained optimization of a nonlinear function using nlm or optim. I'm not exactly sure how to implement the constraints within the objective function that is passed to nlm/optim. obj.func <- function( p ) { x(p) <- unconstrained obj function value if( constraint1 > something ) { obj.func <- x(p) } else {
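For simple box constraints, one option (rather than branching inside the objective) is optim's "L-BFGS-B" method, which takes lower and upper bounds; more general constraints are often handled with a penalty term added to the objective. A hypothetical sketch:

  obj <- function(p) (p[1] - 2)^2 + (p[2] + 1)^2                   # placeholder objective
  optim(c(1, 1), obj, method = "L-BFGS-B",
        lower = c(0, 0), upper = c(5, 5))                          # box constraints 0 <= p <= 5
  obj.pen <- function(p) obj(p) + 1e6 * max(0, p[1] + p[2] - 3)^2  # penalty for p[1] + p[2] <= 3
  optim(c(1, 1), obj.pen)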
2003 Jul 08
2
specifying multiple parameter starting values in nlm
Hi there I am having trouble figuring out how to get an nlm function to report estimates for two parameter values in an estimation. The way I've got it goes something like this: f <- function (q, r) { here, I have a second loop which uses q, r to give me values for c, d below. a and b are already specified; this loop is a mass-balance function where I am trying to find values of q,
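nlm() optimises over a single numeric vector, so the usual pattern is to pack q and r into one vector and unpack them inside the function; nlm then reports estimates for both. A minimal sketch with a stand-in objective (not the poster's mass-balance model):

  f <- function(p) {
    q <- p[1]; r <- p[2]
    (q - 3)^2 + (r - 0.5)^2          # placeholder for the real objective
  }
  out <- nlm(f, p = c(1, 1))         # starting values for q and r
  out$estimate                       # estimates of both parameters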
2011 Sep 22
1
nlm's Hessian update method
Hi R-help! I'm trying to understand how R's nlm function updates its estimate of the Hessian matrix. The Dennis/Schnabel book cited in the references presents a number of different ways to do this, and seems to conclude that the positive-definite secant method (BFGS) works best in practice (p201). However, when I run my code through the optim function with the method as "BFGS",
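For side-by-side experiments, both optimisers can return a Hessian at the solution, though in both cases it is a finite-difference estimate at the optimum rather than the internal update matrix the question is about. A small sketch:

  f <- function(p) sum((p - c(1, 2))^2)
  nlm(f, c(0, 0), hessian = TRUE)$hessian
  optim(c(0, 0), f, method = "BFGS", hessian = TRUE)$hessian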
2007 Sep 16
1
Problem with nlm() function.
In the course of revising a paper I have had occasion to attempt to maximize a rather complicated log likelihood using the function nlm(). This is at the demand of a referee who claims that this will work better than my proposed use of a home-grown implementation of the Levenberg-Marquardt algorithm. I have run into serious hiccups in attempting to apply nlm(). If I provide gradient and
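For reference, nlm() accepts an analytic gradient (and Hessian) supplied as attributes on the value returned by the objective function. A toy sketch with a simple quadratic rather than the complicated log likelihood in question:

  f <- function(p) {
    val <- sum((p - c(1, 2))^2)
    attr(val, "gradient") <- 2 * (p - c(1, 2))   # analytic gradient
    val
  }
  nlm(f, c(0, 0), check.analyticals = TRUE)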
2006 Sep 26
1
warning message in nlm
Dear R-users, I am trying to find the MLEs for a loglikelihood function (loglikcs39) and tried using both optim and nlm.
fredcs39 <- function(b1, b2, x) { return(exp(b1 + b2 * x)) }
loglikcs39 <- function(theta, len) {
  sum(mcs39[1:len] * fredcs39(theta[1], theta[2], c(8:(7 + len))) -
      pcs39[1:len] * log(fredcs39(theta[1], theta[2], c(8:(7 + len)))))
}
theta.start <- c(0.1, 0.1)
1. The output from using optim is
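Assuming mcs39 and pcs39 are the poster's data vectors, the two calls would look along these lines (with len set to however many observations are used):

  optim(theta.start, loglikcs39, len = length(mcs39))
  nlm(loglikcs39, theta.start, len = length(mcs39))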
2003 Oct 06
1
getting names of p vector in nlm function...
Dear R programming folks: I'm trying to finish off a package for non-linear simultaneous system estimation and I've been trying to figure out how to get the names of the parameter vector variables when inside the function that nlm calls to return the objective function value: knls <- function( theta, eqns, data, fitmethod="OLS", instr=NULL, S=NULL ) { ## print(
2003 Sep 30
1
can't get names from vector in nlm calls
I've been trying to figure out how to get the names of the parameter vector variables when inside the function that nlm calls to return the objective function value: knls <- function( theta, eqns, data, fitmethod="OLS", instr=NULL, S=NULL ) { ## print( names( theta ) ) # returns NULL ## get the values of the parameters for( i in 1:length( theta ) )
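Since nlm() hands the objective a bare numeric vector (which is why names(theta) comes back NULL), one common workaround is to keep the names in a separate variable and re-attach them inside the function. A sketch with made-up parameter names:

  par.names <- c("alpha", "beta")
  f <- function(theta) {
    names(theta) <- par.names                    # restore the names nlm drops
    unname((theta["alpha"] - 1)^2 + (theta["beta"] - 2)^2)
  }
  nlm(f, c(0, 0))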
2005 Dec 04
1
Understanding nonlinear optimization and Rosenbrock's banana valley function?
GENERAL REFERENCE ON NONLINEAR OPTIMIZATION? What are your favorite references on nonlinear optimization? I like Bates and Watts (1988) Nonlinear Regression Analysis and Its Applications (Wiley), especially for its key insights regarding parameter effects vs. intrinsic curvature. Before I spent time and money on several of the references cited on the help pages for "optim",
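For concreteness, the banana valley function in the subject is the standard Rosenbrock test problem f(x, y) = 100*(y - x^2)^2 + (1 - x)^2, minimised at (1, 1); it also appears as the example in ?optim:

  fr <- function(p) 100 * (p[2] - p[1]^2)^2 + (1 - p[1])^2
  optim(c(-1.2, 1), fr)        # Nelder-Mead from the classic starting point
  nlm(fr, c(-1.2, 1))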
2006 Jan 20
2
big difference in estimate between dmvnorm and dnorm, how come?
Dear R community, I was trying to estimate the density at point zero of a multivariate distribution (9 dimensions), and for this I was using a multinormal approximation and the function dmvnorm from the gtools package. To get a sense of the error I tried to look at the mismatch between a unidimensional version of my distribution and the density estimate at point zero from the functions density, dmvnorm and dnorm. At
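As a sanity check of the unidimensional comparison, dmvnorm (taken here from the mvtnorm package, the usual home of that function) agrees exactly with dnorm in one dimension, so any remaining mismatch must come from the kernel density estimate:

  library(mvtnorm)
  dnorm(0, mean = 1, sd = 2)
  dmvnorm(0, mean = 1, sigma = matrix(4))        # identical value in 1-D
  d <- density(rnorm(1e4, mean = 1, sd = 2))
  approx(d$x, d$y, xout = 0)$y                   # kernel estimate at zero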
2015 Jun 07
2
[LLVMdev] Loop Unfolding in LLVM
Hello, I am looking for a loop unfolding procedure implemented in LLVM that helps to transform a while-loop into n-layer if-statements. The transformation should be on IR, although the example below is illustrated at the source level. Original loop: WHILE (condition) DO action ENDWHILE Expected unfolded loop (2-layer): IF (condition) THEN action IF
2005 Mar 08
4
Non-linear minimization
hello, I have got some trouble with the R functions nlm(), nls() and optim(): I would like to fit 3 parameters which must stay in a precise interval. For example with nlm(): fn<-function(p) sum((dN-estdata(p[1],p[2],p[3]))^2) out<-nlm(fn, p=c(4, 17, 5), hessian=TRUE, print.level=2) with estdata() a function which returns the values to fit against dN (the observed data vector). My problem is that only
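One way to keep each parameter inside its interval is to switch from nlm() to optim() with method "L-BFGS-B", which accepts per-parameter bounds (another is to transform the parameters so the search space is unconstrained). A sketch using the poster's objective, with purely illustrative bounds:

  fn <- function(p) sum((dN - estdata(p[1], p[2], p[3]))^2)   # dN and estdata() as in the post
  out <- optim(c(4, 17, 5), fn, method = "L-BFGS-B",
               lower = c(1, 10, 1), upper = c(10, 25, 10),    # placeholder interval limits
               hessian = TRUE)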
2008 Aug 08
2
Suggestion for the optimization code
Dear list, here's a suggestion about the different optimization codes. There are several optimization procedures in the base package (optim, optimize, nlm, nlminb, ...). However, the output of these functions is slightly different. For instance, 1. optim returns a list with components par (the estimates), value (the minimum or maximum of the objective function), convergence (optim
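For reference, the corresponding components really are named differently across the optimisers, which is the inconsistency the post points at; for example, with a trivial objective:

  f <- function(p) sum((p - 1)^2)
  names(optim(c(0, 0), f))     # "par" "value" "counts" "convergence" "message"
  names(nlm(f, c(0, 0)))       # "minimum" "estimate" "gradient" "code" "iterations"
  names(nlminb(c(0, 0), f))    # includes "par", "objective", "convergence", "message"
  optimize(function(x) (x - 1)^2, c(0, 2))   # returns "minimum" and "objective"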
2011 Jul 12
1
LOESS function Newton optimization
I have a question about running an optimization function on an existing LOESS function defined in R. I have a very large dataset (1 million observations) and have run a LOESS regression. Now, I want to run a Newton-Raphson optimization to determine the point at which the slope change is the greatest. I am relatively new to R and have tried several permutations of the maxNR and nlm functions with
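One way to frame this (assuming "greatest slope change" means the point where the second derivative of the fitted curve is largest in absolute value) is to take finite differences of predict() on the loess fit and hand that to a one-dimensional optimiser; optimize() is used below, but Newton-type routines such as nlm or maxNR from the maxLik package fit the same pattern. A rough sketch on simulated data:

  set.seed(1)
  x <- sort(runif(1000, 0, 10)); y <- sin(x) + rnorm(1000, sd = 0.2)
  fit <- loess(y ~ x)
  d2 <- function(t, h = 0.05) {                # finite-difference second derivative
    p <- predict(fit, newdata = data.frame(x = c(t - h, t, t + h)))
    (p[1] - 2 * p[2] + p[3]) / h^2
  }
  optimize(function(t) -abs(d2(t)), interval = c(0.5, 9.5))$minimum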
2008 May 23
2
About Passing Arguments to Function
Hi, below I have a function mlogl_k; later it's called with "nlm".
__BEGIN__
vsamples <- c(14.7, 18.8, 14, 15.9, 9.7, 12.8)
mlogl_k <- function(k_func, x_func, theta_func, samp) {
  tot_mll <- 0
  for (comp in 1:k_func) {
    curr_mll <- -sum(dgamma(samp, shape = x_func, scale = theta_func, log = TRUE))
    tot_mll <- tot_mll + curr_mll
  }
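For what it's worth, nlm() always passes the current parameter vector as the first argument of the objective and forwards anything given in its "..." to the remaining arguments. A simplified two-parameter version of the gamma negative log-likelihood above, with the parameters kept on the log scale so they stay positive:

  vsamples <- c(14.7, 18.8, 14, 15.9, 9.7, 12.8)
  mlogl <- function(logpar, samp)              # logpar = log(c(shape, scale))
    -sum(dgamma(samp, shape = exp(logpar[1]), scale = exp(logpar[2]), log = TRUE))
  fit <- nlm(mlogl, c(0, 0), samp = vsamples)  # samp is passed through "..."
  exp(fit$estimate)                            # shape and scale estimates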
2010 Oct 22
2
Error message in using nlm() and optim()
I am facing a problem when trying to maximize a likelihood function. I am actually estimating a dynamic switching regression model using a simulated likelihood approach. The likelihood function is estimated by simulation and is extremely complex. It comprises 16 parameters q1, q2, q3, ..., q16. While attempting to maximize the likelihood function using the functions nlm() and optim(),
2010 Jun 15
1
Error in nlm : non-finite value supplied by 'nlm'
Hello, I am trying to compute the MLE for a non-Gaussian AR(1) model. The error term follows a Poisson difference distribution. This distribution has one parameter (vector[2]), so in total I want to estimate two parameters: the AR(1) parameter (vector[1]) and the distribution parameter. My function is the negative loglikelihood derived from a mixing operator. f=function(vector)
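This error usually means the objective returned NA, NaN or Inf for some parameter values that nlm tried. A common workaround, independent of the particular likelihood here, is to trap non-finite values and return a large finite number instead so the optimiser backs away from that region (the starting values below are made up):

  f.safe <- function(vector) {
    val <- f(vector)                 # f is the negative log-likelihood defined above
    if (is.finite(val)) val else 1e10
  }
  nlm(f.safe, c(0.5, 0.5))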