Duncan Murdoch
2024-Dec-13 18:45 UTC
[R] Non-linear optimization with nloptr package fails to produce true optimal result
You posted a version of this question on StackOverflow, and were given advice there that you ignored.

nloptr() clearly indicates that it is quitting without reaching an optimum, but you are hiding that message. Don't do that.

Duncan Murdoch

On 2024-12-13 12:52 p.m., Daniel Lobo wrote:

> library(nloptr)
>
> set.seed(1)
> A <- 1.34
> B <- 0.5673
> C <- 6.356
> D <- -1.234
> x <- seq(0.5, 20, length.out = 500)
> y <- A + B * x + C * x^2 + D * log(x) + runif(500, 0, 3)
>
> #Objective function
>
> X <- cbind(1, x, x^2, log(x))
> f <- function(theta) {
>   sum(abs(X %*% theta - y))
> }
>
> #Constraint
>
> eps <- 1e-4
>
> hin <- function(theta) {
>   abs(sum(X %*% theta) - sum(y)) - 1e-3 + eps
> }
>
> Hx <- function(theta) {
>   X[100, , drop = FALSE] %*% theta - (120 - eps)
> }
>
> #Optimization with nloptr
>
> Sol = nloptr(rep(0, 4), f, eval_g_ineq = hin, eval_g_eq = Hx, opts =
>   list("algorithm" = "NLOPT_LN_COBYLA", "xtol_rel" = 1.0e-8))$solution
> # -0.2186159 -0.5032066  6.4458823 -0.4125948
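[A minimal sketch of the point Duncan is making, assuming the f, hin, and Hx definitions quoted above: keep the full object returned by nloptr() and look at its status and message fields instead of extracting $solution immediately.]

library(nloptr)

res <- nloptr(rep(0, 4), f, eval_g_ineq = hin, eval_g_eq = Hx,
              opts = list("algorithm" = "NLOPT_LN_COBYLA", "xtol_rel" = 1.0e-8))

print(res)     # prints the NLopt solver status and termination message
res$status     # numeric NLopt status code
res$message    # text explaining why the solver stopped
res$solution   # only worth trusting after checking status/message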
Daniel Lobo
2024-Dec-13 18:55 UTC
[R] Non-linear optimization with nloptr package fails to produce true optimal result
Hi Duncan,

I take your advice. I posted here in search of a better answer to my problem, since I could not get one there. My questions are:

1. Why does nloptr() fail where other programs succeed with the same data, numbers, and constraints?
2. Is this enough ground to say that nloptr is inferior and that users should avoid it for complex problems?

I would appreciate a thoughtful answer, because my working environment has only the nloptr package installed; it is an isolated system for security reasons, and installing a new package requires many approvals and is time-consuming.

BTW, if anyone is interested, here is my original post: https://stackoverflow.com/a/79271318/15910619

On Sat, 14 Dec 2024 at 00:15, Duncan Murdoch <murdoch.duncan at gmail.com> wrote:

> You posted a version of this question on StackOverflow, and were given advice there that you ignored.
>
> nloptr() clearly indicates that it is quitting without reaching an optimum, but you are hiding that message. Don't do that.
>
> Duncan Murdoch
J C Nash
2024-Dec-13 19:30 UTC
[R] Non-linear optimization with nloptr package fails to produce true optimal result
The following may or may not be relevant, but I am definitely getting somewhat different results. As this was a quick and dirty try while having a snack, it may have bugs.

# Lobo2412.R -- from R Help 20241213
# Original artificial data
library(optimx)
library(nloptr)
library(alabama)

set.seed(1)
A <- 1.34
B <- 0.5673
C <- 6.356
D <- -1.234
x <- seq(0.5, 20, length.out = 500)
y <- A + B * x + C * x^2 + D * log(x) + runif(500, 0, 3)

# Objective function
X <- cbind(1, x, x^2, log(x))
flobo <- function(theta) {
  sum(abs(X %*% theta - y))
}

# Constraint
eps <- 1e-4

hinlobo <- function(theta) {
  abs(sum(X %*% theta) - sum(y)) - 1e-3 + eps  # ?? weird! (1e-4 - 1e-3)
}

Hxlobo <- function(theta) {
  X[100, , drop = FALSE] %*% theta - (120 - eps)  # ditto -- also constant
}

conobj <- function(tt) {
  ob <- flobo(tt)
  ci <- hinlobo(tt)
  if (ci > 0) { ci <- 0 }
  ce <- Hxlobo(tt)
  si <- 1; se <- 1
  val <- ob + si * ci^2 + se * ce^2
  cat("f, ci, ce,ob,val:", " ", ci, " ", ce, " ", ob, " ", val, " at "); print(tt)
  val
}

t0 <- rep(0, 4)
conobj(t0)
t1 <- c(2.02, 6.764, 6.186, -20.095)
conobj(t1)
t2 <- c(-0.2186159, -0.5032066, 6.4458823, -0.4125948)
conobj(t2)

solo <- optimr(t0, conobj, gr = "grcentral", method = "anms", control = list(trace = 1))
solo
conobj(solo$par)

# Optimization with nloptr
# Sol = nloptr::auglag(t0, flobo, eval_g_ineq = hinlobo, eval_g_eq = Hxlobo, opts =
#   list("algorithm" = "NLOPT_LN_COBYLA", "xtol_rel" = 1.0e-8, print_level = 1))
# -0.2186159 -0.5032066 6.4458823 -0.4125948

sol <- auglag(par = t0, fn = flobo, hin = hinlobo, heq = Hxlobo, control.outer = list(trace = TRUE))
sol
#=================================

J Nash
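[A compact side check, assuming the flobo(), hinlobo(), and Hxlobo() definitions from the script above: tabulate the objective and both constraint values at each candidate vector mentioned there, which makes it easy to see which candidates are feasible and which is cheaper.]

cands <- list(start  = rep(0, 4),
              t1     = c(2.02, 6.764, 6.186, -20.095),
              nloptr = c(-0.2186159, -0.5032066, 6.4458823, -0.4125948))
for (nm in names(cands)) {
  th <- cands[[nm]]
  cat(sprintf("%-7s f = %12.3f  hin = %12.4f  Heq = %12.4f\n",
              nm, flobo(th), hinlobo(th), Hxlobo(th)))
}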
John Fox
2024-Dec-13 19:44 UTC
[R] Non-linear optimization with nloptr package fails to produce true optimal result
Dear Daniel et al.,

Following on Duncan's remark and examining the message produced by nloptr(), I simply tried increasing the maximum number of function evaluations:

------ snip -------

> nloptr(rep(0, 4), f, eval_g_ineq = hin, eval_g_eq = Hx, opts =
+            list("algorithm" = "NLOPT_LN_COBYLA", "xtol_rel" = 1.0e-8,
+                 maxeval = 1e5)
+ )

Call:
nloptr(x0 = rep(0, 4), eval_f = f, eval_g_ineq = hin, eval_g_eq = Hx,
    opts = list(algorithm = "NLOPT_LN_COBYLA", xtol_rel = 1e-08,
        maxeval = 1e+05))

Minimization using NLopt version 2.7.1

NLopt solver status: 4 ( NLOPT_XTOL_REACHED: Optimization stopped because
xtol_rel or xtol_abs (above) was reached. )

Number of Iterations....: 46317
Termination conditions:  xtol_rel: 1e-08   maxeval: 1e+05
Number of inequality constraints:  1
Number of equality constraints:    1
Optimal value of objective function:  1287.71725107671
Optimal value of controls: 1.576708 6.456606 6.195305 -19.008

---------- snip ----------

That produces a solution closer to, and better than, the one that you suggested (which you obtained how?):

> f(c(0.222, 6.999, 6.17, -19.371))
[1] 1325.076

I hope this helps,
 John

--
John Fox, Professor Emeritus
McMaster University
Hamilton, Ontario, Canada
web: https://www.john-fox.ca/
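[A quick sanity check, assuming the f, hin, and Hx definitions from the original post: evaluate the objective and both constraints at the parameter vector reported above. The values should roughly reproduce the objective printed by nloptr and show whether the constraints are satisfied.]

theta_hat <- c(1.576708, 6.456606, 6.195305, -19.008)  # controls reported above

f(theta_hat)    # objective; should be close to the 1287.717 reported above
hin(theta_hat)  # inequality constraint; eval_g_ineq treats values <= 0 as satisfied
Hx(theta_hat)   # equality constraint; should be numerically close to 0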