Just by chance I came across the following example of minimizing
a simple function
(x,y,z) --> 2 (x^2 - y z)
on the unit sphere, the only constraint present.
I tried it with two starting points, x0 = (1,0,0) and x1 = (0,0,1).
#-- Problem definition in R
f = function(x) 2 * (x[1]^2 - x[2]*x[3]) # (x,y,z) |-> 2(x^2 -yz)
g = function(x) c(4*x[1], 2*x[3], 2*x[2]) # its gradient
x0 = c(1, 0, 0); x1 = c(0, 0, 1) # starting points
xmin = c(0, 1/sqrt(2), 1/sqrt(2)) # a true minimizer, f(xmin) = -1
heq = function(x) 1-x[1]^2-x[2]^2-x[3]^2 # staying on the sphere
conf = function(x) {                       # constraint function
    fun = x[1]^2 + x[2]^2 + x[3]^2 - 1
    return(list(ceq = fun, c = NULL))
}
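
A quick check of the stated minimum:

f(xmin)      # -1, the minimal value on the sphere
heq(xmin)    # 0, i.e. xmin satisfies the constraint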
I tried all the nonlinear optimization solvers in R packages that
allow for equality constraints: 'auglag()' in alabama, 'solnl()' in
NlcOptim, 'auglag()' in nloptr, 'solnp()' in Rsolnp, and even
'donlp2()' from the Rdonlp2 package (on R-Forge).
None of them worked from both starting points:
# alabama
alabama::auglag(x0, fn = f, gr = g, heq = heq) # right (inaccurate)
alabama::auglag(x1, fn = f, gr = g, heq = heq) # wrong
# NlcOptim
NlcOptim::solnl(x0, objfun = f, confun = conf) # wrong
NlcOptim::solnl(x1, objfun = f, confun = conf) # right
# nloptr
nloptr::auglag(x0, fn = f, heq = heq) # wrong
# nloptr::auglag(x1, fn = f, heq = heq) # not returning
# Rsolnp
Rsolnp::solnp(x0, fun = f, eqfun = heq) # wrong
Rsolnp::solnp(x1, fun = f, eqfun = heq) # wrong
# Rdonlp2
Rdonlp2::donlp2(x0, fn = f, nlin = list(heq),    # wrong
                nlin.lower = 0, nlin.upper = 0)
Rdonlp2::donlp2(x1, fn = f, nlin = list(heq),    # right
                nlin.lower = 0, nlin.upper = 0)  # (fast and exact)
The problem with starting point x0 appears to be that the gradient at
that point, projected onto the unit sphere, is zero. Only alabama is
able to handle this somehow.
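
To illustrate (a sketch only; it uses a numerical gradient from the
numDeriv package, and the helper pg is introduced here just for this check):

pg = function(x) {                      # gradient of f projected onto the
    gr = numDeriv::grad(f, x)           # tangent plane of the unit sphere
    gr - sum(gr * x) * x
}
pg(x0)   # ~ c(0, 0, 0): x0 is a stationary point of f on the sphere
pg(x1)   # nonzero: x1 is not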
I do not know what problem most solvers have with starting point x1.
The fact that Rdonlp2 is the fastest and most accurate is no surprise.
If anyone with more experience with one or more of these packages can
give a hint about what I did wrong, or how to change the solver calls
so that they run correctly, please let me know.
Thanks -- HW
Mark Leeds
2021-May-21 15:58 UTC
[R] Testing optimization solvers with equality constraints
Hi Hans: I think that you are missing minus signs in the 2nd and 3rd elements of your gradient. Also, I don't know how all of the optimization functions work as far as their arguments, but it's best to supply the gradient when possible. I hope it helps.
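
With that fix, the gradient would read (a minimal sketch of the suggestion):

g = function(x) c(4*x[1], -2*x[3], -2*x[2])   # d/dx, d/dy, d/dz of 2*(x^2 - y*z)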
Abby Spurdle
2021-May-22 07:42 UTC
[R] Testing optimization solvers with equality constraints
Sorry, this might sound like a poor question: but by "on the unit sphere", do you mean on the ***surface*** of the sphere? In which case, can't the surface of a sphere be projected onto a pair of circles, so that the cost function is reformulated as a function of two (rather than three) variables?
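
One way to sketch that idea (an illustration only, assuming a standard
two-angle spherical parameterisation and reusing f from the original post):

sph = function(p) c(sin(p[2])*cos(p[1]), sin(p[2])*sin(p[1]), cos(p[2]))
f2  = function(p) f(sph(p))        # objective in the two angles (theta, phi)
res = optim(c(pi/4, pi/4), f2)     # plain unconstrained optim in two variables
sph(res$par)                       # map the solution back onto the sphere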
Gabor Grothendieck
2021-May-27 23:13 UTC
[R] Testing optimization solvers with equality constraints
In case it is of interest, this problem can be solved with an
unconstrained optimizer, here optim, like this:
proj <- function(x) x / sqrt(sum(x * x))          # radial projection onto the unit sphere
opt <- optim(c(0, 0, 1), function(x) f(proj(x)))  # unconstrained minimization of the composition
proj(opt$par)                                     # project the solution back onto the sphere
## [1] 5.388907e-09 7.071068e-01 7.071068e-01
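
As a quick check, the objective at that point can be evaluated as well:

f(proj(opt$par))   # approximately -1, the known minimum value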
--
Statistics & Software Consulting
GKX Group, GKX Associates Inc.
tel: 1-877-GKX-GROUP
email: ggrothendieck at gmail.com