The Richards' curve is analytic, so nlsr::nlxb() should work better than
nls() for getting derivatives -- the dreaded "singular gradient" error will
likely stop nls(). Also likely, since even a 3-parameter logistic can suffer
from it (my long-standing Hobbs weed infestation problem below), is that the
Jacobian will be near-singular. And badly scaled. Nonlinear fitting problems
essentially have different scale in different portions of the parameter space.

You may also want to "fix" or mask one or more parameters to reduce the
dimensionality of the problem, and nlsr::nlxb() can do that.

The Hobbs problem has the following 12 data values for time points 1:12

# Data for Hobbs problem
ydat <- c(5.308, 7.24, 9.638, 12.866, 17.069, 23.192, 31.443,
          38.558, 50.156, 62.948, 75.995, 91.972) # for testing
tdat <- seq_along(ydat) # for testing

An unscaled model is

eunsc <- y ~ b1/(1+b2*exp(-b3*tt))

This problem looks simple, but has given lots of software grief over nearly
5 decades. In 1974 an extensive search had all commonly available software
failing, which led to the code that evolved into nlsr, though there are
plenty of cases where really awful code will luckily find a good solution.
The issue is getting a solution and knowing it is reasonable. I suspect a
Richards' model will be more difficult unless the OP has a lot of data and
maybe some external information to fix or constrain some parameters.

JN

On 2020-05-13 5:41 a.m., Peter Dalgaard wrote:
> Shouldn't be hard to set up with nls(). (I kind of suspect that the
> Richards curve has more flexibility than data can resolve, especially the
> subset (Q,B,nu) seems highly related, but hey, it's your data...)
>
> -pd
>
>> On 13 May 2020, at 11:26, Christofer Bogaso <bogaso.christofer at gmail.com> wrote:
>>
>> Hi,
>>
>> Is there any R package to fit Richards' curve in the form of
>> https://en.wikipedia.org/wiki/Generalised_logistic_function
>>
>> I found there is one package grofit, but currently defunct.
>>
>> Any pointer appreciated.
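A minimal sketch of the point being made, added for illustration (not part
of the original message): run nls() and nlsr::nlxb() on the Hobbs data from
deliberately crude starting values. The names weeddf, st, anls and anlxb are
made up for this sketch.

# --- Illustrative sketch (not from the original message) ---
library(nlsr)

ydat <- c(5.308, 7.24, 9.638, 12.866, 17.069, 23.192, 31.443,
          38.558, 50.156, 62.948, 75.995, 91.972)
weeddf <- data.frame(y = ydat, tt = seq_along(ydat))
eunsc <- y ~ b1/(1 + b2*exp(-b3*tt))
st <- c(b1 = 1, b2 = 1, b3 = 1)   # deliberately poor, badly scaled starts

# nls() commonly stops with "singular gradient" from these starts
anls <- try(nls(eunsc, data = weeddf, start = st))

# nlxb() uses analytic derivatives plus Marquardt stabilization and
# usually reaches the solution from the same starts
anlxb <- nlxb(eunsc, data = weeddf, start = st)
print(anlxb)
# see ?nlxb for bounds and for fixing ("masking") parameters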
John, have you ever looked at interval optimization as an alternative, since
it can lead to provably global minima?

Bernard

Sent from my iPhone so please excuse the spelling!

> On May 13, 2020, at 8:42 AM, J C Nash <profjcnash at gmail.com> wrote:
> [...]
Many moons ago (I think early 80s) I looked at some of the global optimizers,
including several based on intervals. For problems of this size, your
suggestion makes a lot of sense, though it has been so long since I looked at
those techniques that I will avoid detailed comment.

I've not looked to see if there are any such solvers for R, but would be
happy to learn (probably best off-list). Also I'm willing to work at a modest
pace on developing one. A starting point might be the nls2 package.

Best, JN

On 2020-05-13 11:05 a.m., Bernard Comcast wrote:
> John, have you ever looked at interval optimization as an alternative since
> it can lead to provably global minima?
>
> Bernard
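A rough sketch of the kind of "starting point" hinted at above, added for
illustration (not part of the thread, and not interval arithmetic proper):
nls2 can scan a box of candidate starting values by brute force or random
search, and the best candidate can then be polished with nlxb(). The box
limits and the names box, coarse and fit are made-up illustration values;
check ?nls2 for the exact conventions on two-row start data frames.

# --- Illustrative sketch (not from the message) ---
library(nls2)
library(nlsr)

ydat <- c(5.308, 7.24, 9.638, 12.866, 17.069, 23.192, 31.443,
          38.558, 50.156, 62.948, 75.995, 91.972)
weeddf <- data.frame(y = ydat, tt = seq_along(ydat))
eunsc <- y ~ b1/(1 + b2*exp(-b3*tt))

# two-row data frame giving a box of plausible parameter values
box <- data.frame(b1 = c(50, 500), b2 = c(1, 100), b3 = c(0.01, 1))

# coarse scan: evaluate many random starting points inside the box
coarse <- nls2(eunsc, data = weeddf, start = box,
               algorithm = "random-search",
               control = nls.control(maxiter = 1000, warnOnly = TRUE))

# polish the best candidate with a stabilized Gauss-Newton/Marquardt code
fit <- nlxb(eunsc, data = weeddf, start = coef(coarse))
print(fit)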
Also, in the full curve referenced on Wikipedia, the parameters Q and M are
confounded - you only need one or the other, not both. If you are using both
and trying to estimate them both, you will have problems. I have fitted these
curves quite easily using the Solver in Excel.

Bernard

Sent from my iPhone so please excuse the spelling!

> On May 13, 2020, at 8:42 AM, J C Nash <profjcnash at gmail.com> wrote:
> [...]
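A small numerical check of that confounding, added for illustration (not
part of the message). In the Wikipedia parameterisation
Y(t) = A + (K - A) / (C + Q*exp(-B*(t - M)))^(1/nu), Q and M enter only
through the product Q*exp(B*M), so two different (Q, M) pairs with the same
product give identical curves. The function name richards and all parameter
values below are made up for the demonstration.

# --- Illustrative sketch (not from the message) ---
# Q and M enter only through Q*exp(B*M), so fix one of them when fitting.
richards <- function(t, A, K, C, Q, B, M, nu) {
  A + (K - A) / (C + Q * exp(-B * (t - M)))^(1/nu)
}
t <- seq(0, 10, by = 0.5)
y1 <- richards(t, A = 0, K = 100, C = 1, Q = 2,                B = 0.8, M = 3, nu = 1.5)
y2 <- richards(t, A = 0, K = 100, C = 1, Q = 2 * exp(0.8 * 3), B = 0.8, M = 0, nu = 1.5)
all.equal(y1, y2)   # TRUE: different (Q, M), same fitted curve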