Ramiro Barrantes
2020-Aug-11 19:35 UTC
[R] Huge speed performance difference when using non-trivial fixed effects in NLMER vs NLME
Following Ben Bolker's methodology (described here: https://rpubs.com/bbolker/3423), I incorporated non-trivial fixed effects in nlmer for a four-parameter logistic. I placed a reproducible example here: https://rpubs.com/ramirob/648103

To summarize the question: with a dataset of individuals in groups, where we have group-specific fixed effects, nlme's run time stays roughly constant as the number of groups grows:

## [1] "NLME Time Required for data2Groups: 0.0458040237426758"

fit3Groups <- fitNLME(data3Groups, initialValues3Groups)
## [1] "NLME Time Required for data3Groups: 0.0375699996948242"

fit4Groups <- fitNLME(data4Groups, initialValues4Groups)
## [1] "NLME Time Required for data4Groups: 0.0526559352874756"

fit5Groups <- fitNLME(data5Groups, initialValues5Groups)
## [1] "NLME Time Required for data5Groups: 0.0502560138702393"

But when we do the analogous thing in nlmer, the time required grows with the number of groups:

## [1] "Time required for the data2Groups: 0.404773950576782"

fitNlmer3Groups <- fitNlmer(data3Groups, initialValues3Groups)
## [1] "Time required for the data3Groups: 0.579570055007935"

fitNlmer4Groups <- fitNlmer(data4Groups, initialValues4Groups)
## [1] "Time required for the data4Groups: 0.957509994506836"

fitNlmer5Groups <- fitNlmer(data5Groups, initialValues5Groups)
## [1] "Time required for the data5Groups: 1.68412184715271"

In addition, nlmer is much slower in general. This is just a short example, but for more complicated cases the differences in run time are huge (minutes vs. seconds).

Is nlmer "worth the wait" (e.g. less fragile, better convergence) when fitting non-trivial fixed effects? Is there a better methodology than the one Ben Bolker described back in 2013?

Any insight appreciated. Again, you can see a reproducible example here: https://rpubs.com/ramirob/648103

Thank you!
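For anyone who wants to tinker without opening the rpubs link, here is a minimal self-contained sketch of the two approaches for a four-parameter logistic with a group-specific xmid. It is not the fitNLME()/fitNlmer() wrappers from the reproducible example above; the simulated data, variable names, starting values, and the deriv()-based gradient function are illustrative assumptions, roughly in the spirit of Ben Bolker's write-up (nlme takes covariates on parameters through its 'fixed' argument, while nlmer needs each group contrast expanded into an extra named parameter):

library(nlme)
library(lme4)

set.seed(1)

## Illustrative (assumed) data: two groups, 8 individuals each, four-parameter
## logistic with a group shift in xmid and an individual-level random shift in A.
dat       <- expand.grid(id = factor(1:16), x = seq(-4, 4, length.out = 9))
dat$group <- factor(ifelse(as.integer(dat$id) <= 8, "g1", "g2"))
dat$g2    <- as.numeric(dat$group == "g2")            # dummy for the group effect
dat$Ai    <- rnorm(16, sd = 0.1)[as.integer(dat$id)]  # random shift in A per individual
dat$y     <- with(dat, Ai + (1 - Ai) / (1 + exp(((-0.5 + 1.0 * g2) - x) / 1))) +
             rnorm(nrow(dat), sd = 0.05)

## nlme: the group-specific xmid is declared through the 'fixed' argument.
t_nlme <- system.time(
  fit_nlme <- nlme(y ~ SSfpl(x, A, B, xmid, scal),
                   fixed  = list(A ~ 1, B ~ 1, xmid ~ group, scal ~ 1),
                   random = A ~ 1 | id,
                   start  = c(0, 1, -0.5, 1, 1),  # A, B, xmid.(Intercept), xmid.groupg2, scal
                   data   = dat)
)

## nlmer: no 'fixed' argument, so the group contrast becomes an extra named
## parameter (xmid_g2) inside a deriv()-built function that supplies its own
## gradient with respect to every parameter.
fpl <- deriv(~ A + (B - A) / (1 + exp(((xmid0 + xmid_g2 * g2) - x) / scal)),
             namevec      = c("A", "B", "xmid0", "xmid_g2", "scal"),
             function.arg = c("x", "g2", "A", "B", "xmid0", "xmid_g2", "scal"))

t_nlmer <- system.time(
  fit_nlmer <- nlmer(y ~ fpl(x, g2, A, B, xmid0, xmid_g2, scal) ~ A | id,
                     data  = dat,
                     start = c(A = 0, B = 1, xmid0 = -0.5, xmid_g2 = 1, scal = 1))
)

rbind(nlme = t_nlme["elapsed"], nlmer = t_nlmer["elapsed"])

Because the nlmer route adds one named parameter (and one gradient column) per extra group contrast, the problem nlmer solves grows with the number of groups, which may be part of why its run time scales the way the timings above show.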
Bert Gunter
2020-Aug-11 23:42 UTC
[R] Huge speed performance difference when using non-trivial fixed effects in NLMER vs NLME
This should be posted on the r-sig-mixed-models list rather than here. The interest and expertise you seek is more likely to be found there.

Bert Gunter

"The trouble with having an open mind is that people keep coming along and sticking things into it." -- Opus (aka Berkeley Breathed in his "Bloom County" comic strip)