Duncan, Laura
2017-Feb-27 19:05 UTC
[R] Metafor multilevel metaregression: total variance increases when moderator added?
Hi there,

I am running a two-level multilevel meta-regression of 170 estimates nested within 3 informants nested within 26 studies. I run the null model to get a pooled estimate with random effects at the informant level and study level.

Then I test a series of potential moderators (one at a time, given the small number of studies, and adjust p-values for multiple testing). I use:

(sum(Model1$sigma2) - sum(Model2$sigma2)) / sum(Model1$sigma2)

to compute the proportional reduction in the total variance, following:
http://stackoverflow.com/questions/22356450/getting-r-squared-from-a-mixed-effects-multilevel-model-in-metafor

For one moderator, I get a negative value for the reduction in total variance and an unexpected negative coefficient. Based on Wolfgang's response in the link above this is possible: "depending on the size of your dataset, those variance components may not be estimated very precisely and that can lead to such counter-intuitive results".

I am trying to diagnose why this model is not being estimated properly and why I am getting an unexpected negative result. When I remove the second level from the model and run a single-level random-effects model of 170 estimates nested within 26 studies, the coefficient is positive and as we would expect.

Does anyone have any suggestions for what might be going on or how I might diagnose the problem with this model?

Thanks,
Laura

Laura Duncan, M.A.
Research Coordinator
Offord Centre for Child Studies
McMaster University

Tel: 905 525 9140 x21504
Fax: 905 574 6665
duncanlj at mcmaster.ca
ontariochildhealthstudy.ca
offordcentre.com

Mailing Address: 1280 Main St. W. MIP 201A, Hamilton, Ontario L8S 4K1
Courier Address: 175 Longwood Rd. S. MIP 201A, Hamilton, Ontario L8P 0A1
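[For reference, the setup described in the post can be sketched with metafor's rma.mv() roughly as follows. This is a minimal sketch under stated assumptions: the data frame and variable names (dat, yi, vi, study, informant, mod) are hypothetical stand-ins, not taken from the original post.]

```r
## Minimal sketch of the multilevel setup; names are hypothetical stand-ins.
library(metafor)

## Null model: random effects at the study level and the
## informant-within-study level
Model1 <- rma.mv(yi, V = vi, random = ~ 1 | study/informant, data = dat)

## Same model with a single moderator added
Model2 <- rma.mv(yi, V = vi, mods = ~ mod,
                 random = ~ 1 | study/informant, data = dat)

## Pseudo-R^2: proportional reduction in the total variance
## (can come out negative when the variance components are
## imprecisely estimated)
(sum(Model1$sigma2) - sum(Model2$sigma2)) / sum(Model1$sigma2)
```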
Viechtbauer Wolfgang (SP)
2017-Feb-28 12:54 UTC
[R] Metafor multilevel metaregression: total variance increases when moderator added?
Very difficult to diagnose what is going on without actually seeing the data. But as I said on CV: Depending on the data, the variance components may not be estimated precisely, so negative values for those kinds of pseudo-R^2 statistics are quite possible. In fact, if a particular moderator is actually unrelated to the outcomes, then in roughly 50% of the cases, the pseudo-R^2 statistic will be negative.

See also:

Lopez-Lopez, J. A., Marin-Martinez, F., Sanchez-Meca, J., Van den Noortgate, W., & Viechtbauer, W. (2014). Estimation of the predictive power of the model in mixed-effects meta-regression: A simulation study. British Journal of Mathematical and Statistical Psychology, 67(1), 30-48.

We only examined the standard mixed-effects meta-regression model with a single moderator, but found that the pseudo-R^2 statistic can be all over the place unless k is quite large.

Now you seem to have a larger number of estimates (170), but these are nested in 'only' 26 studies. So, I suspect that the estimate-level variance component is estimated fairly precisely, but not the study-level variance component. You may want to examine the profile plots (with the profile() function) and/or get (profile-likelihood) CIs of the variance components (using the confint() function). Probably the CI for the study-level variance component is quite wide.

Best,
Wolfgang

--
Wolfgang Viechtbauer, Ph.D., Statistician | Department of Psychiatry and Neuropsychology | Maastricht University | P.O. Box 616 (VIJV1) | 6200 MD Maastricht, The Netherlands | +31 (43) 388-4170 | http://www.wvbauer.com
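[The diagnostics suggested above can be sketched as follows. Model1 is a hypothetical rma.mv() fit with random = ~ 1 | study/informant; the exact output depends on the data.]

```r
## Sketch of the suggested diagnostics; Model1 is a hypothetical
## rma.mv() fit with two variance components (study and
## informant-within-study levels).
library(metafor)

## Profile-likelihood plots for each variance component; a flat or
## irregular profile suggests the component is poorly identified
profile(Model1)

## Profile-likelihood CIs for the variance components; a very wide CI
## for the study-level sigma^2 would confirm imprecise estimation
confint(Model1)
```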
Michael Dewey
2017-Mar-01 09:17 UTC
[R] Metafor multilevel metaregression: total variance increases when moderator added?
Dear Laura

If you are unable for some reason to share the data, why not incorporate the output into an e-mail (and please turn off HTML as it mangles everything)? Putting the plots from profiling somewhere we can read them would be a useful addition.

This looks at first glance like one of those situations where sadly one has insufficient data for the models one would like to fit. We feel your pain.

--
Michael
http://www.dewey.myzen.co.uk/home.html