Hi,

I am working with a large dataset of neotropical birds and am trying to partition the variance in log(body mass) among different taxonomic levels. To explain what I mean, the taxonomic levels are species, genus, family, and order: species are nested within genera, genera within families, and families within orders. Sample data look like this:

     mass                   species        genus         family         order.2
 377.0000 Geranospiza caerulescens  Geranospiza   Accipitridae Accipitriformes
 213.1667       Harpagus bidentatus     Harpagus   Accipitridae Accipitriformes
 500.0000       Leptodon cayanensis     Leptodon   Accipitridae Accipitriformes
1750.0000       Penelope albipennis     Penelope       Cracidae     Galliformes
 278.0000 Leucopternis semiplumbeus Leucopternis   Accipitridae Accipitriformes
  66.2500     Notharchus pectoralis   Notharchus     Bucconidae   Galbuliformes
 213.1667       Harpagus bidentatus     Harpagus   Accipitridae Accipitriformes
  31.0000     Gymnopithys leucaspis  Gymnopithys Thamnophilidae   Passeriformes
  31.0000     Gymnopithys leucaspis  Gymnopithys Thamnophilidae   Passeriformes

I want to know how much variability in log(mass) there is at the species level, the genus level, the family level, and the order level. The code I have been using to do this is:

bm.full <- lmer(log(mass) ~ 1 + (1 | order.2/family/genus/species))

However, it crashes R every time, whether I run it on my laptop with 4 GB of RAM or on a desktop with 8 GB. If I remove any one of the taxonomic levels, it does run and produces reasonable output on either machine. There are 1943 observations (different studies occasionally give different masses for the same species), 791 species, 381 genera, 60 families, and 21 orders.

If I am able to modestly increase RAM, e.g. to 16 GB, is it likely that R will be able to handle the model, or is there such a dramatic increase in the computation required with four nested grouping levels that it simply won't be possible?

Thank you for your advice,
David
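
P.S. For completeness, here is a minimal, self-contained sketch of what I am running and how I intend to read off the variance components once the model fits. The data frame name "birds" is just a placeholder for my actual object, and I am using the lme4 defaults (REML):

library(lme4)

## the full four-level nested model
bm.full <- lmer(log(mass) ~ 1 + (1 | order.2/family/genus/species),
                data = birds)

## variance component for each taxonomic level, plus the residual
## (within-species, between-study) variance
vc <- as.data.frame(VarCorr(bm.full))
vc[, c("grp", "vcov")]

## each component as a proportion of the total variance
vc$vcov / sum(vc$vcov)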