I am interested in doing Reduced Major Axis (RMA) regression in R. Has anyone implemented this? I am aware of how to calculate the slope using RMA (and of previous threads on this list about RMA), but I would like to estimate break points in data where there are two (or more) distinct trends. This is straightforward with Ordinary Least Squares: iteratively try candidate break values of x, set up dummy variables, and find the x that maximizes R^2. It seems it would be more difficult with RMA. The critical thing, it seems to me, would be how to estimate R^2 for the more complex model. Once the break point is identified, it would also be nice to estimate the two (or more) trends.

Thanks for any help you can give.

-pat

Dr. Patrick D. Lorch
Zoology Dept., University of Toronto
Ramsay Wright Labs, 25 Harbord St.
Toronto, Ontario M5S 3G5 CANADA
W: 416-978-0172  F: 416-978-8532
plorch at zoo.utoronto.ca
http://www.zoo.utoronto.ca/lrowe/plorch
Public encryption key available upon request.

r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe" (in the "body", not the subject!)
To: r-help-request at stat.math.ethz.ch
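[Editor's sketch] The grid search described in the question carries over mechanically to RMA once a scoring rule for candidate breaks is chosen. The question is about R, but a minimal Python sketch of the idea follows: `rma_fit` computes the standard RMA slope, sign(r) * sd(y)/sd(x), and `piecewise_rma` tries each candidate break and fits a separate RMA line on each side. Note that the segments are scored here by total squared vertical residual — that choice is an assumption standing in for the open R^2 question above, not an established RMA goodness-of-fit measure; the function names are hypothetical.

```python
import math

def rma_fit(xs, ys):
    """Reduced Major Axis fit: slope = sign(cov(x, y)) * sd(y) / sd(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((v - mx) ** 2 for v in xs)
    syy = sum((v - my) ** 2 for v in ys)
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    slope = math.copysign(math.sqrt(syy / sxx), sxy)
    intercept = my - slope * mx
    return slope, intercept

def piecewise_rma(x, y, min_seg=3):
    """Grid search over candidate break points; each side of a candidate
    break gets its own RMA line.  Candidates are scored by total squared
    vertical residual (an assumed stand-in for the R^2 criterion used in
    the OLS version of this search)."""
    pts = sorted(zip(x, y))
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    best_loss, best_break, best_fits = None, None, None
    for i in range(min_seg, len(xs) - min_seg):
        loss, fits = 0.0, []
        for sx, sy in ((xs[:i], ys[:i]), (xs[i:], ys[i:])):
            b, a = rma_fit(sx, sy)
            loss += sum((v - (a + b * u)) ** 2 for u, v in zip(sx, sy))
            fits.append((b, a))
        if best_loss is None or loss < best_loss:
            best_loss, best_break, best_fits = loss, xs[i], fits
    return best_break, best_fits
```

On noise-free piecewise-linear data with slopes 1 and 3 joined at x = 5, the search recovers the break at 5 and the two segment slopes; translating this to R is a matter of replacing the helpers with `sd()`, `cor()`, and a loop or `sapply()` over candidate breaks.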