similar to: vcov on result of rlm() yields "-- please report!" (PR#7707)

Displaying 20 results from an estimated 1000 matches similar to: "vcov on result of rlm() yields "-- please report!" (PR#7707)"

2010 Sep 25
2
Uncertainty propagation
I have a small model running under R. It basically applies various power-law relations to a variable (in this case, water level in a river) that changes spatially and through time. I'd like to add some kind of error propagation to this. My first idea was to use a Monte Carlo routine and run the model many times while varying the power-law parameters. These power laws were
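A minimal Monte Carlo propagation sketch in R; the power-law form Q = a * h^b and the parameter uncertainties are placeholders for illustration, not values from the original post:

set.seed(1)
h     <- seq(0.5, 3, by = 0.1)          # observed water levels
nsim  <- 10000                          # number of Monte Carlo draws
a.sim <- rnorm(nsim, mean = 5,   sd = 0.5)   # assumed uncertainty in a
b.sim <- rnorm(nsim, mean = 1.6, sd = 0.1)   # assumed uncertainty in b

## one simulated output series per parameter draw
Q.sim <- sapply(seq_len(nsim), function(i) a.sim[i] * h^b.sim[i])

## summarise the spread at each water level
Q.mean <- rowMeans(Q.sim)
Q.lo   <- apply(Q.sim, 1, quantile, 0.025)
Q.hi   <- apply(Q.sim, 1, quantile, 0.975)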
2017 Sep 13
3
vcov and survival
Dear Terry, Even the behaviour of lm() and glm() isn't entirely consistent. In both cases, singularity results in NA coefficients by default, and these are reported in the model summary and coefficient vector, but not in the coefficient covariance matrix: ---------------- > mod.lm <- lm(Employed ~ GNP + Population + I(GNP + Population), + data=longley) >
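The behaviour described in the thread can be reproduced with the built-in longley data; a sketch showing that the aliased term is kept (as NA) in coef() but silently dropped from vcov():

mod.lm <- lm(Employed ~ GNP + Population + I(GNP + Population), data = longley)
coef(mod.lm)        # the aliased coefficient is reported as NA
vcov(mod.lm)        # the aliased coefficient is omitted entirely
dim(vcov(mod.lm))   # 3 x 3, not 4 x 4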
2017 Sep 14
6
vcov and survival
>>>>> Martin Maechler <maechler at stat.math.ethz.ch> >>>>> on Thu, 14 Sep 2017 10:13:02 +0200 writes: >>>>> Fox, John <jfox at mcmaster.ca> >>>>> on Wed, 13 Sep 2017 22:45:07 +0000 writes: >> Dear Terry, >> Even the behaviour of lm() and glm() isn't entirely consistent. In both cases,
2003 Oct 02
4
using a string as the formula in rlm
Hi, I am trying to build a series of rlm models. I have my data frame, and the models will be built using various columns of it. Thus a series of models would be m1 <- rlm(V1 ~ V2 + V3 + V4, data) m2 <- rlm(V1 ~ V2 + V5 + V7, data) m3 <- rlm(V1 ~ V2 + V8 + V9, data) I would like to automate this. Is it possible to use a string in place of the formula? I tried doing: fmla
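A common idiom, sketched here, is to paste the variable names together and convert the string with as.formula() before handing it to rlm(); the list of predictor sets is taken from the example models above:

library(MASS)

vars <- list(c("V2", "V3", "V4"),
             c("V2", "V5", "V7"),
             c("V2", "V8", "V9"))

models <- lapply(vars, function(v) {
  fmla <- as.formula(paste("V1 ~", paste(v, collapse = " + ")))
  rlm(fmla, data = data)    # "data" is the data frame name used in the post
})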
2004 Apr 07
4
Problems with rlm
Dear all, When calling rlm with the following data, I get an error. (R v.1.8.1, WinXP Pro 2002 with service pack 1.) > d <- na.omit(data.frame(CPRATIO, HEIGHTZ, FAMILYID)) > c <- tapply(d$CPRATIO, d$FAMILYID, mean) > h <- tapply(d$HEIGHTZ, d$FAMILYID, mean) > c 1 2 3 6 7 9 10 11 6.000000 2.500000 3.250000
2004 Oct 11
3
split and rlm
Hello, I'm trying to do a little rlm of some data that looks like this: UNIT COHORT perdo adjodds 1010 96 0.39890 1.06894 1010 97 0.48113 1.57500 1010 98 0.36328 1.21498 1010 99 0.44391 1.38608 It works fine like this: rlm(perdo ~ COHORT, psi=psisquare) But the problem is that I have about 100 UNITs, and I want to do a
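One way to fit the same robust model within each UNIT is to split the data frame and fit per group; a sketch assuming the data live in a data frame dat and that psisquare is the psi function referred to in the post:

library(MASS)

## one rlm fit per UNIT
fits <- lapply(split(dat, dat$UNIT), function(d)
  rlm(perdo ~ COHORT, data = d, psi = psisquare))

## collect the slopes into one vector, named by UNIT
slopes <- sapply(fits, function(f) coef(f)["COHORT"])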
2007 Nov 29
1
relative importance of predictors
Hi group, I want to compare the relative importance of the predictors in a multiple linear regression y ~ a + b*x1 + c*x2 ... However, bptest indicates heteroskedasticity in my model. I therefore perform a robust regression (rlm), in combination with bootstrapping (as outlined in J. Fox, Bootstrapping Regression Models). Now I want to compare the relative importance of my predictors. Can I rely on the
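A case-resampling bootstrap of rlm coefficients, along the lines of Fox's bootstrapping notes, might look like this sketch; the data frame mydata and the predictors x1, x2 are placeholders:

library(MASS)
library(boot)

rlm.coefs <- function(dat, idx)
  coef(rlm(y ~ x1 + x2, data = dat[idx, ], maxit = 100))

## resample whole cases (rows) with replacement
b <- boot(mydata, rlm.coefs, R = 999)
boot.ci(b, type = "perc", index = 2)   # percentile CI for the coefficient of x1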
2009 Dec 03
2
Avoiding singular fits in rlm
I keep coming back to this problem of singular fits in rlm (MASS library), but cannot figure out a good solution. I am fitting a linear model with a factor variable, like lm(Y ~ factorVar), and this works fine. lm constructs the contrast matrix the way I would expect, treating the first factor level as the baseline. But when I try rlm(Y ~ factorVar) I get the message "'x'
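Before calling rlm() it can help to confirm that the design matrix is actually full rank; a quick check, sketched here under the assumption that factorVar is a column of a data frame dat:

X <- model.matrix(~ factorVar, data = dat)   # the same contrasts lm() would build
qr(X)$rank == ncol(X)                        # FALSE indicates a rank-deficient design
table(dat$factorVar)                         # empty factor levels are a common culprit
dat$factorVar <- droplevels(dat$factorVar)   # dropping unused levels often resolves it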
2011 Mar 14
1
discrepancy between lm and MASS:rlm
Dear R-devel, There seems to be a discrepancy in the order in which lm and rlm evaluate their arguments. This sometimes causes rlm to produce an error where lm is just fine. Here is a little script that illustrates the issue: > library(MASS) > ## create data > n <- 100 > dat <- data.frame(x=rep(c(-1,0,1), n), y=rnorm(3*n)) > > ## call lm, works fine > summary(lm(y ~
2005 Mar 24
1
Robust multivariate regression with rlm
Dear group, I am having trouble using rlm on multivariate data sets. When I call rlm I get: Error in lm.wfit(x, y, w, method = "qr") : incompatible dimensions. lm on the same data sets seems to work well (see code example). Am I doing something wrong? I have already browsed through the forums and Google but could not find any related discussions. I use Windows XP and R
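Unlike lm(), rlm() does not accept a multi-column response, so one workaround is to fit each response column separately; a sketch, assuming Y is a numeric matrix of responses and X a data frame of predictors:

library(MASS)

## lm() accepts cbind(y1, y2, ...) on the left-hand side; rlm() does not,
## so loop over the response columns instead
fits <- lapply(seq_len(ncol(Y)), function(j)
  rlm(Y[, j] ~ ., data = X))

## matrix of robust coefficients, one column per response
coefs <- sapply(fits, coef)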
2010 Nov 08
1
Add values of rlm coefficients to xyplot
Hello, I have a simple xyplot with rlm lines. I would like to add the a and b coefficients (y = ax + b) of the rlm fit to each panel. I know I can do it 'outside' the xyplot command, but I would like to do it all at once. I found some posts with the same question, but no answer. Is it impossible? Thanks in advance for your help. Ptit Bleu. x11(15,12) xyplot(df1$col2 ~
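One way to do everything inside xyplot() is a custom panel function that fits rlm() per panel, draws the line, and writes the coefficients; a sketch where df1 and col2 come from the post but col1 and the conditioning variable group are placeholders:

library(lattice)
library(MASS)

xyplot(col2 ~ col1 | group, data = df1,
       panel = function(x, y, ...) {
         panel.xyplot(x, y, ...)
         fit <- rlm(y ~ x)
         panel.abline(a = coef(fit)[1], b = coef(fit)[2])   # robust fit line
         panel.text(min(x), max(y),                         # coefficient annotation
                    labels = sprintf("a = %.2f, b = %.2f",
                                     coef(fit)[2], coef(fit)[1]),
                    pos = 4, cex = 0.8)
       })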
2008 May 14
1
rlm and lmrob error messages
Hello all, I'm using R 2.7.0 (on Windows 2000) and I'm trying to run a robust regression with the following model structure: model = "Y ~ x1*x2 / (x3 + x4 + x5 + x6)" where x1 and x2 are both factors (either 1 or 0) and x3, ..., x6 are numeric. The error I get when running rlm(as.formula(model), data=daymean) is: error in rlm.default(x, y, weights, method = method, wt.method =
2007 Jun 07
3
rlm results on trellis plot
How do I add to a trellis plot the best fit line from a robust fit? I can use panel.lm to add a least squares fit, but there is no panel.rlm function. -- Alan S Barnett <asb at mail.nih.gov> NIMH/CBDB
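There is indeed no panel.rlm in lattice, but an analogue of panel.lmline can be written in a few lines; a sketch (mydata, y, x and the conditioning variable g are placeholders):

library(lattice)
library(MASS)

## a small analogue of panel.lmline that adds a robust-fit line
panel.rlmline <- function(x, y, ...) {
  fit <- rlm(y ~ x)
  panel.abline(a = coef(fit)[1], b = coef(fit)[2], ...)
}

xyplot(y ~ x | g, data = mydata,
       panel = function(x, y, ...) {
         panel.xyplot(x, y, ...)
         panel.rlmline(x, y, col = "red")
       })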
2012 Jul 06
1
How to do goodness-of-fit diagnosis and model checking for rlm in R?
Hi all, I am reading the MASS book, but it doesn't give examples of diagnostics and model checking for rlm... My data are highly non-Gaussian, so I am using rlm instead of lm. My questions are: 0. Are goodness-of-fit and model checking with rlm exactly the same as for ordinary regression? 1. Please give me some pointers on how to do goodness-of-fit and residual diagnostics for
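Many of the usual residual diagnostics carry over to rlm fits, and the robustness weights give an extra check; a sketch with a placeholder model and data frame:

library(MASS)

fit <- rlm(y ~ x1 + x2, data = mydata)

## the classical residual plots still apply
plot(fitted(fit), resid(fit)); abline(h = 0, lty = 2)
qqnorm(resid(fit)); qqline(resid(fit))

## rlm-specific: cases the fit heavily down-weighted deserve a closer look
head(sort(fit$w), 10)   # smallest IRLS weights = most down-weighted observations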
2012 Jul 18
1
How does "rlm" in R decide its "w" weights for each IRLS iteration?
Hi all, I am also confused by the manual: a. The input arguments: wt.method: are the weights case weights (giving the relative importance of each case, so a weight of 2 means there are two of these) or the inverse of the variances (so a weight of 2 means this error is half as variable)? w: (optional) initial down-weighting for each case. init: (optional) initial values for the
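As for the IRLS weights themselves: at convergence the weight of each case is essentially psi(r_i/s)/(r_i/s), evaluated with the chosen psi function and the robust scale s. A sketch that checks this against a fitted object; close (not exact) agreement under the default settings is my assumption, not a documented guarantee:

library(MASS)

fit <- rlm(stack.loss ~ ., data = stackloss)   # defaults: psi.huber, MAD scale

## psi.huber(u) with deriv = 0 returns the weight psi(u)/u
u <- residuals(fit) / fit$s
cbind(rlm.w = fit$w, recomputed = psi.huber(u))[1:6, ]   # should agree closely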
2006 Apr 06
5
pros and cons of "robust regression"? (i.e. rlm vs lm)
Can anyone comment on, or point me to a discussion of, the pros and cons of robust regression versus a more "manual" approach of trimming outliers and/or "normalizing" the data used in regression analysis?
2004 Jun 11
1
comparing regression slopes
Dear list, I used rlm to fit two regression models to two data sets (rlm because of two outlying values in one of the data sets). Now I want to compare the two regression slopes. I came across some R code by Spencer Graves in reply to a similar problem: http://www.mail-archive.com/r-help at stat.math.ethz.ch/msg06666.html The code was: > df1 <- data.frame(x=1:10, y=1:10+rnorm(10))
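An alternative to comparing two separate fits is to stack the data sets and let an interaction term carry the slope difference; a sketch, where df1 comes from the code above but df2 and the grouping variable grp are assumed:

library(MASS)

both <- rbind(cbind(df1, grp = "A"), cbind(df2, grp = "B"))
fit  <- rlm(y ~ x * grp, data = both)

## the x:grpB row is the difference between the two slopes;
## an approximate z test from its estimate and standard error
est <- summary(fit)$coefficients
z   <- est["x:grpB", "Value"] / est["x:grpB", "Std. Error"]
2 * pnorm(-abs(z))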
2008 Dec 08
1
residual standard error in rlm (MASS package)
Hi, I would appreciate it if someone could explain how the residual standard error is computed for rlm models (MASS package). Usually, one would expect to get the residual standard error from > sqrt(sum((y-fitted(fm))^2)/(n-2)) where y is the response, fm a linear model with an intercept and a slope for x, and n the number of observations. This does not seem to work for rlm models, and I am wondering
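For an rlm fit the value printed as "Residual standard error" is not sqrt(RSS/(n-2)) but the robust scale estimate (by default derived from the MAD of the residuals), stored as $s in the fitted object; a sketch with the stackloss data:

library(MASS)

fm <- rlm(stack.loss ~ Air.Flow, data = stackloss)
n  <- nrow(stackloss)

fm$s                                    # robust scale; what summary(fm) reports
sqrt(sum(resid(fm)^2) / (n - 2))        # classical formula gives a different number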
2005 Dec 22
1
Huber location estimate
We have a choice when calculating the Huber location estimate: > set.seed(221205) > y <- 7 + 3*rt(30,1) > library(MASS) > huber(y)$mu [1] 5.9117 > coefficients(rlm(y~1)) (Intercept) 5.9204 I was surprised to get two different results. The function huber() works directly with the definition whereas rlm() uses iteratively reweighted least squares. My surprise is
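Part of the difference comes from the tuning constant: huber() defaults to k = 1.5, while psi.huber() used by rlm() defaults to k = 1.345; the two also handle scale differently (huber() fixes it at the MAD of y, rlm() re-estimates it at each iteration). A sketch that aligns the tuning constants, leaving the remaining gap attributable to the scale handling:

library(MASS)
set.seed(221205)
y <- 7 + 3*rt(30,1)

huber(y, k = 1.5)$mu                     # MAD scale held fixed
coef(rlm(y ~ 1,
         psi = function(u, deriv = 0) psi.huber(u, k = 1.5, deriv = deriv)))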
2010 Dec 13
1
Wrong contrast matrix for nested factors in lm(), rlm(), and lmRob()
This message also reports wrong estimates produced by lmRob.fit.compute() for nested factors when using the correct contrast matrix. And in these respects, I have found that S-Plus behaves the same way as R. Using the three available contrast types (sum, treatment, helmert) with lm() or lm.fit(), but just contr.sum with rlm() and lmRob(), and small examples, I generated contrast matrices for
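When investigating behaviour like this, the contrast matrix a fit actually used can be inspected directly; a sketch with a small made-up nested design (the data and contrast choices are illustrative, not the reporter's example):

library(MASS)
set.seed(1)

## hypothetical nested design: levels of b nested within a
d <- data.frame(a = gl(2, 6), b = gl(3, 2, 12), y = rnorm(12))

fit.lm  <- lm(y ~ a/b, data = d, contrasts = list(a = "contr.sum", b = "contr.sum"))
fit.rlm <- rlm(y ~ a/b, data = d, contrasts = list(a = "contr.sum", b = "contr.sum"))

## the design matrices the two fits actually used
head(model.matrix(fit.lm))
head(model.matrix(fit.rlm))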