similar to: How does "rlm" in R decide its "w" weights for each IRLS iteration?

Displaying 20 results from an estimated 4000 matches similar to: "How does "rlm" in R decide its "w" weights for each IRLS iteration?"
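Since the query itself asks how rlm chooses its weights, here is a minimal sketch (toy data; rlm's default Huber psi and MAD-based scale assumed) of how the stored IRLS weights relate to the psi function:

library(MASS)

set.seed(1)
x <- rnorm(50)
y <- 1 + 2 * x + rt(50, df = 2)   # heavy-tailed noise
fit <- rlm(y ~ x)                 # default: Huber psi, MAD-based scale estimate

## rlm keeps the final IRLS weights in fit$w; up to the convergence tolerance
## they equal psi(u)/u at the scaled residuals u = r/s, which is what
## psi.huber() returns with its default deriv = 0
u <- resid(fit) / fit$s
head(cbind(irls_weight = fit$w, psi_over_u = psi.huber(u)))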

2005 Mar 24
1
Robust multivariate regression with rlm
Dear group, I am having trouble using rlm on multivariate data sets. When I call rlm I get: Error in lm.wfit(x, y, w, method = "qr") : incompatible dimensions. lm on the same data sets seems to work fine (see code example). Am I doing something wrong? I have already browsed the forums and Google but could not find any related discussions. I use Windows XP and R
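rlm() does not accept a matrix response the way lm() does, which appears to be what triggers the lm.wfit dimension error. A minimal sketch of the usual workaround, fitting one response column at a time (data frame and column names invented):

library(MASS)

set.seed(1)
dat <- data.frame(x1 = rnorm(20), x2 = rnorm(20),
                  y1 = rnorm(20), y2 = rnorm(20))

## lm(cbind(y1, y2) ~ x1 + x2, dat) works, but rlm() needs one response at a
## time, so loop over the response columns and collect the fits
responses <- c("y1", "y2")
fits <- lapply(responses, function(r)
  rlm(reformulate(c("x1", "x2"), response = r), data = dat))
names(fits) <- responses
sapply(fits, coef)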
2012 Jun 21
2
MGCV: Use of irls.reg option
Hi, in the help files of the mgcv package, for the gam.control() function, there is an option irls.reg. The help files describe this option as: For most models this should be 0. The iteratively re-weighted least squares method by which GAMs are fitted can fail to converge in some circumstances. For example, data with many zeroes can cause problems in a model with a log link, because a mean of
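For context, irls.reg adds a small ridge-type penalty to the IRLS working model to stabilise otherwise non-convergent fits. A minimal sketch of how the option is passed (toy Poisson data with many zero counts; the value 0.01 is arbitrary):

library(mgcv)

set.seed(1)
n <- 200
x <- runif(n)
y <- rpois(n, lambda = exp(-3 + 2 * x))   # log link, lots of zero counts

## default fit versus a fit with a small IRLS regularisation term added
m0 <- gam(y ~ s(x), family = poisson)
m1 <- gam(y ~ s(x), family = poisson,
          control = gam.control(irls.reg = 0.01))
c(default = coef(m0)[1], regularised = coef(m1)[1])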
2009 Dec 03
2
Avoiding singular fits in rlm
I keep coming back to this problem of singular fits in rlm (MASS package) but cannot figure out a good solution. I am fitting a linear model with a factor variable, like lm(Y ~ factorVar), and this works fine. lm knows to construct the contrast matrix the way I would expect, which uses the first factor level as the baseline. But when I try rlm(Y ~ factorVar) I get the message "'x'
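rlm() stops on rank-deficient designs instead of aliasing columns the way lm() does. A sketch of one possible workaround, not taken from this thread: build the model matrix yourself, keep only the columns a pivoted QR marks as independent, and use rlm's matrix interface (toy data):

library(MASS)

set.seed(1)
factorVar <- factor(rep(c("a", "b", "c"), each = 10))
Y <- rnorm(30) + as.numeric(factorVar)

X <- model.matrix(~ factorVar)       # full rank here
X <- cbind(X, dup = X[, 2])          # add an aliased column to force singularity

## keep only the columns a pivoted QR identifies as independent, then fit
qrX <- qr(X)
Xok <- X[, qrX$pivot[seq_len(qrX$rank)], drop = FALSE]
fit <- rlm(Xok, Y)                   # rlm(x, y) matrix interface
coef(fit)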
2004 Apr 07
4
Problems with rlm
Dear all, When calling rlm with the following data, I get an error. (R v.1.8.1, WinXP Pro 2002 with service pack 1.)
> d <- na.omit(data.frame(CPRATIO, HEIGHTZ, FAMILYID))
> c <- tapply(d$CPRATIO, d$FAMILYID, mean)
> h <- tapply(d$HEIGHTZ, d$FAMILYID, mean)
> c
       1        2        3        6        7        9       10       11
6.000000 2.500000 3.250000
2011 Mar 14
1
discrepancy between lm and MASS::rlm
Dear R-devel, there seems to be a discrepancy in the order in which lm and rlm evaluate their arguments. This causes rlm to sometimes produce an error where lm is just fine. Here is a little script that illustrates the issue:
> library(MASS)
> ## create data
> n <- 100
> dat <- data.frame(x=rep(c(-1,0,1), n), y=rnorm(3*n))
>
> ## call lm, works fine
> summary(lm(y ~
2010 Nov 08
1
Add values of rlm coefficients to xyplot
Hello, I have a simple xyplot with rlm lines. I would like to add the a and b coefficients (y = ax + b) of the rlm fit in each panel. I know I can do it 'outside' the xyplot command, but I would like to do it all at the same time. I found some posts with the same question, but no answer. Is it impossible? Thanks in advance for your help. Ptit Bleu.
x11(15,12)
xyplot(df1$col2 ~
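One way to do everything inside the xyplot() call is a custom panel function that fits rlm per panel, draws the line, and writes the coefficients with panel.text(). A sketch with invented column names (the post's df1 is not shown in full):

library(lattice)
library(MASS)

set.seed(1)
df1 <- data.frame(col1 = rnorm(90),
                  col3 = factor(rep(letters[1:3], each = 30)))
df1$col2 <- 1.5 * df1$col1 + rnorm(90)

xyplot(col2 ~ col1 | col3, data = df1,
       panel = function(x, y, ...) {
         panel.xyplot(x, y, ...)
         fit <- rlm(y ~ x)
         cf  <- coef(fit)                 # cf[1] = b (intercept), cf[2] = a (slope)
         panel.abline(a = cf[1], b = cf[2])
         panel.text(min(x), max(y),
                    sprintf("y = %.2f x + %.2f", cf[2], cf[1]),
                    adj = c(0, 1), cex = 0.8)
       })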
2003 Oct 02
4
using a string as the formula in rlm
Hi, I am trying to build a series of rlm models. I have my data frame, and the models will be built using various columns of the data frame. Thus a series of models would be:
m1 <- rlm(V1 ~ V2 + V3 + V4, data)
m2 <- rlm(V1 ~ V2 + V5 + V7, data)
m3 <- rlm(V1 ~ V2 + V8 + V9, data)
I would like to automate this. Is it possible to use a string in place of the formula? I tried doing: fmla
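A sketch of the usual approach: paste the string together and convert it with as.formula(), or build the formula directly with reformulate() (data frame and columns invented):

library(MASS)

set.seed(1)
dat <- as.data.frame(matrix(rnorm(90 * 9), ncol = 9,
                            dimnames = list(NULL, paste0("V", 1:9))))

rhs  <- c("V2", "V3", "V4")                        # varies from model to model
fmla <- as.formula(paste("V1 ~", paste(rhs, collapse = " + ")))
m1   <- rlm(fmla, data = dat)
coef(m1)

## reformulate() builds the same formula without string pasting
m1b <- rlm(reformulate(rhs, response = "V1"), data = dat)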
2012 Jul 06
1
How to do goodness-of-fit diagnosis and model checking for rlm in R?
Hi all, I am reading the MASS book, but it doesn't give examples of diagnostics and model checking for rlm... My data are highly non-Gaussian, so I am using rlm instead of lm. My questions are: 0. Are goodness-of-fit and model checking using rlm exactly the same as for ordinary regression? 1. Please give me some pointers about how to do goodness-of-fit and residual diagnostics for
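As a starting point, not a complete answer: residual and QQ plots can be used much as for lm, and the final robustness weights show which observations were downweighted. A sketch on simulated data:

library(MASS)

set.seed(1)
x <- rnorm(100)
y <- 1 + 2 * x + rt(100, df = 2)      # heavy-tailed errors
fit <- rlm(y ~ x)

op <- par(mfrow = c(1, 2))
plot(fitted(fit), resid(fit), xlab = "Fitted", ylab = "Residual")
abline(h = 0, lty = 2)
qqnorm(resid(fit)); qqline(resid(fit))
par(op)

## observations with small IRLS weights were treated as outliers
head(sort(fit$w))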
2008 May 14
1
rlm and lmrob error messages
Hello all, I'm using R 2.7.0 (on Windows 2000) and I'm trying to run a robust regression on the following model structure: model = "Y ~ x1*x2 / (x3 + x4 + x5 + x6)" where x1 and x2 are both factors (either 1 or 0) and x3...x6 are numeric. The error I get when running rlm(as.formula(model), data=daymean) is: error in rlm.default(x, y, weights, method = method, wt.method =
2008 Dec 08
1
residual standard error in rlm (MASS package)
Hi, I would appreciate it if someone could explain how the residual standard error is computed for rlm models (MASS package). Usually, one would expect to get the residual standard error via
> sqrt(sum((y-fitted(fm))^2)/(n-2))
where y is the response, fm a linear model with an intercept and slope for x, and n the number of observations. This does not seem to work for rlm models and I am wondering
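For what it's worth, the value printed by summary.rlm() as the residual standard error is the robust scale estimate stored in the fit (component s, MAD-based by default), not sqrt(RSS/(n-2)). A sketch on simulated data, not the poster's:

library(MASS)

set.seed(1)
n <- 100
x <- rnorm(n)
y <- 1 + 2 * x + rnorm(n)
fm <- rlm(y ~ x)

sqrt(sum((y - fitted(fm))^2) / (n - 2))   # classical residual SE (poster's formula)
fm$s                                      # robust scale estimate; this is the value
                                          # the rlm summary reports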
2004 Oct 11
3
split and rlm
Hello, I'm trying to do a little rlm of some data that looks like this:
UNIT  COHORT  perdo    adjodds
1010  96      0.39890  1.06894
1010  97      0.48113  1.57500
1010  98      0.36328  1.21498
1010  99      0.44391  1.38608
It works fine like this:
rlm(perdo ~ COHORT, psi=psisquare)
But the problem is that I have about 100 UNITs, and I want to do a
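A sketch of one way to run a separate fit per UNIT, using split() and lapply(); the data values are invented and the default psi is used in place of the poster's psisquare:

library(MASS)

set.seed(1)
dat <- data.frame(UNIT    = rep(c(1010, 1020, 1030), each = 10),
                  COHORT  = rep(90:99, times = 3),
                  perdo   = runif(30, 0.3, 0.5),
                  adjodds = runif(30, 1.0, 1.6))

## one robust fit per UNIT: split the data frame, then fit each piece
fits <- lapply(split(dat, dat$UNIT),
               function(d) rlm(perdo ~ COHORT, data = d))
sapply(fits, coef)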
2008 Jan 19
1
How do we get two-tailed p-values for rlm?
How do we get 2-tailed p-values for the rlm summary? I'm using the following:
> fit <- rlm(oatRT ~ oatoacData$erp, psi=psi.bisquare, maxit=100, na.action='na.omit')
> fitsum <- summary(fit, cor=F)
> print(fitsum)
Call: rlm(formula = oatRT ~ oatoacData$erp, psi = psi.bisquare, maxit = 100, na.action = "na.omit")
Residuals:
     Min       1Q   Median
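summary.rlm() reports t values but no p-values. A commonly used approximation, shown here on invented data (not the poster's ERP data), compares the t values with a t distribution on the residual degrees of freedom; treat the result as approximate:

library(MASS)

set.seed(1)
erp   <- rnorm(80)
oatRT <- 400 + 30 * erp + 20 * rt(80, df = 3)
fit    <- rlm(oatRT ~ erp, psi = psi.bisquare, maxit = 100)
fitsum <- summary(fit, cor = FALSE)

tab   <- coef(fitsum)                      # columns: Value, Std. Error, t value
dfres <- length(oatRT) - nrow(tab)
pvals <- 2 * pt(-abs(tab[, "t value"]), df = dfres)
cbind(tab, "Pr(>|t|)" = pvals)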
2010 Dec 13
1
Wrong contrast matrix for nested factors in lm(), rlm(), and lmRob()
This message also reports wrong estimates produced by lmRob.fit.compute() for nested factors when using the correct contrast matrix. And in these respects, I have found that S-Plus behaves the same way as R. Using the three available contrast types (sum, treatment, helmert) with lm() or lm.fit(), but just contr.sum with rlm() and lmRob(), and small examples, I generated contrast matrices for
2007 Jun 07
3
rlm results on trellis plot
How do I add to a trellis plot the best fit line from a robust fit? I can use panel.lm to add a least squares fit, but there is no panel.rlm function. -- Alan S Barnett <asb at mail.nih.gov> NIMH/CBDB
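lattice ships panel.lmline() but nothing robust; a short custom panel function fills the gap. A sketch (the helper name panel.rlmline is made up, not part of lattice):

library(lattice)
library(MASS)

## a drop-in analogue of panel.lmline() that draws a robust fit instead
panel.rlmline <- function(x, y, ...) {
  cf <- coef(rlm(y ~ x))
  panel.abline(a = cf[1], b = cf[2], ...)
}

set.seed(1)
d <- data.frame(x = rnorm(90), g = rep(c("A", "B", "C"), each = 30))
d$y <- 1 + 2 * d$x + rt(90, df = 2)

xyplot(y ~ x | g, data = d,
       panel = function(x, y, ...) {
         panel.xyplot(x, y, ...)
         panel.rlmline(x, y, col = "red")
       })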
2005 Mar 27
1
p values when using rlm
R 2.0.1, Linux. I am using rlm() to fit a model, e.g. fit1 <- rlm(y~x). My model is more complex than the one shown. When I enter summary(fit1) I get estimates for the model's coefficients along with their SEs and t values, but no p values; the p value column is blank. Similarly, when I enter anova(fit1) I get DF, Sum Sq, and Mean Sq, but the columns for F value and Pr(>F) are blank. Any
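Same underlying issue as the p-value thread above: the columns are left blank deliberately, because the finite-sample distributions are not exact. A sketch of two commonly used options; the f.robftest() line assumes the sfsmisc package is installed and is left commented out:

library(MASS)

set.seed(1)
x <- rnorm(60)
y <- 2 + 0.4 * x + rt(60, df = 3)
fit1 <- rlm(y ~ x)

## option 1: normal approximation based on the reported t values
tvals <- coef(summary(fit1))[, "t value"]
2 * pnorm(-abs(tvals))

## option 2: robust Wald/F test for a single term (requires sfsmisc)
# sfsmisc::f.robftest(fit1, var = "x")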
2005 Feb 25
1
vcov on result of rlm() yields "-- please report!" (PR#7707)
Dear r-bugs, I looked over the FAQ. Hope I'm reporting this correctly. I ran this on both solaris and windows. I've provided terminal snapshots which include how R was called from the command line, and the result of version at the R prompt. I have attached the .r file, and the data file and the output snapshots. Below also find everything except only a few lines of the data file. Note
2005 Nov 13
4
Robust Non-linear Regression
Hi, I'm trying to use robust non-linear regression to fit dose-response curves. Maybe I didn't look hard enough, but I didn't find robust methods for non-linear regression implemented in R. A method that looked good to me but is unfortunately not (yet) implemented in R is described in http://www.graphpad.com/articles/RobustNonlinearRegression_files/frame.htm
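One option that has appeared since this 2005 post is nlrob() in the robustbase package, which does M-estimation for nonlinear models. A sketch with an invented two-parameter dose-response curve (the parameter names ec50 and slope are made up):

library(robustbase)   # provides nlrob()

set.seed(1)
dose <- rep(10^seq(-2, 2, length.out = 8), each = 3)
resp <- 1 / (1 + (dose / 1.5)^1.2) + rnorm(length(dose), sd = 0.05)
dr <- data.frame(dose, resp)

## robust fit of a logistic-style dose-response model
fit <- nlrob(resp ~ 1 / (1 + (dose / ec50)^slope),
             data = dr, start = list(ec50 = 1, slope = 1))
summary(fit)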
2010 Aug 17
0
Singular error in rlm
I am absolutely new to R and am aware of only a few basic commands. I was running a robust regression in R, using the following command lines:
library(MASS)
rfmodel2 <- rlm(TotalEmployment_2005 ~ ALABAMA + MISSISSIPPI + LOUISIANA + TotalEmployment_2000 + PCWhitePop_2005 + UnemploymentRate_2005 + PCUrbanPop2000 + PCPeopleWithACollegeDegree_2000 +
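A frequent cause of this stop is a set of dummy variables that, together with the intercept, are exactly collinear; lm() silently drops one, rlm() refuses. A sketch (invented data, only a few of the post's variables kept) of using alias() on an lm() fit to find the offending column before calling rlm():

library(MASS)

set.seed(1)
n <- 60
d <- data.frame(ALABAMA     = rep(c(1, 0, 0), each = n / 3),
                MISSISSIPPI = rep(c(0, 1, 0), each = n / 3),
                LOUISIANA   = rep(c(0, 0, 1), each = n / 3),
                TotalEmployment_2000 = rnorm(n))
d$TotalEmployment_2005 <- 1 + 0.8 * d$TotalEmployment_2000 + rnorm(n)

## the three state dummies sum to 1, so with the intercept the design is singular
alias(lm(TotalEmployment_2005 ~ ., data = d))   # shows which term is aliased

## dropping one dummy (LOUISIANA becomes the baseline) lets rlm() run
rfmodel2 <- rlm(TotalEmployment_2005 ~ ALABAMA + MISSISSIPPI +
                  TotalEmployment_2000, data = d)
coef(rfmodel2)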
2012 Nov 22
1
help in M-estimator by R
Hi all, how are you? I have to do something in robust regression with R, and I have some problems, as follows. First: suppose w(r) = 1/(1 + r^2) and r <- c(7.01, 2.07, 7.061, 5.607, 8.502, 54.909, 12.222), and I want to exclude the values of r with abs(r) > 4.9... Afterwards, I want to use w to get the coefficients beta0 and beta1 (B1 <-
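A sketch of what one such reweighting step could look like; the weight function and the 4.9 cutoff follow the post, while x and y are invented (the post only gives the residual vector r):

set.seed(1)
x <- 1:20
y <- 2 + 0.5 * x + rnorm(20)
y[c(5, 12)] <- y[c(5, 12)] + 30        # two gross outliers

fit0 <- lm(y ~ x)                      # initial ordinary least squares fit
r    <- resid(fit0)

w <- 1 / (1 + r^2)                     # w(r) = 1/(1 + r^2)
w[abs(r) > 4.9] <- 0                   # exclude observations with |r| > 4.9

fit1 <- lm(y ~ x, weights = w)         # weighted refit
coef(fit1)                             # beta0 (B0) and beta1 (B1)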
2005 Dec 22
1
Huber location estimate
We have a choice when calculating the Huber location estimate:
> set.seed(221205)
> y <- 7 + 3*rt(30,1)
> library(MASS)
> huber(y)$mu
[1] 5.9117
> coefficients(rlm(y~1))
(Intercept)
     5.9204
I was surprised to get two different results. The function huber() works directly with the definition whereas rlm() uses iteratively reweighted least squares. My surprise is
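Part of the difference comes from the defaults: huber() uses k = 1.5 with the scale held fixed at mad(y), while rlm() uses Huber's psi with k = 1.345 and re-estimates the scale during the iterations. A sketch that matches the tuning constant; the gap shrinks but need not vanish, since the scale treatment still differs:

library(MASS)

set.seed(221205)
y <- 7 + 3 * rt(30, 1)

huber(y)$mu                              # k = 1.5, scale fixed at mad(y)
coefficients(rlm(y ~ 1))                 # Huber psi with k = 1.345, scale re-estimated

## same tuning constant as huber(); extra arguments in ... are passed to the psi function
coefficients(rlm(y ~ 1, psi = psi.huber, k = 1.5))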