Displaying 20 results from an estimated 5000 matches similar to: "pros and cons of "robust regression"? (i.e. rlm vs lm)"
2003 Feb 24
3
bwplot stats question
Hi List,
Just wondering where the documentation exists for the statistics which
make up the bwplot.
I'm guessing that, if R is like similar products, the graph is constructed as
follows. The median is the filled circle. The box surrounding the filled circle
depicts the first and third quartiles (the 25th and 75th percentiles). The range
of values is given by the dotted lines ("whiskers") outside of each box, and possible
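For reference, a minimal sketch (assuming the lattice default, in which panel.bwplot takes its five-number summary from boxplot.stats()):
library(lattice)
set.seed(1)                 # hypothetical example data
x <- rnorm(100)
boxplot.stats(x)$stats      # lower whisker end, lower hinge, median, upper hinge, upper whisker end
bwplot(~ x)                 # dot = median, box = hinges, whiskers = most extreme points
                            # within 1.5 * IQR of the box; points beyond are drawn separately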
2003 Jul 30
2
robust regression
Hi,
trying to do a robust regression of a two-way linear model, I keep
getting the following error:
> lqs(obs ~ y + s - 1, method = "lms", contrasts = list(s = "contr.sum"))
Error: lqs failed: all the samples were singular
Robust regression with M-estimators works (also regular least square
fits, of course):
rlm.formula(formula = obs ~ y + s - 1, method = "M",
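A hedged sketch of the two calls with simulated stand-ins for obs, y and s (the poster's data are not shown). lqs() fits LMS/LTS by resampling small row subsets; with several factor dummy columns many subsets give a singular model matrix, which is what the error reports, while rlm()'s M-estimation works on the full data:
library(MASS)
set.seed(42)                                        # hypothetical stand-in for the poster's obs, y, s
s   <- factor(rep(letters[1:4], each = 25))
y   <- rnorm(100)
obs <- 2 * y + as.numeric(s) + rnorm(100, sd = 0.3)
fit_lms <- lqs(obs ~ y + s - 1, method = "lms",
               contrasts = list(s = "contr.sum"))   # resampling-based; sparse factors can make
                                                    # every sampled subset singular
fit_m   <- rlm(obs ~ y + s - 1, method = "M")       # IRLS M-estimation, no resampling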
2008 Dec 08
1
residual standard error in rlm (MASS package)
Hi,
I would appreciate it if someone could explain how the residual standard
error is computed for rlm models (MASS package). Usually, one would
expect to get the residual standard error by
> sqrt(sum((y-fitted(fm))^2)/(n-2))
where y is the response, fm a linear model with an intercept and slope
for x and n the number of observations. This does not seem to work for
rlm models and I am wondering
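A hedged sketch of where the difference comes from: rlm reports a robust scale estimate (component fm$s, by default based on the MAD of the residuals), not sqrt(RSS/(n - 2)):
library(MASS)
set.seed(1)                                # hypothetical data
x <- 1:50
y <- 2 + 0.5 * x + rnorm(50)
fm <- rlm(y ~ x)
sqrt(sum((y - fitted(fm))^2) / (50 - 2))   # classical formula: does not match summary(fm)
fm$s                                       # robust scale estimate that summary(fm) reports
                                           # as the "residual standard error"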
2005 Aug 23
1
Robust M-Estimator Comparison
Hello,
I'm learning about robust M-estimators right now and had settled on the
"Huber Proposal 2" as implemented in MASS, but further reading made it clear
that at least two further weighting functions (Hampel, Tukey bisquare) exist.
In a post from B.D. Ripley going back to 1999 I found the following quote:
>> 2) Would huber() give me results that are similar (i.e., close
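A hedged sketch of comparing the three weighting functions via MASS::rlm on simulated data with two gross outliers; huber()/hubers() are the single-sample location(/scale) estimators the quote refers to:
library(MASS)
set.seed(1)                           # hypothetical data with two gross outliers
x <- 1:60
y <- 1 + 0.3 * x + rnorm(60)
y[c(5, 50)] <- y[c(5, 50)] + 20
coef(rlm(y ~ x, psi = psi.huber))     # Huber (monotone psi)
coef(rlm(y ~ x, psi = psi.hampel))    # Hampel (redescending)
coef(rlm(y ~ x, psi = psi.bisquare))  # Tukey bisquare (redescending)
huber(y)                              # Huber location estimate with MAD scale
hubers(y)                             # Huber "Proposal 2" joint location and scale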
2005 Apr 22
2
Hoaglin Outlier Method
I am a new user of R so please bear with me. I have reviewed some R books,
FAQs and such but the volume of material is great. I am in the process of
porting my current SAS and SVS Script code to Lotus Approach, R and
WordPerfect.
My question is, can you help me determine the best R method to implement
the Hoaglin Outlier Method? It is used in Appendices A and B of the
following link.
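A hedged sketch, without having seen the linked appendices: "Hoaglin-style" outlier screens are usually quartile fences Q1 - k*IQR and Q3 + k*IQR (k = 1.5 is the classic Tukey boxplot rule; Hoaglin & Iglewicz argue for about 2.2). The function name and the choice of k below are illustrative assumptions:
# Flag values outside the quartile fences; k is the fence multiplier.
hoaglin_outliers <- function(x, k = 2.2) {
  q <- quantile(x, c(0.25, 0.75), na.rm = TRUE)
  fence <- k * diff(q)                       # k times the interquartile range
  x < q[1] - fence | x > q[2] + fence        # logical vector marking outliers
}
x <- c(rnorm(50), 8, -9)                     # hypothetical data
x[hoaglin_outliers(x)]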
2006 Jun 22
1
High breakdown/efficiency statistics -- was RE: Rosner's test [Broadcast]
What would be nice is to have something like a "robust" task view...
Andy
From: Berton Gunter
>
> Many thanks for this Martin. There now are several packages
> with what appear to be overlapping functions (or at least
> algorithms). Besides those you mentioned, "robust" and
> "roblm" are at least two others. Any recommendations about
> how or
2017 May 30
3
stats::line() does not produce correct Tukey line when n mod 6 is 2 or 3
>>>>> Serguei Sokol <sokol at insa-toulouse.fr>
>>>>> on Tue, 30 May 2017 16:01:17 +0200 writes:
> On 30/05/2017 at 09:33, Martin Maechler wrote: ...
>> However, even after the patch, the example from the SO
>> post differs from the result of Richie Cotton's
>> function...
> The explanation is quite simple.
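For context, a hedged illustration of the call under discussion (not a fix): stats::line() fits Tukey's resistant line from the medians of the outer thirds of the data, and the question is how those thirds are formed when n %% 6 is 2 or 3:
set.seed(1)                         # hypothetical data with n = 20, so n %% 6 == 2
x <- sort(runif(20, 0, 10))
y <- 1 + 2 * x + rnorm(20)
fit <- line(x, y)
coef(fit)                           # intercept and slope of the resistant line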
2017 May 30
2
stats::line() does not produce correct Tukey line when n mod 6 is 2 or 3
>>>>> Serguei Sokol <sokol at insa-toulouse.fr>
>>>>> on Mon, 29 May 2017 15:28:12 +0200 writes:
> Sorry, I saw too late that the original file and my editor used different
> tab widths. Here is the patch using spaces only, instead of mixing tabs
> and spaces.
thank you - it still gives quite a few
2012 Jul 13
2
Fitting data and removing outliers
What I'm trying to do is create a best-fit line in R for a set of data points and then remove all the outliers to re-create the best fit. I can't use the IQR rule because the outliers I have in mind are easily within that range, but they are way out of line for the best fit, which is ruining it. I'd rather throw those points out altogether.
Thanks!
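A minimal sketch of one common approach (the 2.5 cutoff and all names below are illustrative assumptions): fit once, drop points with large standardized residuals relative to that fit, refit; MASS::rlm is an alternative that downweights such points instead of deleting them:
set.seed(1)                          # hypothetical data
x <- 1:100
y <- 3 + 0.7 * x + rnorm(100)
y[10] <- y[10] + 40                  # inside the overall y-range, but far from the line
y[90] <- y[90] - 40
fit1 <- lm(y ~ x)
keep <- abs(rstandard(fit1)) < 2.5   # keep points within +/- 2.5 standardized residuals
fit2 <- lm(y ~ x, subset = keep)     # refit without the flagged points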
2005 Mar 27
1
p values when using rlm
R 2.0.1
Linux
I am using rlm() to fit a model, e.g. fit1 <- rlm(y ~ x). My model is more
complex than the one shown.
When I enter summary(fit1)
I get estimates for the model's coefficients along with their SEs, and
t values, but no p values. The p value column is blank.
Similarly, when I enter anova(fit1) I get DF, Sum Sq, Mean Sq, but the
column for F value and Pr(>F) are blank.
Any
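A hedged sketch of a common workaround: summary.rlm reports t values but no p values because the reference distribution is only approximate; treating the t values as roughly t-distributed on n - p degrees of freedom gives approximate p values (sfsmisc::f.robftest is another option):
library(MASS)
set.seed(1)                                     # hypothetical data
x <- rnorm(100)
y <- 1 + 0.4 * x + rt(100, df = 3)
fit1 <- rlm(y ~ x)
ctab <- coef(summary(fit1))                     # columns: Value, Std. Error, t value
dfres <- length(fitted(fit1)) - length(coef(fit1))
2 * pt(abs(ctab[, "t value"]), df = dfres, lower.tail = FALSE)   # approximate two-sided p values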
2003 Oct 02
4
using a string as the formula in rlm
Hi,
I am trying to build a series of rlm models. I have my data frame and
the models will be built using various columns of the data frame.
Thus a series of models would be
m1 <- rlm(V1 ~ V2 + V3 + V4, data)
m2 <- rlm(V1 ~ V2 + V5 + V7, data)
m3 <- rlm(V1 ~ V2 + V8 + V9, data)
I would like to automate this. Is it possible to use a string in place
of the formula?
I tried doing:
fmla
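A minimal sketch of the usual answer, with a hypothetical data frame standing in for the poster's data: paste the string and convert it with as.formula(), or build the formula with reformulate():
library(MASS)
set.seed(1)
dat <- as.data.frame(matrix(rnorm(900), ncol = 9,
                            dimnames = list(NULL, paste0("V", 1:9))))
rhs  <- c("V2", "V3", "V4")
fmla <- as.formula(paste("V1 ~", paste(rhs, collapse = " + ")))
m1   <- rlm(fmla, data = dat)
m1b  <- rlm(reformulate(rhs, response = "V1"), data = dat)   # same model without string pasting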
2005 Mar 24
1
Robust multivariate regression with rlm
Dear Group,
I am having trouble with using rlm on multivariate data sets. When I
call rlm I get
Error in lm.wfit(x, y, w, method = "qr") :
incompatible dimensions
lm on the same data sets seems to work well (see code example). Am I
doing something wrong?
I have already browsed through the forums and google but could not find
any related discussions.
I use Windows XP and R
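A hedged sketch of the difference and one workaround (simulated data): lm() accepts a matrix response, rlm() does not, so each response column can be fitted separately:
library(MASS)
set.seed(1)                               # hypothetical data
x <- rnorm(100)
Y <- cbind(y1 = 1 + 2 * x + rnorm(100),
           y2 = 3 - x + rnorm(100))
lm(Y ~ x)                                 # matrix response: fine for lm
fits <- lapply(colnames(Y), function(j) rlm(Y[, j] ~ x))
sapply(fits, coef)                        # one column of robust coefficients per response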
2010 Nov 08
1
Add values of rlm coefficients to xyplot
Hello,
I have a simple xyplot with rlm lines.
I would like to add the a and b coefficients (y=ax+b) of the rlm calculation
in each panel.
I know I can do it 'outside' the xyplot command, but I would like to do it all
at the same time.
I found some posts asking the same question, but no answer.
Is it impossible?
Thanks in advance for your help.
Ptit Bleu.
x11(15,12)
xyplot(df1$col2 ~
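A hedged sketch of one way to do it inside the panel function (df1 and its columns col1/col2/col3 below are simulated stand-ins for the poster's data):
library(lattice)
library(MASS)
set.seed(1)
df1 <- data.frame(col1 = rep(letters[1:4], each = 25), col2 = rnorm(100))
df1$col3 <- 2 * df1$col2 + rnorm(100)
xyplot(col3 ~ col2 | col1, data = df1,
       panel = function(x, y, ...) {
         panel.xyplot(x, y, ...)
         cf <- coef(rlm(y ~ x))                           # robust fit for this panel's data
         panel.abline(cf[1], cf[2])                       # robust fit line
         panel.text(min(x), max(y),                       # coefficients in the panel corner
                    sprintf("y = %.2f x + %.2f", cf[2], cf[1]),
                    adj = c(0, 1), cex = 0.8)
       })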
2008 May 14
1
rlm and lmrob error messages
Hello all,
I'm using R 2.7.0 (on Windows 2000) and I'm trying to run a robust
regression on the following model structure:
model = "Y ~ x1*x2 / (x3 + x4 + x5 + x6)"
where x1 and x2 are both factors (either 1 or 0) and x3 to x6 are numeric.
The error code I get when running rlm(as.formula(model), data=daymean) is:
error in rlm.default(x, y, weights, method = method, wt.method =
2009 Dec 03
2
Avoiding singular fits in rlm
I keep coming back to this problem of singular fits in rlm (MASS library),
but cannot figure out a good solution.
I am fitting a linear model with a factor variable, like
lm( Y ~ factorVar)
and this works fine. lm knows to construct the contrast matrix the way I
would expect, which uses the first factor level as the baseline.
But when I try
rlm( Y ~ factorVar)
I get the message "'x'
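A hedged sketch of one common cause and check (an empty factor level; the poster's actual cause may differ): lm() tolerates a rank-deficient design and returns NA for aliased coefficients, while rlm() stops, so inspecting the model matrix rank helps locate the problem:
library(MASS)
set.seed(1)
factorVar <- factor(rep(c("a", "b"), each = 40),
                    levels = c("a", "b", "c"))   # level "c" has no observations
Y <- rnorm(80)
X <- model.matrix(~ factorVar)
qr(X)$rank < ncol(X)                             # TRUE: the design is singular
fit <- rlm(Y ~ droplevels(factorVar))            # dropping the empty level lets rlm proceed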
2007 Jun 07
3
rlm results on trellis plot
How do I add to a trellis plot the best fit line from a robust fit? I
can use panel.lm to add a least squares fit, but there is no panel.rlm
function.
--
Alan S Barnett <asb at mail.nih.gov>
NIMH/CBDB
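A hedged sketch of rolling one's own, in the spirit of panel.lmline (the name panel.rlm below is our own, not part of lattice):
library(lattice)
library(MASS)
panel.rlm <- function(x, y, ...) {
  panel.xyplot(x, y, ...)
  cf <- coef(rlm(y ~ x))
  panel.abline(cf[1], cf[2])      # add the robust best-fit line to the panel
}
# usage (mydata and g are hypothetical): xyplot(y ~ x | g, data = mydata, panel = panel.rlm)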
2011 Mar 14
1
discrepancy between lm and MASS:rlm
Dear R-devel,
There seems to be a discrepancy in the order in which lm and rlm evaluate their arguments. This causes rlm to sometimes produce an error where lm is just fine.
Here is a little script that illustrate the issue:
> library(MASS)
> ## create data
> n <- 100
> dat <- data.frame(x=rep(c(-1,0,1), n), y=rnorm(3*n))
>
> ## call lm, works fine
> summary(lm(y ~
2007 Sep 04
1
Robust linear models and unequal variance
Hi all,
I have probably a basic question, but I can't seem to find the answer in
the literature or in the R-archives.
I would like to do a robust ANCOVA (using either rlm or lmRob of the
MASS and robust packages) - my response variable deviates slightly from
normal and I have some "outliers". The data consist of 2 factor
variables and 3-5 covariates (depending on the model).
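A hedged sketch of one option: rlm() takes prior weights with wt.method = "inv.var", so (estimated) inverse variances can be folded in alongside the robust downweighting of outliers (all names below are illustrative, and the error variances are treated as known here for simplicity):
library(MASS)
set.seed(1)
grp  <- factor(rep(c("ctl", "trt"), each = 50))
cov1 <- rnorm(100)
sdev <- ifelse(grp == "trt", 3, 1)                 # unequal error SD by group (known here;
                                                   # in practice it would be estimated)
resp <- 1 + 0.5 * cov1 + (grp == "trt") + rnorm(100, sd = sdev)
fit  <- rlm(resp ~ grp * cov1, weights = 1 / sdev^2, wt.method = "inv.var")
summary(fit)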
2012 Jul 06
1
How to do goodness-of-fit diagnosis and model checking for rlm in R?
Hi all,
I am reading the MASS book, but it doesn't give examples of diagnosis
and model checking for rlm...
My data is highly non-Gaussian so I am using rlm instead of lm.
My questions are:
0. Are goodness-of-fit and model-checking for rlm done exactly the same way
as for usual regression?
1.
Please give me some pointers about how to do goodness-of-fit and
residual diagnosis for
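A hedged sketch of the usual starting points: the same residual plots as for lm, plus the final IRLS weights, which show which observations rlm downweighted:
library(MASS)
set.seed(1)                               # hypothetical heavy-tailed data
x <- rnorm(100)
y <- 1 + 2 * x + rt(100, df = 2)
fit <- rlm(y ~ x)
plot(fitted(fit), resid(fit))             # residuals vs fitted values
qqnorm(resid(fit)); qqline(resid(fit))    # tail behaviour of the residuals
head(sort(fit$w))                         # observations given the smallest robustness weights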
2012 Jul 18
1
How does "rlm" in R decide its "w" weights for each IRLS iteration?
Hi all,
I am also confused about the manual:
a. The input arguments:
wt.method: are the weights case weights (giving the relative importance of
each case, so a weight of 2 means there are two of these) or the inverse of the
variances, so a weight of two means this error is half as variable?
w: (optional) initial down-weighting for each case.
init: (optional) initial values for the
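A hedged sketch of the distinction as I read the help page: 'weights'/'wt.method' are user-supplied prior weights (case counts or inverse variances), while the per-iteration IRLS weights come from the psi function, psi(r/s)/(r/s), and are returned in the fitted object's w component:
library(MASS)
set.seed(1)                               # hypothetical data
x <- rnorm(100)
y <- 1 + x + rt(100, df = 3)
fit <- rlm(y ~ x)                         # default psi = psi.huber, no prior weights
head(fit$w)                               # IRLS robustness weights at convergence
head(psi.huber(resid(fit) / fit$s))       # psi(u)/u (the deriv = 0 default); matches fit$w
                                          # up to the convergence tolerance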