
Displaying 20 results from an estimated 20000 matches similar to: "location.m in R?"

2012 Nov 22
1
help in M-estimator by R
hi guys and gals ... How are you all ... I have to do something in robust regression with the R program, and I have some problems, as follows: *the first:* suppose w(r) = 1/(1 + r^2) and r <- c(7.01,2.07,7.061,5.607,8.502,54.909,12.222), and I want to exclude some values from r so that (abs(r) > 4.9)... afterwards, I want to use (w) to get the coefficients beta0 and beta1 (B1 <-
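A minimal sketch of one reading of the two steps described, assuming the weight function is w(r) = 1/(1 + r^2) and that values with abs(r) > 4.9 are to be excluded first; note that with these particular numbers only one observation survives the cut, and the regression step that should produce beta0 and beta1 is truncated in the excerpt, so only the subsetting and weight computation are sketched here:
## hedged sketch of the subsetting and weighting steps only
r <- c(7.01, 2.07, 7.061, 5.607, 8.502, 54.909, 12.222)
r_kept <- r[abs(r) <= 4.9]      # exclude the values with abs(r) > 4.9
w <- 1 / (1 + r_kept^2)         # w(r) = 1/(1 + r^2) on what remains
w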
2000 Dec 12
1
[Fwd: R code and robust regression]
An embedded message was scrubbed... From: Hella Heikki <Heikki.Hella at bof.fi> Subject: R code and robust regression Date: Tue, 12 Dec 2000 12:12:14 +0200 Size: 2187 Url: https://stat.ethz.ch/pipermail/r-help/attachments/20001212/c3361c7d/attachment-0001.mht
2005 Dec 22
1
Huber location estimate
We have a choice when calculating the Huber location estimate: > set.seed(221205) > y <- 7 + 3*rt(30,1) > library(MASS) > huber(y)$mu [1] 5.9117 > coefficients(rlm(y~1)) (Intercept) 5.9204 I was surprised to get two different results. The function huber() works directly with the definition whereas rlm() uses iteratively reweighted least squares. My surprise is
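A hedged way to probe the discrepancy, assuming the defaults I recall (huber() uses tuning constant k = 1.5 with the MAD scale held fixed, while rlm()'s psi.huber defaults to k = 1.345 and the scale is re-estimated during the IWLS iterations), is to match the tuning constants and see how much of the gap remains; if I recall correctly, extra arguments such as k are passed through to the psi function:
library(MASS)
set.seed(221205)
y <- 7 + 3 * rt(30, 1)
huber(y)$mu                        # default k = 1.5, MAD scale held fixed
huber(y, k = 1.345)$mu             # tuning constant matched to rlm()'s default
coef(rlm(y ~ 1))                   # default k = 1.345, scale re-estimated
coef(rlm(y ~ 1, k = 1.5))          # tuning constant matched to huber()'s default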
2003 Jul 30
2
robust regression
Hi, trying to do a robust regression of a two-way linear model, I keep getting the following error: > lqs(obs ~ y + s -1,method="lms", contrasts=list(s=("contr.sum"))) Error: lqs failed: all the samples were singular Robust regression with M-estimators works (as do regular least-squares fits, of course): rlm.formula(formula = obs ~ y + s - 1, method = "M",
2012 Jan 23
1
R not giving significance tests for coefficients/estimates?
> 3x4 Error: unexpected symbol in "3x4" R has no idea that you mean "x" as multiplication; use an asterisk. > 3*4 [1] 12 dominic wrote > > This is basically my code: > > library(MASS) > lmsreg(formula = b0 ~ b1 + b3 + b1xb2, data=mydata) > > b1xb2 is an interaction but it was the centered value for a continuous > variable times a
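Beyond the * versus x typo, the likely reason lmsreg()/lqs() print no significance tests is that least-median-of-squares has no convenient standard-error theory. A hedged alternative, if M-estimation is acceptable instead of LMS, is rlm() from MASS, whose summary() reports standard errors and t values; the data frame below is made up, since the poster's mydata is not shown:
library(MASS)
set.seed(1)
mydata <- data.frame(b1 = rnorm(50), b3 = rnorm(50), b2c = rnorm(50))
mydata$b1xb2 <- mydata$b1 * mydata$b2c                  # stand-in for the centred interaction
mydata$b0 <- 1 + 2 * mydata$b1 - mydata$b3 + rnorm(50)  # made-up response
fit <- rlm(b0 ~ b1 + b3 + b1xb2, data = mydata)
summary(fit)$coefficients                               # Value, Std. Error, t value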
2004 Jul 05
2
nonlinear regression with M estimation
Hi All, Could anyone tell me whether R or S has the capacity to fit nonlinear regression with Huber's M estimation? Any suggestion is appreciated. I was aware of 'rlm' in the MASS library for robust linear regression and 'nls' for nonlinear least-squares regression, but did not seem to be able to find a robust nonlinear regression function. Thanks and regards, Ray Liu
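A hedged pointer, under the assumption that an M-type robust nonlinear fit is what is wanted: the robustbase package has nlrob(), which robustifies nls() via iteratively reweighted least squares. The data below are made up for illustration:
library(robustbase)
set.seed(42)
d <- data.frame(x = 1:30)
d$y <- 10 * exp(-0.2 * d$x) + rnorm(30, sd = 0.3)
d$y[c(5, 20)] <- d$y[c(5, 20)] + 5                 # inject two gross outliers
fit <- nlrob(y ~ a * exp(-b * x), data = d, start = list(a = 8, b = 0.1))
summary(fit)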
2006 Feb 21
3
How to get around heteroscedasticity with non-linear least squares in R?
Your understanding isn't similar to mine. Mine says robust/resistant methods are for data with heavy tails, not heteroscedasticity. The common ways to approach heteroscedasticity are transformation and weighting. The first is easy and usually quite effective for dose-response data. The second is not much harder. Both can be done in R with nls(). Andy From: Quin Wills > > I am
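A hedged illustration of the two fixes mentioned (transformation and weighting), with made-up dose-response data and a deliberately crude 1/dose weight; the poster's own model and data are not shown:
set.seed(7)
dose <- rep(c(0.1, 0.3, 1, 3, 10, 30), each = 5)
resp <- 100 * dose / (2 + dose) * (1 + rnorm(length(dose), sd = 0.1))  # spread grows with the mean
## (1) transform: fit on the log scale
fit_log <- nls(log(resp) ~ log(Emax * dose / (ED50 + dose)),
               start = list(Emax = 90, ED50 = 1))
## (2) weight: nls() accepts a weights argument, here 1/dose as a rough variance model
fit_wt <- nls(resp ~ Emax * dose / (ED50 + dose),
              start = list(Emax = 90, ED50 = 1), weights = 1 / dose)
summary(fit_wt)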
2000 Dec 05
1
Is robust regression available in R.
Hello, the R people. I am looking for robust regression in R. This method is available in S, where its name is rreg. Could anyone teach me?
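A hedged pointer from today's perspective: the usual R counterpart of S-PLUS's rreg is rlm() in the MASS package, for example:
library(MASS)
fit <- rlm(stack.loss ~ ., data = stackloss)   # Huber M-estimate by default
summary(fit)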
2003 Nov 14
1
What goodness-of-fit measure for robust regression ?
Hi, i. After estimating some coefficients using robust regression with rlm() or lqs(), I wonder whether there exist measures of goodness-of-fit like those for the standard linear model (R2)... or even whether it is statistical nonsense to look for one, since I do not find any mention of it in different chapters on robust and resistant regression or in several R documents (Fox, Ripley and
2007 Nov 21
1
equivalent of Matlab robustfit?
Hi, I've been using the Matlab robustfit function for linear regressions where I suspect some data points are outliers. Is there an equivalent function in R? Take care, Darren PS, This is the Matlab help on robustfit: >> help robustfit ROBUSTFIT Robust linear regression B = ROBUSTFIT(X,Y) returns the vector B of regression coefficients, obtained by performing robust
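A hedged equivalent: MASS::rlm() does iteratively reweighted robust regression much like robustfit. Note that robustfit's default weight function is bisquare while rlm() defaults to Huber, so psi = psi.bisquare gets closer to the Matlab behaviour. The data below are made up:
library(MASS)
x <- 1:10
y <- 2 * x + rnorm(10, sd = 0.3)
y[9] <- y[9] + 15                            # one gross outlier
coef(rlm(y ~ x))                             # Huber M-estimation (default)
coef(rlm(y ~ x, psi = psi.bisquare))         # closer to robustfit's default weight function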
2018 Apr 06
1
Fast tau-estimator line does not appear on the plot
R-experts, I have fitted many different lines. The fast-tau estimator (yellow line) seems strange to me, because this yellow line is not at all in agreement with the other lines (reverse slope, I mean the yellow line has a positive slope and the other ones have negative slopes). Is there something wrong in my R code? Is it because the Y variable is a vector and should be a matrix? Here is the
2006 Feb 21
2
How to get around heteroscedasticity with non-linear least squares in R?
I am using "nls" to fit dose-response curves but am not sure how to approach more robust regression in R to get around the problem of the my error showing increased variance with increasing dose. My understanding is that "rlm" or "lqs" would not be a good idea here. 'Fairly new to regression work, so apologies if I'm missing something obvious.
2018 Mar 31
0
Fast tau-estimator line does not appear on the plot
On 31/03/2018 11:57 AM, varin sacha via R-help wrote: > Dear R-experts, > > Here below my reproducible R code. I want to add many straight lines to a plot using "abline" > The last fit (fast Tau-estimator, color yellow) will not appear on the plot. What is going wrong ? > Many thanks for your reply. > It's not quite reproducible: you forgot the line to create
2018 Apr 07
0
Fast tau-estimator line does not appear on the plot
You need to pay attention to the documentation more closely. If you don't know what something means, that is usually a signal that you need to study more... in this case about the difference between an input variable and a design (model) matrix. This is a concept from the standard linear algebra formulation for regression equations. (Note that I have never used RobPer, nor do I regularly
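A hedged illustration of the distinction being drawn (not RobPer-specific, since its exact interface is not shown here): a routine that asks for a design (model) matrix wants the intercept column included, not just the raw predictor vector:
X <- c(1.5, 2.3, 3.1, 4.8, 5.0)                       # an input variable
y <- c(2.0, 2.9, 4.2, 5.1, 6.3)
X_design <- cbind(Intercept = 1, X)                   # a design (model) matrix
## equivalently: model.matrix(~ X)
solve(t(X_design) %*% X_design, t(X_design) %*% y)    # ordinary LS via the normal equations, just to show the shapes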
2008 May 02
2
my first post to the list
Hello R-listers! My first post to the list is a very simple one for those who use the software continuously. I am trying to understand the fixed-x resampling and random-x resampling methods for bootstrapping proposed by Fox. My doubt concerns the model run in one of the functions given for fixed-x resampling. What I don't understand is: X=model.matrix, and the -1
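A hedged reading of the piece being asked about: in Fox's fixed-x resampling, model.matrix() returns the design matrix of the fitted model, which already contains the column of ones for the intercept, so when the resampled response is regressed directly on that matrix the "- 1" stops lm() from adding a second intercept:
fit <- lm(mpg ~ wt + hp, data = mtcars)
X <- model.matrix(fit)                                           # columns: (Intercept), wt, hp
y_star <- fitted(fit) + sample(residuals(fit), replace = TRUE)   # one fixed-x resample
refit <- lm(y_star ~ X - 1)                                      # X supplies the intercept, hence "- 1"
coef(refit)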
2005 Dec 08
1
weighted m-estimator
Dear R listers, I'm trying to use Huber's m-estimator on a dataset, which works fine so far. In the next step I would like to assign a (frequency) weight to the observations. It seemed straightforward to me to replicate the rows according to their count variable. Unfortunately, a solution provided by jim holtman on Wed 19 Oct 2005 on this list doesn't work for me: > y
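Two hedged options, with made-up numbers since the poster's data are not shown: expand the data by the frequency counts with rep(), or, for a model fit, hand the counts to rlm() as case weights:
library(MASS)
y     <- c(10.1, 10.4, 9.8, 15.0)
count <- c(3, 2, 4, 1)                            # frequency weights
huber(rep(y, times = count))$mu                   # (1) replicate rows, then Huber location
rlm(y ~ 1, weights = count, wt.method = "case")   # (2) case weights in rlm()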
2018 Mar 31
2
Fast tau-estimator line does not appear on the plot
Dear R-experts, Here below is my reproducible R code. I want to add many straight lines to a plot using "abline". The last fit (fast Tau-estimator, color yellow) will not appear on the plot. What is going wrong? Many thanks for your reply. ########## Y=c(2,4,5,4,3,4,2,3,56,5,4,3,4,5,6,5,4,5,34,21,12,13,12,8,9,7,43,12,19,21)
2011 Feb 03
1
"hubers" function in R MASS library : problem and solution
Hello: I found that the "hubers" function in the MASS library is NOT working on the following data: > a <-
2010 Aug 06
3
m-estimators
Dear colleagues, can somebody help me by showing how we can compute M-estimators in R? Thanks. Dr. Iasonas Lamprianou Assistant Professor (Educational Research and Evaluation) Department of Education Sciences European University-Cyprus P.O. Box 22006 1516 Nicosia Cyprus Tel.: +357-22-713178 Fax: +357-22-590539 Honorary Research Fellow Department of Education The University of Manchester
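A couple of hedged starting points, assuming the question is about location and regression M-estimates in R; the data below are made up:
library(MASS)
set.seed(2)
x <- rnorm(50, mean = 10)
x[1:2] <- c(25, 30)                              # two gross outliers
huber(x)$mu                                      # Huber M-estimate of location
d <- data.frame(u = 1:40, v = 3 + 0.5 * (1:40) + rnorm(40))
d$v[35] <- 60                                    # an outlying response
coef(rlm(v ~ u, data = d))                       # regression M-estimate (Huber psi by default)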
2007 Nov 29
1
relative importance of predictors
Hi Group, I want to compare the relative importance of predictors in a multiple linear regression y~a+bx1+cx2... However, bptest indicates heteroskedasticity in my model. I therefore perform a robust regression (rlm), in combination with bootstrapping (as outlined in J. Fox, Bootstrapping Regression Models). Now I want to compare the relative importance of my predictors. Can I rely on the
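A hedged sketch of the bootstrap part, using simple case (random-x) resampling with the boot package and standardised variables so the coefficients are comparable; mtcars stands in for the poster's data, and this is not exactly Fox's fixed-x scheme:
library(MASS); library(boot)
d0 <- as.data.frame(scale(mtcars[, c("mpg", "wt", "hp")]))       # standardise for comparability
coef_fun <- function(data, idx) coef(rlm(mpg ~ wt + hp, data = data[idx, ], maxit = 100))
set.seed(1)
b <- boot(d0, coef_fun, R = 500)
apply(b$t, 2, quantile, probs = c(0.025, 0.975))                 # rough percentile intervals per coefficient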