Displaying 20 results from an estimated 10000 matches similar to: "linear regression: evaluating the result Q"
2004 Dec 14
0
linear regression: evaluating the result Q
It looks just like the classical F-test for lack-of-fit, using an estimate of
'pure error' from replicates, doesn't it? This should be in most applied
regression books. The power (i.e., probability of finding lack-of-fit when
it exists) of such tests will depend on the data.
Andy
> From: RenE J.V. Bertin
>
> Hello,
>
> I'd like to come back to this question I
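Not code from the thread, but a minimal sketch of the lack-of-fit F-test described above, assuming replicated x values in a toy data set:

set.seed(1)
x <- rep(1:5, each = 4)                  # replicates at each design point
y <- 2 + 3 * x + rnorm(length(x))
fit.lin <- lm(y ~ x)                     # straight-line model
fit.sat <- lm(y ~ factor(x))             # one mean per x value: residual is pure error
anova(fit.lin, fit.sat)                  # F-test for lack of fit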
2007 Jul 18
1
lattice plot axis scaling
I want to generate a lattice plot of a multiple linear regression. I'm
using the code:
xyplot(y ~ x1 + x2 | status, data = datam,
       xlab = "Peak separation", ylab = "G/W",
       main = "G/W vs Fuzzy peak separation: Threshold=1.8",
       groups = Fuzzy.gw.t.score > 1.8,
       subset = (status %in% c("control", "patient", "sibling")),
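The question is cut off here. Purely as an illustrative sketch (reusing the poster's datam and variable names, which are not defined in the snippet), per-panel x-axis scaling in lattice is usually controlled through outer and scales:

library(lattice)

## outer = TRUE gives x1 and x2 their own panels; relation = "free" lets
## each panel choose its own x-axis range
xyplot(y ~ x1 + x2 | status, data = datam, outer = TRUE,
       scales = list(x = list(relation = "free")),
       xlab = "Peak separation", ylab = "G/W")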
2018 Apr 07
0
Fast tau-estimator line does not appear on the plot
You need to pay closer attention to the documentation. If you don't
know what something means, that is usually a signal that you need to study
more... in this case about the difference between an input variable and a
design (model) matrix. This is a concept from the standard linear algebra
formulation for regression equations. (Note that I have never used RobPer,
nor do I regularly
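A tiny base-R illustration of the distinction being drawn, unrelated to RobPer itself:

x <- c(1.2, 3.4, 5.6)     # an input (predictor) variable: a single column of data
model.matrix(~ x)         # the design (model) matrix: an intercept column plus x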
2005 Jan 17
2
bwplot: how not to draw outliers
RenE J.V. Bertin wrote:
> Hello, and (somewhat belated) best wishes for 2005.
>
> Can one tell bwplot not to draw outliers, or at least exclude them from the vertical axis scaling? If so, how (or what documentation do I need to consult)?
> The options that have this effect in boxplot() do not appear to have any effect with bwplot (although outline=FALSE in boxplot does *not* change the
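Not from the thread, but a hedged sketch: lattice's panel.bwplot has a do.out argument, and it can be passed through the high-level call. Note it only suppresses drawing the outlier points; the default axis range is still computed from all the data, so ylim or a prepanel function may also be needed.

library(lattice)

## suppress outlier points in each box (barley ships with lattice)
bwplot(yield ~ site, data = barley, do.out = FALSE)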
2018 Apr 06
1
Fast tau-estimator line does not appear on the plot
R-experts,
I have fitted many different lines. The fast-tau estimator (yellow line) seems strange to me, because this yellow line is not at all in agreement with the other lines (reverse slope, I mean the yellow line has a positive slope and the other ones have negative slopes).
Is there something wrong in my R code? Is it because the Y variable is a vector and should be a matrix?
Here is the
2007 Nov 28
2
alternatives to traditional least squares method in linear regression ?
Dear list,
I have encountered a special case when fitting a linear regression
where I'm not satisfied with the results obtained using the traditional
least squares method (often called OLS) for minimizing
the residuals about the regression line (see code below). Basically, a
group of my x-y data are a bit off the diagonal line (in my case the
diagonal represents the ideal or
2004 Oct 11
3
split and rlm
Hello, I'm trying to do a little rlm of some data that looks like this:
UNIT  COHORT  perdo    adjodds
1010  96      0.39890  1.06894
1010  97      0.48113  1.57500
1010  98      0.36328  1.21498
1010  99      0.44391  1.38608
It works fine like this: rlm(perdo ~ COHORT, psi=psisquare)
But the problem is that I have about 100 UNITs, and I want to do a
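The snippet stops here; the usual pattern for this is split()/lapply(), sketched below with MASS's psi.bisquare standing in for the poster's (apparently user-defined) psisquare, and dat as a placeholder data frame:

library(MASS)

## one robust fit per UNIT; dat has columns UNIT, COHORT, perdo
fits <- lapply(split(dat, dat$UNIT),
               function(d) rlm(perdo ~ COHORT, data = d, psi = psi.bisquare))
slopes <- sapply(fits, function(f) coef(f)[["COHORT"]])   # slope for each UNIT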
2008 Dec 08
1
residual standard error in rlm (MASS package)
Hi,
I would appreciate it if someone could explain how the residual standard
error is computed for rlm models (MASS package). Usually, one would
expect to get the residual standard error by
> sqrt(sum((y-fitted(fm))^2)/(n-2))
where y is the response, fm a linear model with an intercept and slope
for x and n the number of observations. This does not seem to work for
rlm models and I am wondering
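For context (not part of the archived reply): MASS::rlm carries a robust scale estimate in the fitted object's s component, and that, rather than the classical RSS-based formula, is what summary() labels the residual standard error. A sketch:

library(MASS)

set.seed(1)
x <- 1:30
y <- 2 + 0.5 * x + rnorm(30)
fm <- rlm(y ~ x)

sqrt(sum((y - fitted(fm))^2) / (length(y) - 2))   # the classical formula from the question
fm$s                                              # the robust scale estimate reported by summary()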
2004 Sep 17
2
lattice: bwplot and panel.lmline()
On Friday 17 September 2004 13:52, RenE J.V. Bertin wrote:
> Hello again,
>
> I am doing regressions (using panel.lmline() (and panel.abline(
> rlm(...))) ) inside a panel method which I pass to bwplot().
>
> What I would like to do is create a boxplot of categorised data
> (binned on the independent variable), and superpose a regression line
> which is calculated using the
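The message is truncated. As a sketch of the general mechanism only (placeholder data and variable names, and xyplot rather than the binned bwplot the poster describes), a panel function can superpose both a least-squares and a robust line:

library(lattice)
library(MASS)

xyplot(y ~ x | g, data = dat,
       panel = function(x, y, ...) {
         panel.xyplot(x, y, ...)
         panel.lmline(x, y)                        # ordinary least-squares line
         cf <- coef(rlm(y ~ x))
         panel.abline(a = cf[1], b = cf[2])        # robust line from rlm
       })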
2018 Mar 31
0
Fast tau-estimator line does not appear on the plot
On 31/03/2018 11:57 AM, varin sacha via R-help wrote:
> Dear R-experts,
>
> Below is my reproducible R code. I want to add many straight lines to a plot using "abline".
> The last fit (fast tau-estimator, color yellow) will not appear on the plot. What is going wrong?
> Many thanks for your reply.
>
It's not quite reproducible: you forgot the line to create
2010 Nov 08
1
Add values of rlm coefficients to xyplot
Hello,
I have a simple xyplot with rlm lines.
I would like to add the a and b coefficients (y=ax+b) of the rlm calculation
in each panel.
I know I can do it 'outside' the xyplot command, but I would like to do it all
at the same time.
I found some posts with the same question, but no answer.
Is it impossible?
Thanks in advance for your help.
Ptit Bleu.
x11(15,12)
xyplot(df1$col2 ~
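The code is cut off. A sketch of one way to do it (placeholder column names, not necessarily the poster's data): fit rlm inside the panel function and write the coefficients into each panel with panel.text():

library(lattice)
library(MASS)

xyplot(col2 ~ col1 | grp, data = df1,
       panel = function(x, y, ...) {
         panel.xyplot(x, y, ...)
         cf <- coef(rlm(y ~ x))                    # cf[1] = intercept b, cf[2] = slope a
         panel.abline(a = cf[1], b = cf[2])
         panel.text(min(x), max(y),
                    sprintf("a = %.2f, b = %.2f", cf[2], cf[1]),
                    adj = c(0, 1), cex = 0.8)
       })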
2004 Nov 18
4
Re: changing (core) function argument defaults?
>From: Patrick Connolly <p.connolly@hortresearch.co.nz>
>To: "RenE J.V. Bertin" <rjvbertin@hotmail.com>
>Subject: Re: [R] changing (core) function argument defaults?
>Date: Thu, 18 Nov 2004 11:43:10 +1300
>
>On Wed, 20-Oct-2004 at 07:48PM +0200, RenE J.V. Bertin wrote:
>
>|> Hello,
2004 Jun 11
1
comparing regression slopes
Dear List,
I used rlm to calculate two regression models for two data sets (rlm
due to two outlying values in one of the data sets). Now I want to
compare the two regression slopes. I came across some R-code of Spencer
Graves in reply to a similar problem:
http://www.mail-archive.com/r-help at stat.math.ethz.ch/msg06666.html
The code was:
> df1 <- data.frame(x=1:10, y=1:10+rnorm(10))
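The linked code is not shown here; a common alternative (not necessarily what that post does) is to pool the two data sets and test the slope difference through an interaction term, which also works with rlm():

set.seed(1)
df1 <- data.frame(x = 1:10, y = 1:10 + rnorm(10), g = "A")
df2 <- data.frame(x = 1:10, y = 2 * (1:10) + rnorm(10), g = "B")
both <- rbind(df1, df2)

## the x:g interaction tests whether the two slopes differ;
## MASS::rlm() accepts the same formula for a robust version
summary(lm(y ~ x * g, data = both))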
2005 Feb 15
2
summary(aov(...)) into a string?
It doesn't print anything: the summary.aov (or summary.aovlist)
print method does.
?summary.aov tells you the structure of the objects they return.
On Tue, 15 Feb 2005, RenE J.V. Bertin wrote:
> I'd like to annotate a plot with the output of summary(aov(model)),
> ideally just with the significant effects. I can't find a way to
> redirect what that command prints into
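Not part of the snippet, but the usual answer is capture.output(), which turns the printed summary into a character vector; a sketch using the built-in npk data set:

fit <- aov(yield ~ block + N * P * K, data = npk)   # npk is in R's datasets package
txt <- capture.output(summary(fit))                  # the printed ANOVA table as a character vector

plot(yield ~ N, data = npk)
text(1, max(npk$yield), paste(txt, collapse = "\n"), adj = c(0, 1), cex = 0.5)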
2003 Nov 19
2
as.double( factor(something) )??
It's in the FAQ, Q7.12.
On Wed, 19 Nov 2003, RenE J.V. Bertin wrote:
> After converting a numeric variable into a factor, is there a way to convert it back to the original values? as.double() doesn't do that correctly, for evident reasons (I guess) and as shown below.
--
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics,
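For reference, the FAQ 7.12 recipe the reply points to (not quoted in the snippet):

f <- factor(c(2.5, 7.1, 2.5))
as.double(f)                   # 1 2 1: the internal level codes, not the original values
as.numeric(as.character(f))    # 2.5 7.1 2.5
as.numeric(levels(f))[f]       # same result, slightly more efficient (FAQ 7.12)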
2004 Sep 17
1
controlling printing precision in paste()
Rene,
Look at ?format.
Sean
On Sep 17, 2004, at 9:21 AM, RenE J.V. Bertin wrote:
> Hello,
>
> I can't seem to find the way to modify the precision with which
> paste() prints its floating point numbers, more precisely the number
> of decimal digits printed. This is apparently not controlled by
> options( digits= ), and there is no appropriate argument to paste()
>
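A small sketch of that advice (format(), or sprintf(), applied before pasting):

x <- pi
paste("x is", x)                        # paste() uses as.character(), ignoring options(digits=)
paste("x is", format(x, digits = 4))    # "x is 3.142"
paste("x is", sprintf("%.2f", x))       # "x is 3.14"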
2012 May 21
1
M-estimation in multivariate linear regression model in R
Hello,
I am trying to find a function for M-estimation in a multivariate linear regression
model (a function that can estimate the betas in my model y = x * beta + e, where
y is a matrix). I've searched the R site for a long time, but without success.
I would like to ask whether there is any function for M-estimation in a
multivariate linear regression model in R. I know I can estimate the betas in
my model with the rlm() function
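Not from the thread: one simple workaround, which ignores correlation between the responses, is to apply rlm() to each response column separately. A sketch:

library(MASS)

set.seed(1)
X <- matrix(rnorm(300), 100, 3)
B <- matrix(c(1, -1, 0.5, 2, 0, 1), 3, 2)
Y <- X %*% B + matrix(rnorm(200), 100, 2)       # matrix-valued response

fits <- lapply(seq_len(ncol(Y)), function(j) rlm(Y[, j] ~ X))
Beta.hat <- sapply(fits, coef)                  # one column of coefficients per response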
2004 Oct 10
3
some help interpreting ANOVA results, please?
On Sun, 10 Oct 2004, RenE J.V. Bertin wrote:
> Could I ask some hints/help in interpreting the following ANOVA results,
> please? This concerns an experiment where I study the incidence and
> severity of motion sickness. I have Sickness.norm, a subjective
> discomfort/sickness estimate, normalised to 0..1, the session time T
> (normalised to 0..1 and binned in 0.2 wide bins) and a
2007 Sep 04
1
Robust linear models and unequal variance
Hi all,
This is probably a basic question, but I can't seem to find the answer in
the literature or in the R-archives.
I would like to do a robust ANCOVA (using either rlm or lmRob of the
MASS and robust packages) - my response variable deviates slightly from
normal and I have some "outliers". The data consist of 2 factor
variables and 3-5 covariates (depending on the model).
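The message is truncated. Purely as a hedged sketch with placeholder variable names, a robust ANCOVA-style fit might look like this; rlm() also accepts a weights argument if known unequal variances should be downweighted:

library(MASS)

## two factors, two covariates (names are placeholders, not the poster's data)
fit <- rlm(resp ~ f1 * f2 + cov1 + cov2, data = dat, psi = psi.huber)
summary(fit)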