Displaying 20 results from an estimated 7000 matches similar to: "regression for several responses"
2003 Jun 12 (2 replies): car package dependencies
Hello,
I tried to install the "car" package but I can't resolve the dependencies. car
needs grid, lattice and dr, but when I try to install grid I get this error:
wz3x64:/home/oggi/R/software/contrib # rpm -i R-car-1.0.R3-1.i386.rpm
error: failed dependencies:
R-grid is needed by R-car-1.0.R3-1
R-lattice is needed by R-car-1.0.R3-1
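A plausible workaround, sketched under the assumption that the machine can reach a CRAN mirror: installing car from inside R resolves its R-level dependencies automatically (and in recent R versions grid is part of the base distribution, so typically only lattice and dr still need to be fetched).
## run inside an R session; the mirror URL is only an example
install.packages("car", dependencies = TRUE,
                 repos = "https://cloud.r-project.org")
library(car)   # confirm that the package loads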
2009 Jul 16 (1 reply): Theil test help
Hello,
I have a series of questions that I hope will be simple to answer. Basically I would like code to do the following so that I can compute the distribution-free test for the slope of a postulated regression line (Theil test). As I am testing the null hypothesis that the slope = 0 against the general alternative that the slope does not equal 0, it should be pretty straightforward.
I have a data
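For reference, the Theil test of H0: slope = 0 against a two-sided alternative reduces to Kendall's tau test between x and y, and the matching point estimate is the Theil-Sen slope (the median of all pairwise slopes). A minimal sketch with made-up data standing in for the real series:
# hypothetical data; replace with the real x and y
x <- 1:20
y <- 0.05 * x + rnorm(20)

# Theil test of H0: slope = 0 is Kendall's tau test between x and y
cor.test(x, y, method = "kendall")

# Theil-Sen slope estimate: median of all pairwise slopes
idx <- combn(length(x), 2)
slopes <- (y[idx[2, ]] - y[idx[1, ]]) / (x[idx[2, ]] - x[idx[1, ]])
median(slopes)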
2003 Jun 13 (1 reply): lars - lasso problem
hello
I tried to use lars(), but it works neither with my own data nor with the sample
data. In both cases I get the following error:
> data(diabetes)
> par(mfrow=c(2,2))
> attach(diabetes)
> x<-lars(x,y)
Error in one %*% x : requires numeric matrix/vector arguments
> x<-lars(x,y, type="lasso")
Error in one %*% x : requires numeric matrix/vector arguments
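In the packaged example, diabetes$x is already a numeric matrix, so a common cause of this error is that x has been masked by another object in the workspace or is a data frame rather than a matrix. A sketch under that assumption, coercing the predictors explicitly and not reusing the name x:
library(lars)
data(diabetes)

# keep the predictors as a numeric matrix and avoid reusing the name "x"
X <- as.matrix(diabetes$x)
y <- diabetes$y

fit <- lars(X, y, type = "lasso")   # %*% inside lars now gets a numeric matrix
plot(fit)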
2011 Oct 24 (1 reply): using predict.lm() within a function
I've written a simple function to draw a regression line in a plot and
annotate the line showing the slope
with a label. It works, as I'm using it, when the horizontal variable
is 'x', but gives incorrect results otherwise.
What's wrong?
# simple function to show the slope of a line
show.beta <- function(model, x="x", x1, x2, label, col="black", ...)
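A likely explanation, assuming the function builds the data frame for predict.lm() with a hard-coded column named x: predict() matches newdata columns by name, so the column has to carry the name of the model's actual predictor. A hypothetical re-implementation of the idea (show.beta, xvar and the cars example below are illustrative, not the original code):
# 'xvar' names the predictor used in 'model'
show.beta <- function(model, xvar, x1, x2, label, col = "black", ...) {
  nd <- data.frame(c(x1, x2))
  names(nd) <- xvar                    # newdata column must match the model term
  yhat <- predict(model, newdata = nd)
  lines(c(x1, x2), yhat, col = col, ...)
  text(mean(c(x1, x2)), mean(yhat), label, col = col, pos = 3)
}

# usage sketch
# fit <- lm(dist ~ speed, data = cars); plot(dist ~ speed, cars)
# show.beta(fit, xvar = "speed", x1 = 5, x2 = 20, label = "b")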
2016 Aug 07 (1 reply): problem with abline(lm(...)) for plot(y~x, log='xy')
Hello:
In the following plot, the fitted line plots 100 percent above
the points:
tstDat <- data.frame(x=10^(1:3), y=10^(1:3+.1*rnorm(3)))
tstFit <- lm(log(y)~log(x), tstDat)
plot(y~x, tstDat, log='xy')
abline(tstFit)
I can get the correct line with the following:
tstPredDat <- data.frame(x=10^seq(1, 3, len=2))
tstPred <- predict(tstFit, tstPredDat)
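The offset comes from a scale mismatch: with log='xy' the plot's user coordinates are log10(x) and log10(y), while tstFit has coefficients on the natural-log scale, so abline() draws the wrong line. Two sketches, continuing the example above, that should pass through the points:
# option 1: refit on the log10 scale so abline() coefficients match the axes
tstFit10 <- lm(log10(y) ~ log10(x), tstDat)
plot(y ~ x, tstDat, log = "xy")
abline(tstFit10)

# option 2: back-transform predictions from the natural-log fit and use lines()
tstPredDat <- data.frame(x = 10^seq(1, 3, len = 2))
lines(tstPredDat$x, exp(predict(tstFit, tstPredDat)))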
2003 Sep 04 (1 reply): error in lm.fit
Hello R user,
I have several data frames with >100 columns and I did a linear regression
over time of each column
df1.lm <- lapply(df1, function(x) lm(x~year)$coeff[2])
That worked fine and I got the slope of each column over time - until I divided
df1 by df2
df3 <- df1/df2
> df3.lm <- lapply(df3, function(x) lm(x~year)$coeff[2])
Error in lm.fit(x, y, offset = offset, ...) :
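The division can introduce non-finite values (NaN from 0/0, Inf from division by a zero in df2), which lm.fit() refuses. A sketch of one guard, assuming df1, df2 and year exist as in the post:
df3 <- df1 / df2

# keep only finite observations in each column before fitting
df3.lm <- lapply(df3, function(x) {
  ok <- is.finite(x)
  coef(lm(x[ok] ~ year[ok]))[2]   # slope over time, as in the original call
})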
2011 Sep 16 (3 replies): Help writing basic loop
Hello,
I would like to write a loop to 1) run 100 linear regressions, and 2)
compile the slopes of all regressions into one vector. Sample input data
are:
y1<-rnorm(100, mean=0.01, sd=0.001)
y2<-rnorm(100, mean=0.1, sd=0.01)
x<-(c(10,400))
#I have gotten this far with the loop
for (i in 1:100) {
#create the linear model for each data set
model1<-lm(c(y1[i],y2[i])~x)
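A sketch that completes the loop under the setup shown above: each of the 100 regressions pairs (y1[i], y2[i]) with the two x values, and the slope goes into a preallocated vector (the names model1, y1, y2 and x follow the post):
slopes <- numeric(100)            # preallocate the result vector
for (i in 1:100) {
  model1 <- lm(c(y1[i], y2[i]) ~ x)
  slopes[i] <- coef(model1)[2]    # second coefficient is the slope
}

# equivalent without an explicit loop
slopes <- sapply(1:100, function(i) coef(lm(c(y1[i], y2[i]) ~ x))[2])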
2003 Jul 21 (5 replies): how to test whether two slopes are sign. different?
Not really r-specific:
Z = (b1 - b2) / sqrt(SEb1^2 + SEb2^2)
-------Original Message-------
From: Gijsbert Stoet <stoet at volition.wustl.edu>
Sent: 07/20/03 09:51 PM
To: r-help at stat.math.ethz.ch
Subject: [R] how to test whether two slopes are sign. different?
>
> Hi,
suppose I want to test whether the slopes (e.g. determined with
lsfit) of two different populations are
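A sketch of that Z test in R, assuming two independent fits on hypothetical data frames group1 and group2, each with columns y and x, and taking the slopes and standard errors from the coefficient tables:
fit1 <- lm(y ~ x, data = group1)   # group1/group2 are hypothetical data frames
fit2 <- lm(y ~ x, data = group2)

b1  <- coef(summary(fit1))["x", "Estimate"]
se1 <- coef(summary(fit1))["x", "Std. Error"]
b2  <- coef(summary(fit2))["x", "Estimate"]
se2 <- coef(summary(fit2))["x", "Std. Error"]

Z <- (b1 - b2) / sqrt(se1^2 + se2^2)
2 * pnorm(-abs(Z))                 # two-sided p-value, normal approximation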
2010 Sep 13 (2 replies): Homogeneity of regression slopes
Hello,
We've got a dataset with several variables, one of which we're using
to split the data into 3 smaller subsets (as the variable takes one of
3 possible values).
There are several more variables too, many of which we're using to fit
regression models using lm. So I have 3 models fitted (one for each
subset of course), each having slope estimates for the predictor
variables.
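A standard way to test homogeneity of slopes without splitting the data is to fit the pooled model with and without a slope-by-group interaction and compare the two fits. A sketch assuming a data frame dat with a response y, a predictor x and the 3-level splitting variable g:
dat$g <- factor(dat$g)                      # hypothetical data frame 'dat'

fit.common   <- lm(y ~ x + g, data = dat)   # one slope, group-specific intercepts
fit.separate <- lm(y ~ x * g, data = dat)   # group-specific slopes

anova(fit.common, fit.separate)             # F test for equal slopes across groups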
2004 Mar 29 (2 replies): Confidence Intervals for slopes
Hi,
I'm trying to get confidence intervals for the slopes from a linear model
and I can't figure out how to get at them. As a cut-and-paste example:
#################
# dummy dataset - regression data for 3 treatments, each treatment with
different (normal) variance
x <- rep(1:10, length=30)
y <- 10 - (rep(c(0.2,0.5,0.8), each=10)*x)+c(rnorm(10, sd=0.1),
rnorm(10,
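For a fitted lm object, confint() returns confidence intervals for every coefficient, slopes included. A sketch (the model formula and data frame name below are placeholders for whatever is fitted to the dummy data):
fit <- lm(y ~ x * treatment, data = dummy)  # hypothetical model and data frame

confint(fit)                 # 95% intervals for every coefficient
confint(fit, level = 0.99)   # or another confidence level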
2009 Apr 08 (2 replies): Null-Hypothesis
Hello R users,
I've used the following help to compare two regression line slopes.
I wanted to test whether they differ significantly:
Hi,
I've made a research about how to compare two regression line slopes
(of y versus x for 2 groups, "group" being a factor ) using R.
I knew the method based on the following statement :
t = (b1 - b2) / sb1,b2
where b1 and b2 are the two slope
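Equivalently, the whole comparison fits into a single model: with a two-level grouping factor, the interaction coefficient of lm(y ~ x * group) estimates the difference between the two slopes, and its t value in summary() is exactly the test of equal slopes. A sketch with hypothetical column names:
dat$group <- factor(dat$group)        # hypothetical data frame 'dat'
fit <- lm(y ~ x * group, data = dat)

# the row for the x:group interaction gives the slope difference, its SE, t and p
summary(fit)$coefficients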
2009 Mar 13 (2 replies): Question on summing rows within nested variable
Hi,
I was hoping someone could help figure out how to write code for R to do the below.
I have data that looks like below. The variables sid and pid are strings; slope is numeric. I need R to give me the mean of the slopes over all pid's nested within each sid when there is more than one pid nested within that sid.
If there is only one pid for a sid, like for 2.1 below, I want R to write a 0.
In the
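A sketch of one way to do this with ave(), assuming a data frame in which each row is one pid within a sid (the example values are made up):
# hypothetical example data
d <- data.frame(sid   = c("1.1", "1.1", "2.1"),
                pid   = c("a", "b", "c"),
                slope = c(0.5, 0.7, 0.9),
                stringsAsFactors = FALSE)

n.pid <- ave(d$slope, d$sid, FUN = length)          # pid count within each sid
d$result <- ifelse(n.pid > 1,
                   ave(d$slope, d$sid, FUN = mean), # mean slope within sid
                   0)                               # single pid -> 0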
2010 Nov 06 (1 reply): SMATR common slopes test
Hi All,
I am confused by SMATR's test for a common slope. My null hypothesis here is
that all slopes are parallel (a common slope), right?
So if I get a p value < 0.05, does that mean we can confidently reject it,
i.e. that the slopes are different?
Or is it the other way around: does it mean we have statistical confidence
that the slopes are parallel?
thanks
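For reference on the interpretation: in SMATR's common-slope test the null hypothesis is that all groups share one (S)MA slope, so a small p value is evidence against a common slope, i.e. that the slopes differ. A sketch with the formula interface of smatr version 3 (the data frame and column names are hypothetical):
library(smatr)

# test of a common SMA slope across the levels of 'species'
fit <- sma(height ~ diameter * species, data = dat)
fit   # the printout includes the common-slope test: small p => slopes differ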
2008 Dec 09 (3 replies): Significance of slopes
Hello R community,
I have a question regarding correlation and regression analysis. I have
two variables, x and y. Both have a standard deviation of 1; thus,
correlation and slope from the linear regression (which also must have
an intercept of zero) are equal.
I want to probe two particular questions:
1) Is the slope significantly different from zero? This should be easy
with the lm
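For question 1, the coefficient table of summary(lm(...)) already carries the t test of slope = 0; a no-intercept fit, as described in the post, works the same way. A sketch with placeholder standardized data:
# hypothetical standardized variables
x <- scale(rnorm(50))[, 1]
y <- scale(0.4 * x + rnorm(50))[, 1]

fit <- lm(y ~ x)
summary(fit)$coefficients["x", ]   # slope, SE, t value, Pr(>|t|) for H0: slope = 0

# forcing the intercept to zero, as described in the post
fit0 <- lm(y ~ x - 1)
summary(fit0)$coefficients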
2008 Mar 05 (1 reply): testing for significantly different slopes
Hi,
How would one go about determining whether the slope terms from an analysis of
covariance model are different from each other?
Based on the example from MASS:
library(MASS)
# parallel slope model
l.para <- lm(Temp ~ Gas + Insul, data=whiteside)
# multiple slope model
l.mult <- lm(Temp ~ Insul/Gas -1, data=whiteside)
# compare nested models:
anova(l.para, l.mult)
Analysis of Variance
2008 May 16 (1 reply): Making slope coefficients ``relative to 0''.
I am interested in whether the slopes in a linear model are different
from 0.
I.e. I would like to obtain the slope estimates, and their standard
errors,
``relative to 0'' for each group, rather than relative to some baseline.
Explicitly I would like to write/represent the model as
y = a_i + b_i*x + E
i = 1, ..., K, where x is a continuous variate and i indexes groups
(levels of a
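One standard parameterization for this: nest x within the grouping factor and drop the overall intercept, so that lm reports a separate intercept a_i and slope b_i for each group, each with its own standard error and t test against 0. A sketch with hypothetical names:
dat$grp <- factor(dat$grp)               # hypothetical data frame 'dat'

# per-group intercepts and per-group slopes, no baseline contrasts
fit <- lm(y ~ grp + grp:x - 1, data = dat)

summary(fit)$coefficients   # one intercept row per group, then one slope row per group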
2007 Aug 27 (2 replies): validate (package Design): error message "subscript out of bounds"
Dear R users
I use Windows XP, R2.5.1 (I have read the posting guide, I have
contacted the package maintainer first, it is not homework).
In a research project on renal cell carcinoma we want to compute
Harrell's c index, with optimism correction, for a multivariate
Cox regression and also for some univariate Cox models.
For some of these univariate models I have encountered an error
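For context, the usual Design workflow for an optimism-corrected c index is a bootstrap validate() on a cph() fit stored with x=TRUE and y=TRUE, converting the corrected Dxy via c = Dxy/2 + 0.5. A sketch with hypothetical variable names (this does not address the subscript error itself):
library(Design)   # the package now lives on as 'rms'; library(rms) works the same way

# the Cox model must keep the design matrix and response for resampling
f <- cph(Surv(time, status) ~ age + grade, data = dat,
         x = TRUE, y = TRUE, surv = TRUE)

set.seed(1)
v <- validate(f, B = 200)            # bootstrap optimism correction
Dxy <- v["Dxy", "index.corrected"]
(Dxy + 1) / 2                        # optimism-corrected Harrell's c index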
2006 May 19 (1 reply): Weird LM behaviour
Dear R users,
experimenting with the lm function in R, I've encountered some
behaviour I don't understand with my limited knowledge of regression.
I made a data-'set' of three measurements (see syntax below). Using
lm (linear model) to fit the regression-line, I expected to find an
intercept of 2.0 and a slope of 0, but in fact the slope is slightly
below zero. Amazed by
2000 Oct 09 (4 replies): lm question
I have not really used lm before and I was hoping for some help on a
simple problem.
Here is a toy version of the problem I want to solve.
y x grp
-.9 1 a
-.8 2 a
-.7 3 a
-.7 1.5 b
-.5 2.5 b
-.3 3.5 b
-.19 2.7 c
-.11 3.7 c
-.41 4.7 c
I want to fit a model that has one y-intercept and three slopes, one for
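A sketch of one way to get a single intercept with a separate slope per group: code grp as a factor and include an x-within-group interaction without a grp main effect, using the toy data above:
toy <- data.frame(
  y   = c(-.9, -.8, -.7, -.7, -.5, -.3, -.19, -.11, -.41),
  x   = c(1, 2, 3, 1.5, 2.5, 3.5, 2.7, 3.7, 4.7),
  grp = factor(c("a", "a", "a", "b", "b", "b", "c", "c", "c")))

# common intercept, slope estimated separately for grp a, b and c
fit <- lm(y ~ x:grp, data = toy)
coef(fit)   # (Intercept), x:grpa, x:grpb, x:grpc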
2006 Nov 01 (1 reply): Compare linear regressions for significant differences of the slopes
Hi
I have (8 measures * 96 groups) = 768 datasets for which I did linear
regressions using lm().
Now I want to compare the slopes for each of the 8 measures in each of
the 96 groups. As I understand it, I cannot use
> anova(lm1, ..., lm8)
as the lm1 ... lm8 are based on different datasets.
I also read in previous discussions on this list that I can see if the
slope +- stddev(slope)
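If the raw data behind the 768 fits are still available, a cleaner route than comparing the stored lm objects is to stack the data for each measure and test a slope-by-group interaction, looping over the 8 measures. A sketch assuming a long-format data frame with columns measure, group, x and y:
dat$group <- factor(dat$group)   # hypothetical long-format data frame 'dat'

# one test of equal slopes across the 96 groups, per measure
slope.tests <- lapply(split(dat, dat$measure), function(d) {
  anova(lm(y ~ x + group, data = d),    # common slope
        lm(y ~ x * group, data = d))    # group-specific slopes
})
slope.tests[["measure1"]]   # F test for one measure (hypothetical name)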