similar to: comparing regression slopes

Displaying 20 results from an estimated 3000 matches similar to: "comparing regression slopes"

2005 Mar 27
1
p values when using rlm
R 2.0.1, Linux. I am using rlm() to fit a model, e.g. fit1 <- rlm(y~x); my actual model is more complex than the one shown. When I enter summary(fit1) I get estimates for the model's coefficients along with their SEs and t values, but no p values: the p value column is blank. Similarly, when I enter anova(fit1) I get DF, Sum Sq, and Mean Sq, but the columns for F value and Pr(>F) are blank. Any
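A minimal sketch of one common workaround is to take the t values that summary() does report and compute approximate two-sided p values against a t reference distribution; the example model and the approximation itself are assumptions on my part, not built-in rlm() behaviour.
library(MASS)
# sketch: approximate p values for rlm() coefficients
fit1 <- rlm(stack.loss ~ Air.Flow + Water.Temp, data = stackloss)
ct <- coef(summary(fit1))                       # columns: Value, Std. Error, t value
dfres <- length(residuals(fit1)) - length(coef(fit1))
pvals <- 2 * pt(abs(ct[, "t value"]), df = dfres, lower.tail = FALSE)
cbind(ct, "Pr(>|t|)" = pvals)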
2007 Jun 07
3
rlm results on trellis plot
How do I add to a trellis plot the best fit line from a robust fit? I can use panel.lm to add a least squares fit, but there is no panel.rlm function. -- Alan S Barnett <asb at mail.nih.gov> NIMH/CBDB
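One possible approach, sketched below, is a small custom panel function that fits rlm() inside each panel and overlays the fitted line with panel.abline(); the data frame and variable names here are placeholders, not from the post.
library(lattice)
library(MASS)
# sketch: a hand-rolled panel.rlm-style panel function
panel.rlmline <- function(x, y, ...) {
  panel.xyplot(x, y, ...)
  fit <- rlm(y ~ x)
  panel.abline(coef(fit))
}
xyplot(y ~ x | group, data = mydata, panel = panel.rlmline)  # mydata is hypothetical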
2004 Feb 19
1
Comparing two regression slopes
I would suggest the method of Sokal and Rohlf (1995), p. 498, using the F test. Below I repeat the analysis by Spencer Graves: Spencer: > df1 <- data.frame(x=1:3, y=1:3+rnorm(3)) > df2 <- data.frame(x=1:3, y=1:3+rnorm(3)) > fit1 <- lm(y~x, df1) > s1 <- summary(fit1)$coefficients > fit2 <- lm(y~x, df2) > s2 <- summary(fit2)$coefficients > db <-
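For comparison, the sketch below runs the same test as a single model with a group-by-slope interaction; the x:g row of the anova table is then the F test for equal slopes (rnorm() of course makes the numbers differ from the quoted run).
df1 <- data.frame(x = 1:3, y = 1:3 + rnorm(3))
df2 <- data.frame(x = 1:3, y = 1:3 + rnorm(3))
both <- rbind(cbind(df1, g = "a"), cbind(df2, g = "b"))
fit  <- lm(y ~ x * g, data = both)
anova(fit)   # the x:g row tests whether the two slopes differ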
2003 Jul 30
2
Comparing two regression slopes
Hello, I've written a simple (although probably overly roundabout) function to test whether two regression slope coefficients from two linear models on independent data sets are significantly different. I'm a bit concerned, because when I test it on simulated data with different sample sizes and variances, the function seems to be extremely sensitive to both of these. I am wondering if
2009 Jan 27
3
How to compare two regression line slopes
Hi, I've done some research on how to compare two regression line slopes (of y versus x for 2 groups, "group" being a factor) using R. I know of a method based on the following statement: t = (b1 - b2) / sb1,b2, where b1 and b2 are the two slope coefficients and sb1,b2 is the pooled standard error of the slopes, which can be calculated in R this way: > df1 <-
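A sketch of the full calculation that the formula describes, assuming df1 and df2 are data frames with columns x and y (as in the truncated code above):
fit1 <- lm(y ~ x, data = df1)
fit2 <- lm(y ~ x, data = df2)
s1 <- coef(summary(fit1))["x", ]
s2 <- coef(summary(fit2))["x", ]
t_stat <- (s1["Estimate"] - s2["Estimate"]) /
          sqrt(s1["Std. Error"]^2 + s2["Std. Error"]^2)
df <- fit1$df.residual + fit2$df.residual
p_value <- 2 * pt(-abs(t_stat), df)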
2009 Feb 12
0
Comparing slopes in two linear models
Hi everyone, I have a data frame (d) which has the results of mosquito trapping in three different places. I suspect that one of these places (Local=='Palm') is biased by low numbers and will yield shallower slopes in the variance-mean regression over the areas. I wonder if these slopes are different. I've looked through the support list for methods for comparing slopes and found the
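A hedged sketch of one way to test this, letting the slope differ by place through an interaction and comparing nested fits; the column names mean, variance, and Local are my guesses from the description, not the actual data frame layout.
fit_common <- lm(log(variance) ~ log(mean), data = d)
fit_byloc  <- lm(log(variance) ~ log(mean) * Local, data = d)
anova(fit_common, fit_byloc)  # F test: do place-specific slopes/intercepts improve the fit?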
2009 Apr 08
2
Null-Hypothesis
Hello R users, I've used the following help to compare two regression line slopes; I wanted to test whether they differ significantly: Hi, I've done some research on how to compare two regression line slopes (of y versus x for 2 groups, "group" being a factor) using R. I know of a method based on the following statement: t = (b1 - b2) / sb1,b2, where b1 and b2 are the two slope
2011 Mar 25
2
A question on glmnet analysis
Hi, I am trying to do logistic regression for data on 104 patients, which have one outcome (yes or no) and 15 variables (9 categorical factors [yes or no] and 6 continuous variables). The number of "yes" outcomes is 25. Twenty-five events and 15 variables means the events-per-variable ratio is well below 10. Therefore, I tried to analyze the data with a penalized regression method. I would like please some of the
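A sketch of what a penalized (lasso) logistic fit for data of this shape could look like; the objects patients and outcome are placeholder names, not the poster's actual data.
library(glmnet)
X <- model.matrix(outcome ~ ., data = patients)[, -1]    # dummy-code the 9 factors
cvfit <- cv.glmnet(X, patients$outcome, family = "binomial", alpha = 1)
coef(cvfit, s = "lambda.min")    # coefficients at the CV-chosen penalty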
2011 Dec 05
1
about interpretation of anova results...
The quantreg package is used. *fit1 results are*
Call: rq(formula = op ~ inp1 + inp2 + inp3 + inp4 + inp5 + inp6 + inp7 + inp8 + inp9, tau = 0.15, data = wbc)
Coefficients:
 (Intercept)         inp1         inp2         inp3         inp4         inp5
-0.191528450  0.005276347  0.021414032  0.016034803  0.007510343  0.005276347
        inp6         inp7         inp8         inp9
 0.058708544
2008 Jan 05
1
Likelihood ratio test for proportional odds logistic regression
Hi, I want to do a global likelihood ratio test for the proportional odds logistic regression model and am unsure how to go about it. I am using the polr() function in library(MASS). 1. Is the p-value from the likelihood ratio test obtained by anova(fit1,fit2), where fit1 is the polr model with only the intercept and fit2 is the full polr model (refer to example below)? So in the case of the
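A minimal sketch of the comparison being described, assuming a data frame dat with an ordered response y and predictors x1 and x2 (the names are placeholders):
library(MASS)
fit1 <- polr(y ~ 1, data = dat)          # intercept-only model
fit2 <- polr(y ~ x1 + x2, data = dat)    # full model
anova(fit1, fit2)                        # likelihood ratio (chi-square) test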
2011 Oct 06
1
anova.rq {quantreg} - Why do different levels of nesting change the P values?!
Hello dear R-help members. I am trying to understand anova.rq, and I am finding something which I cannot explain (is it a bug?!): the example uses 3 nested models. I run the anova once on the first two models, and again on all three. I expect that the p value for the comparison of model 1 and model 2 would remain the same whether or not I add a third model to be compared
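A self-contained sketch of the setup being described, using the stackloss data shipped with R (my choice, not the original data), so the two anova() calls can be compared directly:
library(quantreg)
f1 <- rq(stack.loss ~ Air.Flow, data = stackloss)
f2 <- rq(stack.loss ~ Air.Flow + Water.Temp, data = stackloss)
f3 <- rq(stack.loss ~ Air.Flow + Water.Temp + Acid.Conc., data = stackloss)
anova(f1, f2)        # two-model comparison
anova(f1, f2, f3)    # the same pair inside a three-model call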
2017 Dec 20
1
Nonlinear regression
You also need to reply-all so the mailing list stays in the loop. -- Sent from my phone. Please excuse my brevity. On December 19, 2017 4:00:29 PM PST, Timothy Axberg <axbergtimothy at gmail.com> wrote: >Sorry about that. Here is the code typed directly on the email. > >qe = (Qmax * Kl * ce) / (1 + Kl * ce) > >##The data >ce <- c(15.17, 42.15, 69.12, 237.7, 419.77)
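A hedged sketch of fitting that equation with nls(); the response vector qe and the starting values are assumptions, since the snippet is cut off before they appear.
ce <- c(15.17, 42.15, 69.12, 237.7, 419.77)
# qe is assumed to hold the measured responses (not shown above)
fit <- nls(qe ~ (Qmax * Kl * ce) / (1 + Kl * ce),
           start = list(Qmax = max(qe), Kl = 0.01))
summary(fit)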
2004 Dec 20
2
problems with limma
I tried to send this message to Gordon Smyth at smyth at vehi,edu.au, but it bounced back, so here it is to r-help. I am trying to use limma, which I just downloaded from CRAN. I use R 2.0.1 on Win XP; see the following: > library(RODBC) > chan1 <- odbcConnectExcel("D:/Data/mgc/Chips/Chips4.xls") > dd <- sqlFetch(chan1,"Raw") # all data 12000 > # > nzw <-
2008 Apr 17
1
survreg() with frailty
Dear R-users, I have noticed small discrepancies between the estimate of the frailty variance reported by the print method for survreg() and the 'theta' component included in the fitted object:
# Examples in R-2.6.2 for Windows
library(survival)  # version 2.34-1 (2008-03-31)
# discrepancy
fit1 <- survreg(Surv(time, status) ~ rx + frailty(litter), rats)
fit1
fit1$history[[1]]$theta
2011 Jan 05
1
Comparing fitting models
Dear all, I have 3 models (from simple to complex) and I want to compare them to see whether they fit equally well or not. From the R prompt I am not able to see where I can get this information. Let's do an example:
fit1 <- lm(response ~ stimulus + condition + stimulus:condition, data=scrd)  # equivalent to lm(response ~ stimulus*condition, data=scrd)
fit2 <- lm(response ~ stimulus +
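A sketch of how the three fits could be compared once all are specified; the exact formulas for fit2 and fit3 below are my guesses from the pattern, not the poster's models.
fit1 <- lm(response ~ stimulus * condition, data = scrd)
fit2 <- lm(response ~ stimulus + condition, data = scrd)
fit3 <- lm(response ~ stimulus, data = scrd)
anova(fit3, fit2, fit1)   # sequential F tests, simplest to most complex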
2012 Nov 08
2
Comparing nonlinear, non-nested models
Dear R users, Could somebody please help me to find a way of comparing nonlinear, non-nested models in R, where the number of parameters is not necessarily different? Here is a sample (growth rates, y, as a function of internal substrate concentration, x): x <- c(0.52, 1.21, 1.45, 1.64, 1.89, 2.14, 2.47, 3.20, 4.47, 5.31, 6.48) y <- c(0.00, 0.35, 0.41, 0.49, 0.58, 0.61, 0.71, 0.83, 0.98,
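One common route for non-nested fits, even with the same number of parameters, is an information criterion; the sketch below assumes two illustrative model forms and that y is the complete response vector from the post (it is cut off above).
x <- c(0.52, 1.21, 1.45, 1.64, 1.89, 2.14, 2.47, 3.20, 4.47, 5.31, 6.48)
fit_mm  <- nls(y ~ Vm * x / (K + x),      start = list(Vm = 1, K = 1))
fit_exp <- nls(y ~ A * (1 - exp(-k * x)), start = list(A = 1, k = 1))
AIC(fit_mm, fit_exp)   # lower AIC = better fit/complexity trade-off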
2011 Jan 26
2
Extracting the terms from an rpart object
Hello all, I wish to extract the terms from an rpart object. Specifically, I would like to be able to know what is the response variable (so I could do some manipulation on it). But in general, such a method for rpart will also need to handle a "." case (see fit2) Here are two simple examples: fit1 <- rpart(Kyphosis ~ Age + Number + Start, data=kyphosis) fit1$call fit2 <-
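A sketch of one way to recover the response name that also copes with the "." formula, by reading it off the stored terms object rather than the call; this is my suggestion, not an rpart-provided accessor.
library(rpart)
fit1 <- rpart(Kyphosis ~ Age + Number + Start, data = kyphosis)
fit2 <- rpart(Kyphosis ~ ., data = kyphosis)
response_name <- function(fit) {
  tt <- terms(fit)
  as.character(attr(tt, "variables"))[attr(tt, "response") + 1]
}
response_name(fit1)   # "Kyphosis"
response_name(fit2)   # "Kyphosis" (the "." has been expanded in the terms)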
2009 Jul 28
2
A hiccup when using anova on gam() fits.
I stumbled across a mild glitch when trying to compare the result of gam() fitting with the result of lm() fitting. The following code demonstrates the problem:
library(gam)
x <- rep(1:10,10)
set.seed(42)
y <- rnorm(100)
fit1 <- lm(y~x)
fit2 <- gam(y~lo(x))
fit3 <- lm(y~factor(x))
print(anova(fit1,fit2)) # No worries.
print(anova(fit1,fit3)) # Likewise.
print(anova(fit2,fit3)) #
2011 Sep 07
2
reporting ANOVA for nested models
I have the following results for an ANOVA comparing two nested models. I wasn't sure how I am supposed to report this result in psychology. Specifically, am I supposed to report the DFs or just the F ratio? I could manually calculate the degrees of freedom, but there must be a reason why R does not give this information, i.e. they are not conventionally used in the
2010 Sep 21
1
package gbm, predict.gbm with offset
Dear all, the help file for predict.gbm states that "The predictions from gbm do not include the offset term. The user may add the value of the offset to the predicted value if desired." I am just not sure how exactly, especially for a Poisson model, where I believe the offset is multiplicative? For example: library(MASS) fit1 <- glm(Claims ~ District + Group + Age +
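A sketch of adding the offset back by hand for a Poisson gbm fitted with a log(exposure) offset; the fitted object gbm_fit, the new data newdat, and the exposure column Holders are assumptions (the quoted code shows a glm, not the actual gbm call).
library(gbm)
pred_link  <- predict(gbm_fit, newdata = newdat, n.trees = 1000)  # link scale, offset not included
pred_count <- exp(pred_link + log(newdat$Holders))                # restore the multiplicative offset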