similar to: Multiple Regression

Displaying 20 results from an estimated 70000 matches similar to: "Multiple Regression"

2012 Jul 27
1
Understanding the intercept value in a multiple linear regression with categorical values
Hi! I'm failing to understand the value of the intercept in a multiple linear regression with categorical variables. Taking the "warpbreaks" data set as an example, when I do:
> lm(breaks ~ wool, data=warpbreaks)
Call: lm(formula = breaks ~ wool, data = warpbreaks)
Coefficients:
(Intercept)        woolB
     31.037       -5.778
I'm able to understand that the value of
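With treatment contrasts (R's default for unordered factors) the intercept is the mean of the reference level and the remaining coefficients are differences from it. A minimal sketch, using the warpbreaks data that ships with R:

  fit <- lm(breaks ~ wool, data = warpbreaks)
  coef(fit)
  # (Intercept) is the mean number of breaks for the reference level wool A;
  # woolB is the difference between the wool B mean and the wool A mean.
  mean(warpbreaks$breaks[warpbreaks$wool == "A"])         # matches the intercept (~31.04)
  diff(tapply(warpbreaks$breaks, warpbreaks$wool, mean))  # matches the woolB coefficient (~ -5.78)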
2006 Aug 31
0
Pretty-printing multiple regression models
A few days ago, I had asked this question. Consider this situation:
> x1 <- runif(100); x2 <- runif(100); y <- 2 + 3*x1 - 4*x2 + rnorm(100)
> m1 <- summary(lm(y ~ x1))
> m2 <- summary(lm(y ~ x2))
> m3 <- summary(lm(y ~ x1 + x2))
You have estimated 3 different "competing" models, and suppose you want to present the set of models in one table. xtable(m1) is
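One way to get a single table (a sketch, not necessarily the approach the poster settled on) is to keep the fits themselves in a list, extract each coefficient matrix, and stack them into one data frame that xtable() can format in a single call:

  set.seed(1)
  x1 <- runif(100); x2 <- runif(100); y <- 2 + 3*x1 - 4*x2 + rnorm(100)
  models <- list(m1 = lm(y ~ x1), m2 = lm(y ~ x2), m3 = lm(y ~ x1 + x2))
  tabs <- lapply(names(models), function(nm) {
    cf <- summary(models[[nm]])$coefficients
    data.frame(model = nm, term = rownames(cf), cf, row.names = NULL, check.names = FALSE)
  })
  combined <- do.call(rbind, tabs)
  combined          # xtable(combined) then produces one table covering all three models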
2008 Feb 03
1
Effect size of comparison of two levels of a factor in multiple linear regression
Dear R users, I have a linear model of the kind outcome ~ treatment + covariate where 'treatment' is a factor with three levels ("0", "1", and "2"), and the covariate is continuous. Treatments "1" and "2" both have regression coefficients significantly different from 0 when using treatment contrasts with treatment "0" as the
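To compare levels "1" and "2" directly, one sketch (assuming a hypothetical data frame dat with columns outcome, treatment and covariate) is to refit with level "1" as the reference, so the "1" vs "2" contrast appears with its own estimate and standard error:

  fit <- lm(outcome ~ treatment + covariate, data = dat)   # treatment "0" as the baseline
  dat$treatment <- relevel(dat$treatment, ref = "1")       # make "1" the baseline instead
  fit12 <- lm(outcome ~ treatment + covariate, data = dat)
  summary(fit12)$coefficients["treatment2", ]              # adjusted "2" minus "1" difference, SE, t, p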
2005 Sep 01
0
Robust Regression - LTS
Hi, I am using robust regression, i.e. model.robust <- ltsreg(MXD~ORR, data=DATA). My question: is there any way to determine the robust Multiple R-Squared (as returned in the summary output in S-PLUS)? I found an equivalent model in the rrcov package which included R-square, residuals etc. in its list of components, but when I used this package the only results returned were equivalent to
2010 Sep 24
0
multivariate multiple regression coefficient
Hi, I am considering multivariate multiple regression where the response and the predictor are each 5-dimensional. As part of an ongoing procedure I compute
initb <- eigen(temp)$vectors[,1:3]
temp <- vhalf%*%Kproduct(Ir,initb)
(here 'vhalf' is the square root of the inverse of the covariance matrix and 'Kproduct' is the Kronecker product) and then run the regression. However, I have tried using the
2009 Dec 17
2
Testing equality of regression model on multiple groups
Hello, I'm trying to test for the joint equality of coefficients of the same model across different subsets of the data (i.e., is it necessary to estimate the same model on these different populations, or can I just estimate the model once on the whole dataset?). My plan is to use the F-test on the reduced model and the full model. By full model, I mean a specification that mimics my
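A sketch of that comparison with anova(), assuming a hypothetical data frame dat with outcome y, predictors x1 and x2, and a grouping factor g defining the subsets:

  reduced <- lm(y ~ x1 + x2, data = dat)        # one pooled model for all groups
  full    <- lm(y ~ g * (x1 + x2), data = dat)  # separate intercept and slopes per group
  anova(reduced, full)                          # F-test of the joint equality of coefficients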
2012 Jul 13
1
Accessing coefficient values in linear regression
Hi everyone, I am fitting a simple linear regression model in R. My command is
j = lm(Y ~ Sex + begsal + time + int)
Call: lm(formula = Y ~ Sex + begsal + time + int)
Coefficients:
(Intercept)       Sex    begsal      time       int
    191.916  -241.805     3.969     5.003     3.040
Now I wish to access the values of these coefficients for other purposes
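The fitted coefficients live inside the lm object; a minimal sketch with the poster's fit (assuming the variables are in the workspace):

  j <- lm(Y ~ Sex + begsal + time + int)
  coef(j)                    # named numeric vector of all coefficients
  coef(j)["begsal"]          # a single coefficient, selected by name
  summary(j)$coefficients    # matrix with estimates, std. errors, t values and p-values
  confint(j)                 # confidence intervals, if those are needed as well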
2011 Sep 19
1
regression summary results: p-values and coefficients into an Excel file
Hi All, I have run many regression analyses (14000+) and want to collect the coefficients and p-values into an Excel file. I can get the statements below to work up to step 4. I can print out the regression results (sample output below). So my hope is to run something like steps 5 and 6 and put the p-values (and then coefficients) into an Excel file. Can anyone suggest what I am doing wrong or a
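One common pattern (a sketch, assuming the fitted models are held in a hypothetical list called fits) is to pull summary(...)$coefficients from each model, stack the rows, and write a CSV file that Excel opens directly:

  rows <- lapply(seq_along(fits), function(i) {
    cf <- summary(fits[[i]])$coefficients
    data.frame(model    = i,
               term     = rownames(cf),
               estimate = cf[, "Estimate"],
               p.value  = cf[, "Pr(>|t|)"],
               row.names = NULL)
  })
  out <- do.call(rbind, rows)
  write.csv(out, "regression_results.csv", row.names = FALSE)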
2009 Jul 14
1
Interaction term in multiple regression
Hello All, Thank you for taking my question. I am looking for information on how R handles interaction terms in a multiple regression using the "lm" command. I originally noticed something was unusual when my R output did not match the output from JMP for an identical test run previously. Both programs give identical results for the main test and if the models do not contain the interaction
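A likely cause of R-vs-JMP differences with interactions is the factor coding: R defaults to treatment contrasts, while JMP reports effect (sum-to-zero) coding. A hedged sketch, with a hypothetical data frame dat containing factors A and B:

  fit_default <- lm(y ~ A * B, data = dat)   # R default: treatment contrasts
  fit_effect  <- lm(y ~ A * B, data = dat,
                    contrasts = list(A = "contr.sum", B = "contr.sum"))  # JMP-style effect coding
  # The fitted values and anova() tables agree between the two fits; the individual
  # coefficients (and their t-tests) differ because they estimate different contrasts.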
2011 Feb 07
0
FW: multivariate regression
The test is MANOVA. I tried to use the manova() function with the code below:
fit <- manova(Y ~ X)
summary(fit, test="Wilks")
but I get p-values for the intercept and the regression coefficient as in the anova() function, not for the full model.
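One way to get the multivariate test for the model as a whole (a sketch, with Y the poster's n x m response matrix and X the predictor) is to compare the fitted multivariate model against the intercept-only model with anova() on the two "mlm" objects:

  fit0 <- lm(Y ~ 1)                      # null (intercept-only) multivariate model
  fit1 <- lm(Y ~ X)                      # the model of interest
  anova(fit0, fit1, test = "Wilks")      # Wilks' lambda for the whole model, not term by term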
2006 Jan 11
0
Obtaining the adjusted r-square given the regression coefficients
Hello Alexandra, R2 is only defined for regressions with an intercept. See a decent econometrics textbook for its derivation. HTH, Bernhard
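For reference, the usual definitions (only valid when the model has an intercept), as a small sketch; adj_r2 is just an illustrative helper that reproduces what summary.lm() already reports:

  # R2     = 1 - RSS/TSS
  # adj.R2 = 1 - (1 - R2) * (n - 1) / (n - p - 1)   # n observations, p slope coefficients
  adj_r2 <- function(fit) {
    r2 <- summary(fit)$r.squared
    n  <- length(residuals(fit))
    p  <- length(coef(fit)) - 1
    1 - (1 - r2) * (n - 1) / (n - p - 1)
  }
  adj_r2(lm(dist ~ speed, data = cars))   # matches summary(lm(dist ~ speed, data = cars))$adj.r.squared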
2011 Sep 12
2
Multiple regression intercept
Hi, I am having difficulty interpreting the multiple regression output. I would like to know what it means when one of the factors is assigned as the intercept. In my data I am looking at the relationship between environmental parameters and biological production. One of my variables in the analysis is substratum type, and gravel is identified as the intercept and the P-value is significant,...
2006 May 18
1
how to get coefficients of regression or Anova
Hi R Gurus! I ran a regression and ANOVA with the following output:
Coefficients:
                        Estimate Std. Error t value Pr(>|t|)
(Intercept)             6.07e-01   5.95e-02   10.19  < 2e-16 ***
nemp                    2.87e-06   1.04e-07   27.63  < 2e-16 ***
as.factor(corridor1)A  -8.81e-02   2.13e-02   -4.14   3.6e-05 ***
as.factor(corridor1)B
2010 Feb 22
4
Alternatives to linear regression with multiple variables
I wonder if someone can give some pointers on alternatives to linear regression (e.g. Loess) when dealing with multiple variables. Taking any simple table with three variables, you can very easily get the intercept and coefficients with: summary(lm(read_table)) For obvious reasons, the coefficients in a multiple regression are quite different from what you get if you calculate regressions for
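As a hedged sketch of the kind of alternatives being asked about: loess() accepts several numeric predictors, and mgcv::gam() fits smooth additive models. The names y, x1, x2 and the data frame dat are placeholders:

  fit_lm    <- lm(y ~ x1 + x2, data = dat)                  # linear fit, for comparison
  fit_loess <- loess(y ~ x1 + x2, data = dat, span = 0.75)  # local regression in two predictors
  predict(fit_loess, newdata = data.frame(x1 = 0.5, x2 = 0.5))

  library(mgcv)                                             # smooth additive alternative
  fit_gam <- gam(y ~ s(x1) + s(x2), data = dat)
  summary(fit_gam)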
2007 Nov 28
2
fit linear regression with multiple predictor and constrained intercept
Hi group, I have this type of data: x (predictor), y (response), and a factor (grouping x into many groups, with 6-20 obs/group). I want to fit a linear regression with one common intercept; 'factor' should only modify the slopes, not the intercept. The intercept is expected to be >0. If I use y ~ x + factor, I get a different intercept for each factor level but only one slope; if I use y ~ x *
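A sketch of a formula that does this (hypothetical data frame dat with y, x and the grouping factor f): using the interaction term without the factor main effect keeps a single intercept while giving each level its own slope.

  fit <- lm(y ~ x:f, data = dat)   # one common intercept, a separate slope per level of f
  summary(fit)
  # y ~ x * f, by contrast, adds the main effect of f and therefore a separate intercept per level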
2012 Mar 21
0
multivariate ordinal probit regression vglm()
Hello, all. I'm investigating the rate at which skeletal joint surfaces pass through a series of ordered stages (changes in morphology). Current statistical methods in this type of research use various logit or probit regression techniques (e.g., proportional odds logit/probit, forward/backward continuation ratio, or restricted/unrestricted cumulative probit). Data typically include the
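For a cumulative (ordinal) probit fit, one hedged sketch uses MASS::polr, named here as a simpler alternative to VGAM's vglm(); dat, the ordered factor stage, and age are hypothetical names:

  library(MASS)
  fit <- polr(stage ~ age, data = dat, method = "probit", Hess = TRUE)
  summary(fit)   # proportional-odds style cumulative probit: common slope, ordered cutpoints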
2012 Mar 16
1
multivariate regression and lm()
Hello, I would like to perform a multivariate regression analysis to model the relationship between m responses Y1, ... Ym and a single set of predictor variables X1, ..., Xr. Each response is assumed to follow its own regression model, and the error terms in each model can be correlated. Based on my readings of the R help archives and R documentation, the function lm() should be able to
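lm() does handle this directly when the left-hand side is a matrix; a minimal sketch with hypothetical responses Y1, Y2 and predictors X1, X2 in a data frame dat:

  fit <- lm(cbind(Y1, Y2) ~ X1 + X2, data = dat)  # "mlm" object: one column of coefficients per response
  coef(fit)
  anova(fit, test = "Wilks")                      # multivariate tests that use the correlated residuals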
2007 Feb 08
0
How to get p-values, separate vectors of regression coefficients and their s.e. from the "yags" output?
Hello R-users: I am using "yags" for fitting a GEE, which gives me the same results as "Proc GENMOD". Now I have a couple of questions related to the yags output. (By the way, someone told me to run geeglm for the same analysis; I did, but did not get the same result as GENMOD, and I don't know how to correct the geeglm code so that all three agree!)
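For pulling coefficient vectors, standard errors and p-values out of a GEE fit, a sketch with geepack (one of the implementations mentioned in the thread); y, x, id and dat are hypothetical, and matching SAS GENMOD generally requires the same working correlation structure on both sides:

  library(geepack)
  fit <- geeglm(y ~ x, id = id, data = dat, family = gaussian, corstr = "exchangeable")
  cf <- summary(fit)$coefficients   # estimates, robust std. errors, Wald statistics, p-values
  cf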
2007 Nov 13
1
logistic regression model specification
Hi, I have set up a simple logistic regression model with the glm() function, with the following formula: y ~ a + b, where 'a' is a continuous variable stratified by the levels of 'b'. Looking over the manual for model specification, it seems that coefficients for unordered factors are given 'against' the first level of that factor. This makes for difficult
2012 Sep 06
0
Logit regression, I observed different results for glm or lrm (Design) for ordered factor variables
Dear useRs, I was comparing results for a logistic regression model between different libraries. The model formula is arranged as follows: response ~ (intercept) + value + group, i.e.:
glm( response ~ (intercept) + value + group , family=binomial(link='logit'))
lrm( response ~ (intercept) + value + group )
ROC( from = response ~ (intercept) + value + group ,
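A common source of glm-vs-lrm discrepancies for ordered factor predictors is the contrast coding: glm() gives ordered factors polynomial contrasts by default, while lrm codes categorical predictors differently, so the individual coefficients need not match. A hedged sketch (dat, response, value and the ordered factor group are hypothetical):

  options("contrasts")   # defaults: "contr.treatment" for unordered, "contr.poly" for ordered factors
  fit_glm <- glm(response ~ value + group, family = binomial("logit"), data = dat,
                 contrasts = list(group = "contr.treatment"))  # force treatment coding for comparison
  summary(fit_glm)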