similar to: function lm, get back the coefficient

Displaying 20 results from an estimated 2000 matches similar to: "function lm, get back the coefficient"

2005 Dec 08
1
Constraint on coefficient when fitting with lm, glm etc ...
Dear R-users, I would like to know if there is any way to constrain the optimized parameters when using the function lm, glm or others that are called in the form lm(formula, data, ...). As I understand it, formulas are of the type y ~ X1 + X2 + ... + Xi (where Y, X1, X2, ..., Xi are vectors). In my case I would like the estimates of this linear combination computed with lm to be positive. I haven't found
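One way such sign constraints are often handled is to refit the same linear predictor with nls(), whose "port" algorithm accepts box constraints on the parameters; lm() itself has no such option. A minimal sketch, assuming a data frame dat with columns y, X1 and X2:

    ## refit the linear model with nls() so that lower bounds can be imposed
    fit <- nls(y ~ b1 * X1 + b2 * X2,
               data      = dat,
               start     = list(b1 = 1, b2 = 1),
               algorithm = "port",
               lower     = c(b1 = 0, b2 = 0))   # force both estimates to be non-negative
    coef(fit)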
2008 Oct 04
1
syntax to restrict coefficient in lm()
Hi, I would like to estimate an error correction model with lm(), but I can't find the correct syntax for it. The model (leaving out the time indices) looks like: dY = a0 - a1 * (Y - b1*X) + b0*dX + e. The problem is the term -a1 * (Y - b1*X). How can I restrict a1 to be the same for both Y and -b1*X? Thanks for considering my question! All the best, Werner
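Because of the product a1*b1 the model is nonlinear in its parameters, so nls() can estimate it directly. A sketch only, assuming a data frame dat with columns dY, Ylag, Xlag and dX and arbitrary starting values:

    ## the a1 * (Ylag - b1 * Xlag) term is written out as-is; nls() estimates a1 and b1 jointly
    ecm <- nls(dY ~ a0 - a1 * (Ylag - b1 * Xlag) + b0 * dX,
               data  = dat,
               start = list(a0 = 0, a1 = 0.5, b1 = 1, b0 = 0))
    summary(ecm)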
2004 Feb 06
1
problem to get coefficient from lm()
Dear all, The following is an example that I ran, hoping to get a linear model. However, I find that lm() does not give the correct coefficients for the linear model. I hope it's just my own mistake. Please help. TIA. Regards, Jinsong > x [1] 3.760216 3.997288 3.208872 3.985417 3.265704 3.497505 2.923540 3.193937 [9] 3.102787 3.419574 3.169374 2.928510 3.153821 3.100385 3.768770 3.610583
2011 Aug 03
1
Coefficient names when using lm() with contrasts
Dear R Users, I am using lm() with contrasts as below. If I skip the contrasts() statement, I get the coefficient names > names(results$coef) [1] "(Intercept)" "VarAcat" "VarArat" "VarB", which are much more meaningful than ones based on integers. Can anyone tell me how to get R to keep the coefficient names based on the factor levels
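One common workaround, sketched here on the assumption that the factor is called VarA: contr.sum() returns a contrast matrix with no column names, so lm() falls back to integer suffixes; naming the columns of the matrix before assigning it restores readable coefficient names.

    cc <- contr.sum(levels(VarA))
    colnames(cc) <- levels(VarA)[-nlevels(VarA)]   # one label per contrast column
    contrasts(VarA) <- cc                          # lm() now reports e.g. "VarAcat"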
2008 Feb 14
2
lm, coefficient 'not defined because of singularities'? What does this mean?
Hello, I'm doing an lm(y1~x1), with no NAs in them, both of length 283. However, I get 'NA' for the estimate of x1, and summary gives: Residuals: Min 1Q Median 3Q Max -0.1998309 -0.0447269 -0.0006252 0.0390933 0.3141687 Coefficients: (1 not defined because of singularities) Estimate Std. Error t value Pr(>|t|) (Intercept) -0.021291 0.003994 -5.331 2.01e-07 *** x1
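The message means the design matrix is rank deficient: the dropped column is (numerically) a linear combination of the other columns, so lm() reports NA for its estimate; with a single predictor this usually means x1 is constant, i.e. collinear with the intercept. A small, purely illustrative reproduction of the two-predictor case:

    set.seed(1)
    x1  <- rnorm(20)
    x2  <- 2 * x1                # perfectly collinear with x1
    y   <- 1 + x1 + rnorm(20)
    fit <- lm(y ~ x1 + x2)
    summary(fit)                 # x2 shows NA, "1 not defined because of singularities"
    alias(fit)                   # reports that x2 is a linear combination of x1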
2000 Sep 26
3
lm -- significance of x coefficient when I(x^2) is used
In "Modern Applied Statistics with S-Plus" 3rd ed., footnote on page 153 regarding a model lm(Gas~Insul/(Temp+I(Temp^2))-1,whiteside), I read "Notice that when the quadratic terms are present, first degree coefficients mean 'the slope of the curve at temperature zero', so a non-significant value does not mean that the linear term is not needed.
2005 Aug 19
1
Using lm coefficients in polyroot()
Dear useRs, I need to compute the zeros of a polynomial function fitted by lm. For example, if I fit a cubic equation by fit=lm(y~x+I(x^2)+I(x^3)), I can do it simply by polyroot(fit$coefficients). But if I fit a polynomial of higher order and optimize it by stepAIC, I of course get some coefficients removed. Then, if I have the model y ~ I(x^2) + I(x^4), I cannot call polyroot in that way, because there is
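polyroot() expects the coefficients in increasing order of powers, including zeros for missing terms, so one way out is to pad the dropped powers. A sketch only; the name matching assumes the terms are literally "(Intercept)", "x", "I(x^2)", ... as in the formulas above:

    cf   <- coef(fit)                       # e.g. only (Intercept), I(x^2), I(x^4) survive stepAIC
    full <- setNames(numeric(5),
                     c("(Intercept)", "x", "I(x^2)", "I(x^3)", "I(x^4)"))
    full[names(cf)] <- cf                   # powers removed by stepAIC stay at zero
    polyroot(full)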
2006 Apr 24
1
omitting coefficients in summary.lm()
Hi, I'm running a regression using lm(), in which one of the right-hand side variables is a factor with many levels (say, 80). I am not interested in the estimates of the resulting dummies, but I have to include them in my regression equation. So, I don't want the estimates associated with these dummies to be printed by summary.lm(). Is there an easy way to do this? Thank you, Dimitri
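Rather than suppressing them inside summary.lm() itself, the usual trick is to print only the rows of the coefficient table one cares about. A sketch, assuming the large factor is called grp:

    tab <- coef(summary(fit))                       # matrix: Estimate, Std. Error, t value, Pr(>|t|)
    print(tab[!grepl("^grp", rownames(tab)), ])     # drop the rows belonging to the factor dummies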
2006 Jun 02
3
lm() variance covariance matrix of coefficients.
Hi, I am running a simple linear model with (say) 5 independent variables. Is there a simple way of getting the variance-covariance matrix of the coefficient estimates? None of the values of the lm() object seems to provide this. Thanks in advance, Ritwik Sinha rsinha@darwin.cwru.edu Grad Student Case Western Reserve University
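The variance-covariance matrix of the coefficient estimates is available directly from the vcov() extractor; the data frame dat below is only a placeholder:

    fit <- lm(y ~ x1 + x2 + x3 + x4 + x5, data = dat)
    vcov(fit)
    ## equivalently: summary(fit)$sigma^2 * summary(fit)$cov.unscaled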
2010 Apr 29
1
lm() with non-linear coefficients constraints? --- nls?
Dear R experts, quick question. I need to estimate a model that looks like y = (b*T+d*T^3) + (1-b-3*d*T^2)*x + (3*d*T)*x^2 + (-d)*x^3. I only have three parameters. Is nls() the right tool for the job, or is there something faster/better? /iaw ---- Ivo Welch (ivo.welch@brown.edu, ivo.welch@gmail.com)
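nls() can fit this form directly by writing out the whole right-hand side. A sketch only: dat and the starting values are assumptions, and the parameter T is renamed TT here to avoid clashing with R's built-in T (TRUE):

    fit <- nls(y ~ (b * TT + d * TT^3) + (1 - b - 3 * d * TT^2) * x +
                   (3 * d * TT) * x^2 + (-d) * x^3,
               data  = dat,
               start = list(b = 0.5, d = 0.1, TT = 1))
    summary(fit)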
2011 May 11
1
displaying derived coefficients in lm
Hello R-help, Is there a way to get R to tell you the coefficients in an lm that it wouldn't normally report because of identifiability constraints? For instance, if you use contr.sum() to generate contrasts for a factor, say ## y <- some data ## x <- a factor with levels 1:6 contrasts(x) <- contr.sum(levels(x)) lm.1 <- lm(y ~ x) how would one persuade summary.lm to give the
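dummy.coef() in the stats package expands the fit to show a coefficient for every factor level, including the one implied by the sum-to-zero constraint, which is usually what is wanted here:

    dummy.coef(lm.1)     # full set of level coefficients for the fit above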
2007 Aug 24
2
Est of SE of coefficients from lm() function
Dear all R users, Can anyone tell me how I can get the estimated SEs of the coefficients from the lm() function? I tried the following: x = 1:10 lm(x[-1]~x[-10]-1)$coefficients Here I got the estimate of the coefficient; however, I also want some "automated" way to get an estimate of its SE. Regards,
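Either of the following gives the standard errors programmatically, reusing the poster's own toy fit:

    x   <- 1:10
    fit <- lm(x[-1] ~ x[-10] - 1)
    coef(summary(fit))[, "Std. Error"]   # from the coefficient table
    sqrt(diag(vcov(fit)))                # or from the variance-covariance matrix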
2007 Dec 05
1
confint for coefficients from lm model (PR#10496)
Full_Name: Christian Lajaunie Version: 2.5.1 OS: Fedora fc6 Submission from: (NULL) (193.251.63.39) confint() does not use the appropriate variance term when the design matrix contains a zero column (which of course should not happen). Example: A 10x2 matrix with trivial column 1: > junk <- data.frame(x=rep(0,10), u=factor(sample(c("Y", "N"), 10, replace=T))) The
2012 Mar 01
3
Create a function "automatically" from lm formula and coefficients?
I hope the subject says it all. I want to be able to use an lm object and the associated coefficients to create a function that can produce "expected" "y" values given inputs. Thanks, KW --
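A minimal sketch: wrap predict() so that the fitted model becomes a plain function of new predictor values. The data frame dat and the formula are placeholders; the column names of newdata must match the variables in the formula.

    make_predictor <- function(fit) {
      function(newdata) predict(fit, newdata = newdata)
    }
    f <- make_predictor(lm(y ~ x, data = dat))
    f(data.frame(x = c(1, 2, 3)))          # expected y at x = 1, 2, 3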
2004 Aug 30
2
after lm-fit: equality of two regression coefficients test
Hi, Let's assume we have a multiple linear regression, such as the one using the Scottish hills data (MASS, data(hills)): one dependent variable, time, and two independent (metric) variables, dist and climb. If I am interested, after (!) fitting an lm: my.lm <- lm(time ~ dist + climb, data = hills) in the equivalence (or non-equivalence) of the two predictors "dist" and
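A hand-rolled Wald test of H0: beta_dist = beta_climb, sketched on the same hills fit (the equivalent one-liner car::linearHypothesis(my.lm, "dist = climb") needs the car package):

    library(MASS)
    my.lm <- lm(time ~ dist + climb, data = hills)
    b <- coef(my.lm); V <- vcov(my.lm)
    est <- b["dist"] - b["climb"]
    se  <- sqrt(V["dist", "dist"] + V["climb", "climb"] - 2 * V["dist", "climb"])
    2 * pt(abs(est / se), df = df.residual(my.lm), lower.tail = FALSE)   # two-sided p-value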
2011 Aug 23
1
obtaining p-values for lm.ridge() coefficients (package 'MASS')
Dear all I'm familiarising myself with Ridge Regressions in R and the following is bugging me: How does one get p-values for the coefficients obtained from MASS::lm.ridge() output (for a given lambda)? Consider the example below (adapted from PRA [1]): > require(MASS) > data(longley) > gr <- lm.ridge(Employed ~ .,longley,lambda = seq(0,0.1,0.001)) > plot(gr) > select(gr)
2005 Jun 29
1
poly() in lm() leads to wrong coefficients (but correct residuals)
Dear all, I am using poly() in lm() in the following form. 1> DelsDPWOS.lm3 <- lm(DelsPDWOS[,1] ~ poly(DelsPDWOS[,4],3)) 2> DelsDPWOS.I.lm3 <- lm(DelsPDWOS[,1] ~ poly(I(DelsPDWOS[,4]),3)) 3> DelsDPWOS.2.lm3 <- lm(DelsPDWOS[,1]~DelsPDWOS[,4]+I(DelsPDWOS[,4]^2)+I(DelsPDWOS[,4]^3)) 1 and 2 lead to identical but wrong results. 3 is correct. Surprisingly (to me) the residuals
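The coefficients are not wrong: poly() uses an orthogonal polynomial basis by default, so the reported coefficients refer to that basis even though the fitted values and residuals are identical to the raw parameterization. To get coefficients on the raw x, x^2, x^3 scale, pass raw = TRUE:

    lm(DelsPDWOS[, 1] ~ poly(DelsPDWOS[, 4], 3, raw = TRUE))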
2006 May 18
1
how to get correct coefficients from lm model
Howdy, I apologize for the duplicated posting, but I decided to correct my previous posting. I obtained the regression results using r <- lm(Y ~ nemp + as.factor(devt), data=d). First, there is the result of anova(r); here I could not find the regression coefficients. Response: Y Df Sum Sq Mean Sq F value Pr(>F) nemp 1 58.2 58.2 1233.23 < 2e-16 ***
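anova(r) reports the sequential ANOVA table, not the coefficients; the estimates themselves come from coef() or the summary table of the same fit:

    coef(r)             # the regression coefficients
    coef(summary(r))    # estimates with standard errors, t values and p-values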
2007 Jun 20
2
Extracting t-tests on coefficients in lm
I am writing a resampling program for multiple regression using lm(). I resample the data 10,000 times, each time extracting the regression coefficients. At present I extract the individual regression coefficients using brg = lm(Newdv~Teach + Exam + Knowledge + Grade + Enroll) bcoef[i,] = brg$coef This works fine. But now I want to extract the t tests on these coefficients. I cannot
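The t statistics live in the coefficient table of summary(), so they can be collected inside the same resampling loop (btval here is an assumed matrix, pre-allocated like bcoef):

    brg <- lm(Newdv ~ Teach + Exam + Knowledge + Grade + Enroll)
    bcoef[i, ] <- coef(brg)
    btval[i, ] <- coef(summary(brg))[, "t value"]   # or "Pr(>|t|)" for the p-values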
2005 Sep 15
1
Coefficients from LM
Hi everyone, Can anyone tell me if it is possible to extract the coefficients from the lm() command? For instance, imagine that we have the following data set (the number of observations for each company is actually larger than the one shown...): Company Y X1 X2 1 y_1 x1_1 x2_1 1 y_2 x1_2 x2_2 1 y_3 x1_3 x2_3 (...) 2 y_4 x1_4 x2_4 2 y_5 x1_5 x2_5 2 y_6 x1_6 x2_6 (...) n y_n x1_n x2_n n
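If the goal is one set of coefficients per company, a common pattern is to fit the model within each group and collect the results (a sketch, assuming the data frame is called d):

    fits <- lapply(split(d, d$Company), function(s) lm(Y ~ X1 + X2, data = s))
    t(sapply(fits, coef))      # one row of coefficients per company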