search for: coefficients

Displaying 20 results from an estimated 5032 matches for "coefficients".

2023 Mar 02
1
transform.data.frame() ignores unnamed arguments when no named argument is provided
Thanks and good point about unspecified behavior. The way it behaves now (when it doesn't ignore) is more consistent with data.frame(), though, so I prefer that to a "warn and ignore" behaviour: data.frame(a = 1, b = 2, 3) #> a b X3 #> 1 1 2 3 data.frame(a = 1, 2, 3) #> a X2 X3 #> 1 1 2 3 (and in general warnings make for unpleasant debugging so I prefer
2006 Aug 15
3
question re: "summary.lm" and NA values
Is there a way to get the following code to include NA values where the coefficients are "NA"? ((summary(reg))$coefficients) Explanation: using a loop, I am running regressions on several "subsets" of "data1". "reg <- ( lm(lm(data1[,1] ~., data1[,2:l])) )" My regression has 10 independent variables, and I therefore expect 11 coefficients. After each regression, I wish to sa...
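
One way to see the difference the poster is running into - a minimal sketch with made-up collinear data (d, x1, x2 are hypothetical stand-ins, not the poster's data1): coef() keeps the NA entries for aliased terms, while summary()$coefficients silently drops those rows.

    set.seed(1)
    d <- data.frame(y = rnorm(10), x1 = rnorm(10))
    d$x2 <- d$x1                       # perfectly collinear, so lm() returns NA for x2
    fit <- lm(y ~ x1 + x2, data = d)
    coef(fit)                          # named vector, including the NA entry for x2
    summary(fit)$coefficients          # matrix with the NA row dropped
    coef(fit)[is.na(coef(fit))]        # the aliased terms, if their names are needed
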
2023 Mar 02
1
transform.data.frame() ignores unnamed arguments when no named argument is provided
On Thu, Mar 2, 2023 at 2:02 PM Antoine Fabri <antoine.fabri at gmail.com> wrote: > Thanks and good point about unspecified behavior. The way it behaves now > (when it doesn't ignore) is more consistent with data.frame() though so I > prefer that to a "warn and ignore" behaviour: > > data.frame(a = 1, b = 2, 3) > > #> a b X3 > > #> 1 1 2 3
2009 Jun 05
2
p-values from VGAM function vglm
Anyone know how to get p-values for the t-values from the coefficients produced in vglm? Attached is the code and output - see comment added to output to show where I need p-values + print(paste("********** Using VGAM function gamma2 **********")) + modl2<- vglm(MidPoint~Count,gamma2,data=modl.subset,trace=TRUE,crit="c") + p...
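
For later readers, a sketch (simulated data standing in for modl.subset, which is not shown) of turning the Wald z values in the vglm coefficient table into two-sided p-values; recent VGAM versions also report Pr(>|z|) directly in summary().

    library(VGAM)                                  # assumes the VGAM package is installed
    set.seed(1)
    d <- data.frame(Count = 1:50)
    d$MidPoint <- rgamma(50, shape = 2, rate = 2 / exp(0.02 * d$Count))
    fit  <- vglm(MidPoint ~ Count, gamma2, data = d)
    ctab <- coef(summary(fit))                     # equivalently summary(fit)@coef3
    cbind(ctab, p = 2 * pnorm(-abs(ctab[, 3])))    # column 3 is the Wald z statistic
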
2009 Sep 06
2
How to figure the type of a variable?
Hi, I want to know what the returned value of 'lm' is. 'class' and 'lm' do not show that the returned value has the variable coefficients, etc. I am wondering what command shows the detailed information. If possible, I also want the lower-level information. For example, I want to show that 'coefficients' is a named list and it has 2 elements. Regards, Peng > x=1:10 > y=1:10 > r=lm(x~y) > class(r) [1] "...
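
For what the poster is asking, str() and names() on the fitted object show every component and its type; a minimal sketch continuing the toy example from the excerpt:

    x <- 1:10; y <- 1:10
    r <- lm(x ~ y)
    class(r)                 # "lm"
    is.list(r)               # TRUE: an lm fit is a named list carrying a class attribute
    names(r)                 # "coefficients", "residuals", "fitted.values", ...
    str(r, max.level = 1)    # one line per component, with type and length
    str(r$coefficients)      # a named numeric vector of length 2, not a list
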
2009 Sep 06
1
How to refer the element in a named list?
Hi, I thought that 'coefficients' is a named list, but I cannot refer to its elements by something like r$coefficients$y. I used str() to check r. It says the following. Can somebody let me know what it means? ..- attr(*, "names")= chr [1:2] "(Intercept)" "y" $ Rscript lm.R > x=1:10 > y=1...
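
A short follow-on sketch: the coefficients component is a named numeric vector rather than a list, so it is indexed with [ or [[ and a name, not with $.

    x <- 1:10; y <- 1:10
    r <- lm(x ~ y)
    r$coefficients["y"]      # keeps the name
    r$coefficients[["y"]]    # drops the name
    coef(r)["y"]             # coef() is the usual accessor
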
2009 Apr 05
2
loop problem for extracting coefficients
Dear R users, I have a problem extracting coefficients from an object. Here, X (predictor) and Y (response) are two matrices; I am regressing X (dimensions 10 x 20) on each column of Y (e.g. Y[,1], 10 x 1) and want to store the coefficient values. I have performed an Elastic Net regression and I want to store the coefficients in each iteration. I got an error...
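
A sketch of one way to collect the coefficients across the columns of Y, using glmnet for the elastic net (the poster's package is not named, and the data below are simulated stand-ins):

    library(glmnet)                             # assumes glmnet provides the elastic-net fit
    set.seed(1)
    X <- matrix(rnorm(10 * 20), nrow = 10)      # 10 x 20 predictors
    Y <- matrix(rnorm(10 * 5),  nrow = 10)      # responses, fitted one column at a time
    coefs <- matrix(NA, nrow = ncol(X) + 1, ncol = ncol(Y))   # +1 row for the intercept
    for (j in seq_len(ncol(Y))) {
      fit <- glmnet(X, Y[, j], alpha = 0.5, lambda = 0.1)     # one fixed lambda, purely illustrative
      coefs[, j] <- as.matrix(coef(fit))[, 1]                 # coef() returns a sparse matrix
    }
    dim(coefs)                                  # 21 x 5: one column of coefficients per response
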
2012 Apr 30
0
Extracting coefficients values with bootstrap
...B) for (i in 1:B){ idx = sample(1:N, replace=T) newdata = data_As[idx,] LogRds <- log(newdata$Rds_25k+1) GeoRock <- factor(newdata$GeoRock) data_As.boot = lm(newdata$Log_Level ~ LogRds + GeoRock ) stor.r2[i] = summary(data_As.boot)$r.squared stor.inter[i] = summary(data_As.boot)$coefficients[1,1] stor.Rds[i] = summary(data_As.boot)$coefficients[2,1] stor.Bimod [i] = summary(data_As.boot)$coefficients[3,1] stor.grano[i] = summary(data_As.boot)$coefficients[4,1] stor.inter[i] = summary(data_As.boot)$coefficients[5,1] stor.mafic_vol [i] = summary(data_As.boot)$coefficients[6,1] stor.nonma...
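
A side note for readers of the script above: the whole vector of estimates can be stored per replicate instead of pulling one summary(...)$coefficients[k, 1] per term; a sketch with a built-in dataset standing in for data_As, which is not shown.

    set.seed(1)
    B    <- 200
    form <- mpg ~ wt + hp + qsec                 # stand-in formula; mtcars stands in for data_As
    fit0 <- lm(form, data = mtcars)
    boot_est <- matrix(NA, nrow = B, ncol = length(coef(fit0)),
                       dimnames = list(NULL, names(coef(fit0))))
    for (i in 1:B) {
      d             <- mtcars[sample(nrow(mtcars), replace = TRUE), ]
      boot_est[i, ] <- coef(lm(form, data = d))  # one row of estimates per bootstrap sample
    }
    head(boot_est)
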
2007 Jun 20
2
Extracting t-tests on coefficients in lm
I am writing a resampling program for multiple regression using lm(). I resample the data 10,000 times, each time extracting the regression coefficients. At present I extract the individual regression coefficients using brg = lm(Newdv~Teach + Exam + Knowledge + Grade + Enroll); bcoef[i,] = brg$coef. This works fine. But now I want to extract the t-tests on these coefficients. I cannot find how these coefficients are stored, if at all. When I...
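
The t statistics live in the coefficient matrix returned by summary(), not in the lm object itself; a minimal sketch with mtcars standing in for the poster's data:

    brg  <- lm(mpg ~ wt + hp, data = mtcars)     # stand-in for the poster's model
    ctab <- summary(brg)$coefficients            # columns: Estimate, Std. Error, t value, Pr(>|t|)
    ctab[, "t value"]                            # the t statistics on the coefficients
    ctab[, "Pr(>|t|)"]                           # their two-sided p-values
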
2008 Feb 06
2
GLM coefficients
Dear all, After running a glm, I use the summary() function to extract its coefficients and related statistics for further use. Unfortunately, the screen only displays a small (last) part of the results. I tried to overcome the problem by creating/saving an object "coef" for the coefficients of the model and exporting/saving it, e.g., as a csv document. While I succeed with this operati...
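
For the export step, writing the full coefficient table straight to disk sidesteps the screen-truncation issue entirely; a sketch with a built-in dataset, since the poster's model is not shown:

    fit  <- glm(am ~ wt + hp, family = binomial, data = mtcars)   # hypothetical stand-in model
    ctab <- coef(summary(fit))        # the complete table, regardless of what fits on screen
    write.csv(ctab, file = "glm_coefficients.csv")
    # options(max.print = ...) only changes how much is displayed; the object itself is never cut
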
2007 Jun 18
1
psm/survreg coefficient values?
...nsidered to be censored. Being new to the psm and survreg packages (and to parametric survival modeling), I am not entirely sure how to interpret the coefficient values that psm returns. I have included code similar to what I am using on my data. I suppose that the coefficients are somehow rescaled, but I am not sure how to return them to the original scale and make sense out of them, e.g., estimate the effect of higher acuity on time to event in minutes. Any explanation or direction on how to interpret the coefficient values would be greatly appreciated...
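
For later readers: psm() wraps survreg(), and for the usual accelerated-failure-time distributions the coefficients are on the log(time) scale, so exponentiating gives multiplicative effects on survival time. A sketch with the survival package's built-in lung data (not the poster's minutes-scale data):

    library(survival)
    fit <- survreg(Surv(time, status) ~ age + sex, data = lung, dist = "weibull")
    coef(fit)          # on the log(time) scale (AFT parameterisation)
    exp(coef(fit))     # multiplicative change in expected survival time per unit increase
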
2013 May 05
1
slope coefficient of a quadratic regression bootstrap
Hello, I want to know if two quadratic regressions are significantly different. I was advised to run the test as follows: step 1, bootstrap both quadratic regressions and get their slope coefficients (let's call the slope coefficients â1 and â2); step 2, use the slope difference â1 - â2 and bootstrap the slope coefficient; step 3, find the sampling distribution above and calculate the % = 0; step 4, multiply the % by 2. However, I am new to the package boot. I wrote a cod...
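
A sketch of the procedure with the boot package and made-up data (two groups, each with a quadratic trend); the statistic returns the difference between the two quadratic-term coefficients, and a percentile interval that excludes 0 suggests the curves differ.

    library(boot)
    set.seed(1)
    d   <- data.frame(g = rep(c("A", "B"), each = 50), x = runif(100, 0, 10))
    d$y <- ifelse(d$g == "A", 1 + 2.0 * d$x - 0.3 * d$x^2,
                              2 + 1.5 * d$x - 0.2 * d$x^2) + rnorm(100)

    diff_quad <- function(data, idx) {
      b  <- data[idx, ]
      cA <- coef(lm(y ~ x + I(x^2), data = b[b$g == "A", ]))["I(x^2)"]
      cB <- coef(lm(y ~ x + I(x^2), data = b[b$g == "B", ]))["I(x^2)"]
      cA - cB
    }
    bt <- boot(d, diff_quad, R = 2000, strata = factor(d$g))  # resample within each group
    boot.ci(bt, type = "perc")                                # percentile 95% CI for the difference
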
2012 Apr 30
3
95% confidence interval of the coefficients from a bootstrap analysis
Hello, I am doing a simple linear regression analysis that includes a few variables. I am using a bootstrap analysis to obtain the variation of my variables under resampling with replacement. I am trying to obtain the coefficients' 95% confidence intervals from the bootstrap procedure. Here is my script for the bootstrap: N = length (data_Pb[,1]) B = 10000 stor.r2 = rep(0,B) stor.r2 = rep(0,B) stor.inter = rep(0,B) stor.Ind5 = rep(0,B) stor.LNPRI25 = rep(0,B) stor.NPRI10 = rep(0,B) stor.Mine = rep(0,B) for (i in 1:B...
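
Once the replicate estimates are stored, a simple percentile interval is just the 2.5% and 97.5% quantiles of each stored vector; a sketch with a made-up vector of replicates standing in for one of the stor.* vectors above:

    set.seed(1)
    boot_est <- rnorm(10000, mean = 2, sd = 0.5)     # stand-in for e.g. the stored intercepts
    quantile(boot_est, probs = c(0.025, 0.975))      # percentile 95% confidence interval
    # with the boot package, boot.ci(..., type = "bca") gives bias-corrected intervals instead
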
2010 Mar 22
2
problems extracting parts of a summary object
summary(x), where x is the output of lm, produces the expected display, including standard errors of the coefficients. summary(x)$coefficients produces a vector (x is r$individual[[2]]): > r$individual[[2]]$coefficients tX(Intercept) tXcigspmkr tXpeld tXsmkpreve mn -2.449188e+04 -4.143249e+00 4.707007e+04 -3.112334e+01 1.671106e-01 mncigspmkr mnpeld mnsmkpreve 3.580065e+...
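
The distinction the poster seems to be hitting: fit$coefficients on the fitted object is just the vector of estimates, while the coefficients component of summary(fit) is the full matrix with standard errors. A sketch with a stand-in model, since r$individual[[2]] is not reproducible here:

    fit <- lm(mpg ~ wt + hp, data = mtcars)      # stand-in for r$individual[[2]]
    fit$coefficients                             # named vector of estimates only
    summary(fit)$coefficients                    # matrix: Estimate, Std. Error, t value, Pr(>|t|)
    summary(fit)$coefficients[, "Std. Error"]    # the standard errors by themselves
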
2008 Jan 16
1
nlrq coefficients query
I have been using the quantreg library for a number of projects but have just hit a snag. I am using nlrq to examine an asymptotic relationship between 2 variables at the 99th percentile. It performs as expected; however, when I try to extract the coefficients along with the SEs and significance I am running into problems. The problem is that for the nlrq regression Dat.nlrq, summary(Dat.nlrq) reports a different coefficient table than summary(Dat.nlrq)$coefficients. Below is a series of syntax (mostly from the nlrq sample script) reproducing the error. #...
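
One way to see what a summary object actually stores versus what its print method formats is to inspect it directly; a sketch roughly following the nlrq sample the poster mentions (the exact data and tau may differ from the original):

    library(quantreg)
    set.seed(1)
    Dat <- data.frame(x = rep(1:25, 20))
    Dat$y <- SSlogis(Dat$x, 10, 12, 2) * rnorm(500, 1, 0.1)
    Dat.nlrq <- nlrq(y ~ SSlogis(x, Asym, mid, scal), data = Dat, tau = 0.99)
    s <- summary(Dat.nlrq)
    str(s)             # every component the summary object carries
    s$coefficients     # compare with what print(s) formats on screen
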
2008 Mar 07
5
Puzzling coefficients for linear fitting to a polynomial
Hi, I cannot comprehend the linear fitting results for polynomials. For example, given the following data (representing y = x^2): > x <- 1:3 > y <- c(1, 4, 9) performing a linear fit > f <- lm(y ~ poly(x, 2)) gives weird coefficients: > coefficients(f) (Intercept) poly(x, 2)1 poly(x, 2)2 4.6666667 5.6568542 0.8164966 However the fitted() result makes sense: > fitted(f) 1 2 3 1 4 9 This is very confusing. How should one understand the result of coefficients()? Thanks for any tips, Firas. -- Firas Swidan, P...
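
The short answer later readers usually need: poly() uses an orthogonal polynomial basis by default; raw = TRUE (or I(x^2)) reproduces the coefficients of the familiar 1, x, x^2 parameterisation, while the fitted values are identical either way.

    x <- 1:3
    y <- c(1, 4, 9)
    coef(lm(y ~ poly(x, 2)))               # orthogonal basis: the "weird" numbers above
    coef(lm(y ~ poly(x, 2, raw = TRUE)))   # raw powers: intercept 0, x 0, x^2 1
    coef(lm(y ~ x + I(x^2)))               # the same model spelled with I()
    fitted(lm(y ~ poly(x, 2)))             # 1 4 9 in every parameterisation
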
2008 Mar 05
5
nls: different results if applied to normal or linearized data
Dear all, I did a non-linear least-squares model fit y ~ a * x^b (a) > nls(y ~ a * x^b, start=list(a=1,b=1)) to obtain the coefficients a & b. I did the same with the linearized formula, including a linear model log(y) ~ log(a) + b * log(x) (b) > nls(log10(y) ~ log10(a) + b*log10(x), start=list(a=1,b=1)) (c) > lm(log10(y) ~ log10(x)) I expected coefficient b to be identical for all three cases. However, using my datas...
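
A sketch of why the estimates are not expected to match exactly: nls() minimises squared error on the original scale, while the log-transformed fits assume multiplicative (log-scale) errors, so they answer slightly different questions (simulated data, not the poster's):

    set.seed(1)
    x <- runif(50, 1, 10)
    y <- 2 * x^1.5 * exp(rnorm(50, sd = 0.2))               # multiplicative noise
    coef(nls(y ~ a * x^b, start = list(a = 1, b = 1)))["b"] # least squares on the original scale
    coef(lm(log10(y) ~ log10(x)))[2]                        # least squares on the log scale:
                                                            # a different error model, so a
                                                            # (slightly) different b is expected
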
2006 Oct 25
3
simplification of code using stamp?
Hi, I have the following code which I would like to simplify. It does linear regressions and returns the r-squared values and the coefficients. It runs slowly, as it is doing the regressions for each - is it possible to get the values in a data frame which looks as follows: expert | xx | seeds | r.squared | slope | intercept Thanks in advance, Rainer library(reshape) rsqs <- as.data.frame( stamp(...
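
For readers looking for an alternative to the reshape/stamp() approach: a base-R sketch of per-group regressions collected into one data frame, with mtcars and cyl standing in for the poster's expert/xx/seeds grouping (which isn't shown):

    res <- do.call(rbind, lapply(split(mtcars, mtcars$cyl), function(d) {
      fit <- lm(mpg ~ wt, data = d)
      data.frame(cyl       = d$cyl[1],
                 r.squared = summary(fit)$r.squared,
                 intercept = coef(fit)[[1]],
                 slope     = coef(fit)[[2]])
    }))
    res
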
2005 Dec 08
1
logistic regression with constrained coefficients?
...of (d1(x1,y1), ..., dn(xn,yn), x_class != y_class) rows bound together as a data frame (actually I construct it by columns), and then the obvious thing to try was glm(different.class ~ ., family = binomial(), data = distance.frame) The thing is that this gives me both positive and negative coefficients, whereas the linear combination is only guaranteed to be a metric if the coefficients are all non-negative. There are four fairly obvious ways to deal with that: (1) just force the negative coefficients to 0 and hope. This turns out to work rather well, but still... (2) keep all the coefficien...
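
One option for the non-negativity constraint, noted here for later readers rather than taken from the original thread: glmnet supports box constraints on the coefficients, so lower.limits = 0 with lambda = 0 gives an (essentially) unpenalised logistic fit with all slopes forced to be >= 0. A sketch with simulated distance columns:

    library(glmnet)
    set.seed(1)
    X <- abs(matrix(rnorm(200 * 4), nrow = 200))            # stand-ins for the distance columns
    colnames(X) <- paste0("d", 1:4)
    y <- rbinom(200, 1, plogis(-1 + X %*% c(0.8, 0.5, 0.3, 0.1)))
    fit <- glmnet(X, y, family = "binomial", lambda = 0, lower.limits = 0)
    coef(fit)                    # intercept unconstrained, slope estimates all >= 0
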
2007 Aug 28
3
Forcing coefficients in lm object
Dear all, I would like to use predict.lm() with an existing lm object but with new arbitrary coefficients. I modify 'fit$coef' (see example below) "by hand" but the actual model in 'fit' used for prediction does not seem to be altered (although fit$coef is!). Can anyone please help me do this properly? Thanks in advance, Jérémie > dat <- data.frame(y=c(0,25,32,15),...
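
Two things worth noting for later readers (a sketch; 'x' below is a made-up predictor because the poster's data frame is truncated above): assignment with $ does not partial-match, so fit$coef <- ... adds a new "coef" component and leaves the "coefficients" component that predict.lm() actually reads untouched; and the most transparent route is to build the predictions directly from the model matrix.

    dat <- data.frame(y = c(0, 25, 32, 15), x = 1:4)   # 'x' is hypothetical; the original is cut off
    fit <- lm(y ~ x, data = dat)
    fit$coef <- c(5, 3)     # `$<-` does NOT partial-match: this creates a new "coef" entry...
    names(fit)              # ...while "coefficients" (what predict.lm uses) is unchanged
    # forcing arbitrary coefficients explicitly:
    newdat <- data.frame(x = seq(0, 5, by = 0.5))
    X <- model.matrix(~ x, data = newdat)
    drop(X %*% c(5, 3))     # predictions with intercept 5 and slope 3
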