similar to: inconsistency or bug in coef() (PR#9358)

Displaying 20 results from an estimated 40000 matches similar to: "inconsistency or bug in coef() (PR#9358)"

2004 Dec 29
1
Discrepancy between intervals.lme and coef.lme
I'm using R on Windows v2.0.1 with the nlme package (v3.1-53) and am finding some unexpected discrepancies in the output of intervals.lme and coef.lme. I've included a toy dataset at the end, but briefly, the data are longitudinal data from couples in marital therapy. Each spouse's relationship satisfaction is measured 4 times; I've fit both linear and quadratic models to the
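One likely source of such a discrepancy (a hedged sketch, assuming 'fit' is the fitted lme object): intervals() reports the population-level fixed effects, whereas coef() on an lme fit returns per-group coefficients, i.e. the fixed effects plus each group's predicted random effects.

library(nlme)
intervals(fit, which = "fixed")  # confidence intervals for the fixed effects only
fixef(fit)                       # population-level (fixed-effect) estimates
head(coef(fit))                  # per-couple coefficients = fixed effects + random effects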
2004 Sep 27
1
Funny behaviour of coef() and vcov() if X is singular
coef() and vcov() have different dimensions if a model contains aliased parameters, as the following example illustrates. A search on "aliased" gave nothing as far as I could see. Is this a known bug? Bendix C ---------------------- Bendix Carstensen Senior Statistician Steno Diabetes Center Niels Steensens Vej 2 DK-2820 Gentofte Denmark tel: +45 44 43 87 38 mob: +45 30 75 87 38 fax: +45
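A minimal sketch of the mismatch, using made-up data (all names below are hypothetical): with a perfectly collinear predictor, coef() keeps the aliased entry as NA while vcov() drops it, so the two no longer have matching dimensions.

set.seed(1)
x1 <- rnorm(20)
x2 <- x1                      # exact copy of x1, so x2 is aliased
y  <- x1 + rnorm(20)
fit <- lm(y ~ x1 + x2)
length(coef(fit))             # 3: intercept, x1, and an NA for x2
dim(vcov(fit))                # 2 x 2: the aliased coefficient is dropped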
2006 Aug 31
1
NaN when using dffits, stemming from lm.influence call
Hi all, I'm getting a NaN returned when using dffits, as explained below. To me, there seems to be no obvious (or, for that matter, non-obvious) reason why a NaN should appear. Before I start digging further, can anyone see why dffits might be failing? Is there a problem with the data? Consider: # Load data dep <-
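One way such a NaN can arise (a hypothetical sketch, not the poster's data): an observation with leverage exactly 1 is fitted perfectly, so the leave-one-out quantities inside dffits() reduce to 0/0.

d <- data.frame(x = factor(c("a", "a", "a", "b")), y = c(1, 2, 3, 5))
fit <- lm(y ~ x, data = d)
hatvalues(fit)                # the single "b" observation has hat value 1
dffits(fit)                   # NaN for that observation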
2017 Sep 13
3
vcov and survival
Dear Terry, Even the behaviour of lm() and glm() isn't entirely consistent. In both cases, singularity results in NA coefficients by default, and these are reported in the model summary and coefficient vector, but not in the coefficient covariance matrix: ---------------- > mod.lm <- lm(Employed ~ GNP + Population + I(GNP + Population), + data=longley) >
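The behaviour described above can be reproduced with the built-in longley data (a sketch based on the quoted call): the redundant I(GNP + Population) term is aliased, so coef() reports an NA for it while vcov() omits it entirely.

mod.lm <- lm(Employed ~ GNP + Population + I(GNP + Population), data = longley)
coef(mod.lm)                  # four entries, the aliased term is NA
dim(vcov(mod.lm))             # 3 x 3 -- no row or column for the NA coefficient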
2017 Sep 14
6
vcov and survival
>>>>> Martin Maechler <maechler at stat.math.ethz.ch> >>>>> on Thu, 14 Sep 2017 10:13:02 +0200 writes: >>>>> Fox, John <jfox at mcmaster.ca> >>>>> on Wed, 13 Sep 2017 22:45:07 +0000 writes: >> Dear Terry, >> Even the behaviour of lm() and glm() isn't entirely consistent. In both cases,
2017 Nov 02
2
vcov and survival
>>>>> Fox, John <jfox at mcmaster.ca> >>>>> on Thu, 14 Sep 2017 13:46:44 +0000 writes: > Dear Martin, I made three points which likely got lost > because of the way I presented them: > (1) Singularity is an unusual situation and should be made > more prominent. It typically reflects a problem with the > data or the
2003 Jan 20
1
quadratic trends and changes in slopes
I'd like to use linear and quadratic trend analysis in order to detect a change in slope. Basically, I need to solve a problem similar to the one discussed in http://www.gseis.ucla.edu/courses/ed230bc1/cnotes4/trend1.html My subjects have counted dots: one dot, two dots, etc., up to 9 dots. Reaction time increases with the number of dots. The theory is that 1 up to 3 or 4 points can be counted
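A hypothetical sketch of one way to test for a change in slope (the variable and data-frame names below are assumptions, not the poster's): fit linear and quadratic trends in the number of dots and compare them.

fit.lin  <- lm(rt ~ dots, data = d)              # linear trend only
fit.quad <- lm(rt ~ dots + I(dots^2), data = d)  # adds a quadratic term
anova(fit.lin, fit.quad)                         # does the quadratic term improve the fit?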
2011 Nov 09
4
Interpreting Multiple Linear Regression Summary
I would appreciate pointers on what I should read to understand this output: summary(lm(TDS ~ Cond + Ca + Cl + Mg + Na + SO4)) Call: lm(formula = TDS ~ Cond + Ca + Cl + Mg + Na + SO4) Residuals: ALL 1 residuals are 0: no residual degrees of freedom! Coefficients: (6 not defined because of singularities) Estimate Std. Error t value Pr(>|t|) (Intercept) 125 NA
2009 Aug 02
3
two-factor linear models with missing cells
I am wondering how to interpret the parameter estimates that lm() reports in this sort of situation: y = round(rnorm(n=24,mean=5,sd=2),2) A = gl(3,2,24,labels=c("one","two","three")) B = gl(4,6,24,labels=c("i","ii","iii","iv")) # Make both observations for A=1, B=4 missing y[19] = NA y[20] = NA data.frame(y,A,B) nonadd = lm(y ~
2008 Feb 03
1
Effect size of comparison of two levels of a factor in multiple linear regression
Dear R users, I have a linear model of the kind outcome ~ treatment + covariate where 'treatment' is a factor with three levels ("0", "1", and "2"), and the covariate is continuous. Treatments "1" and "2" both have regression coefficients significantly different from 0 when using treatment contrasts with treatment "0" as the
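A hedged sketch of one way to compare treatments "1" and "2" directly (the data-frame name 'dat' is an assumption): refit with "1" as the reference level so that the "treatment2" coefficient estimates the 2-versus-1 difference.

dat$treatment <- relevel(dat$treatment, ref = "1")
fit2 <- lm(outcome ~ treatment + covariate, data = dat)
summary(fit2)                 # the "treatment2" row is now the 2 vs 1 contrast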
2005 Dec 06
1
standardized residuals (rstandard & plot.lm) (PR#8367)
Full_Name: Heather Turner Version: 2.2.0 OS: Windows XP Submission from: (NULL) (137.205.240.44) Standardized residuals as calculated by rstandard.lm, rstandard.glm and plot.lm are Inf/NaN rather than zero when the un-standardized residuals are zero. This causes plot.lm to break when calculating 'ylim' for any of the plots of standardized residuals. Example:
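A minimal sketch of how a zero residual can produce a NaN standardized residual (hypothetical data): a point with leverage 1 is fitted exactly, so rstandard() divides zero by zero.

d <- data.frame(g = factor(c("a", "a", "a", "b")), y = c(1, 2, 3, 5))
fit <- lm(y ~ g, data = d)
residuals(fit)                # the "b" point has residual 0
rstandard(fit)                # NaN for that point instead of 0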
2009 Jan 20
1
inconsistent lm results with fixed response variable
Hi, I'm analyzing a large number of simulations using lm(); a sample of the resulting data is pasted below. In some simulations, the response variable doesn't vary, i.e.: > tmp[[2]]$richness [1] 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 When I analyze this using R version 2.8.0 (2008-10-20) on a Linux cluster, I get an appropriate result: ## begin R
2012 Oct 03
1
will 9 data points work for a regression in R?
See error message below: can someone please help with this? Thanks! Residuals: ALL 9 residuals are 0: no residual degrees of freedom! Residual standard error: NaN on 0 degrees of freedom Multiple R-squared: 1, Adjusted R-squared: NaN F-statistic: NaN on 8 and 0 DF, p-value: NA
2013 Feb 14
3
Problems plotting and regression w.r.t. date data type on x axis
Hello, probably a newbie question, but I didn't find any information on plotting/regressing with respect to a date data type. My attempts were unfruitful. Can anyone help? Thanks in advance! Here is my interaction with R: > tabelle date number date2 1 2009-01-1 1673 2009-01-01 2 2009-12-1 2111 2009-12-01 3 2010-7-1 2487 2010-07-01 4 2013-2-1 4301 2013-02-01 > regression.punkte<-lm(tabelle$number
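A hedged sketch using the values shown in the posted table (column names taken from that output): once the dates are stored as class Date, lm() and plot() handle them directly, since Dates are numeric days since 1970-01-01.

tabelle <- data.frame(
  date2  = as.Date(c("2009-01-01", "2009-12-01", "2010-07-01", "2013-02-01")),
  number = c(1673, 2111, 2487, 4301)
)
fit <- lm(number ~ date2, data = tabelle)
plot(number ~ date2, data = tabelle)
abline(fit)                   # slope is the change in 'number' per day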
2011 Mar 14
3
Standardized Pearson residuals
Is there any reason that rstandard.glm doesn't have a "pearson" option? And if not, can it be added? Background: I'm currently teaching an undergrad/grad-service course from Agresti's "Introduction to Categorical Data Analysis (2nd edn)" and deviance residuals are not used in the text. For now I'll just provide the students with a simple function to use, but I
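A hedged sketch of computing standardized Pearson residuals by hand (assuming 'fit' is the glm in question and the dispersion is 1, as for binomial or Poisson models); newer versions of R also accept rstandard(fit, type = "pearson").

std.pearson <- residuals(fit, type = "pearson") / sqrt(1 - hatvalues(fit))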
2006 Apr 28
4
stepwise regression
Dear all, I have encountered a problem when performing stepwise regression. The dataset has 9 independent variables but only 7 observations. In R, before performing stepwise selection, an lm object must be supplied. fm <- lm(y ~ X1 + X2 + X3 + X11 + X22 + X33 + X12 + X13 + X23) However, summary(fm) will give: Residual standard error: NaN on 0 degrees of freedom Multiple R-Squared: 1, Adjusted
2012 Mar 31
2
lm does not compute a coefficient
Hello, I want to fit a linear regression lm(y ~ x * grupo, data = datos) where y is numeric, x is numeric, and grupo is a factor with two levels (1 and 2), but it does not compute the coefficient for x:grupo2 because of a singularity. Coefficients: (1 not defined because of singularities) Estimate Std. Error t value Pr(>|t|) (Intercept) -1.283e+06 2.276e+04 -56.359 < 2e-16 *** x
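A hedged sketch (the model and variable names are taken from the post; 'ajuste' is a hypothetical name for the fit): alias() on the fitted model shows which term is a linear combination of the others and therefore gets no coefficient.

ajuste <- lm(y ~ x * grupo, data = datos)
alias(ajuste)                 # lists the aliased (non-estimable) term(s)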
2006 Aug 11
1
help:coerce lmer.coef to matrix
Hi, thanks for your response, it nearly worked! But it only wrote one column of data and not the three columns I need. Using fixef(m1) doesn't give the same results as coef(m1) when you are using more than one random effect. I need the coefficients for each individual, so I use coef(m1), which results in an object of class lmer.coef, 3 columns by 700 rows. as.data.frame() won't work on
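A hedged sketch for current lme4 (the model name m1 is taken from the post): coef() on a mixed-model fit returns a list with one data frame per grouping factor, so pick the relevant element before coercing it.

library(lme4)
cf <- coef(m1)                # a list of data frames, one per grouping factor
str(cf)
as.matrix(cf[[1]])            # per-individual coefficient matrix for the first grouping factor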
2003 Apr 24
2
R-1.7.0 build feedback: NetBSD 1.6 (PR#2837)
R-1.7.0 built on NetBSD 1.6, but the validation test suite failed: Machinetype: Intel Pentium III (600 MHz); NetBSD 1.6 (GENERIC) Remote gcc version: gcc (GCC) 3.2.2 Remote g++ version: g++ (GCC) 3.2.2 Configure environment: CC=gcc CXX=g++ LDFLAGS=-Wl,-rpath,/usr/local/lib make[5]: Entering directory `/local/build/R-1.7.0/src/library' >>> Building/Updating
2004 Jun 05
1
coef and vcov for polr inconsistent??
Is the following an inconsistency, a programming glitch, or a feature? One would expect vcov(obj) to be the variance-covariance matrix of coef(obj), but apparently this is not the case for polr objects: > x <- rnorm( 100 ) > y <- rnorm( 100 ) > ff <- factor( sample( 1:4, 100, replace=T ) ) > pm <- polr( ff ~ x + y ) > coef( pm ) x y 0.21219010 0.03558506
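A hedged sketch of what appears to be going on (continuing the quoted example): vcov() for a polr fit covers both the regression coefficients and the intercepts (zeta), while coef() returns only the former, so the dimensions differ.

library(MASS)
length(coef(pm))              # 2: the x and y coefficients
dim(vcov(pm))                 # 5 x 5: 2 coefficients plus 3 intercepts
c(coef(pm), pm$zeta)          # the full parameter vector that vcov() refers to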