similar to: summary of aov fit on a contrast basis

Displaying 20 results from an estimated 120 matches similar to: "summary of aov fit on a contrast basis"

2011 May 02
2
Lasso with Categorical Variables
Hi! This is my first time posting. I've read the general rules and guidelines, but please bear with me if I make some fatal error in posting. Anyway, I have a continuous response and 29 predictors made up of continuous variables and nominal and ordinal categorical variables. I'd like to do lasso on these, but I get an error. The way I am using "lars" doesn't allow for the
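A minimal sketch of the usual workaround: lars() and glmnet() want a numeric matrix, so the factors are expanded with model.matrix() first. The data frame, column names and the choice of glmnet below are illustrative, not from the original post.

# Illustrative sketch: expand factors to dummy columns, then run the lasso
library(glmnet)
set.seed(1)
dat <- data.frame(y  = rnorm(100),                                    # toy response
                  x1 = rnorm(100),                                    # continuous predictor
                  x2 = factor(sample(c("a", "b", "c"), 100, TRUE)),   # nominal factor
                  x3 = ordered(sample(1:4, 100, TRUE)))               # ordinal factor
x <- model.matrix(y ~ ., data = dat)[, -1]   # dummy-code the factors, drop the intercept column
cvfit <- cv.glmnet(x, dat$y, alpha = 1)      # alpha = 1 is the lasso penalty; lars() accepts the same x
coef(cvfit, s = "lambda.min")                # coefficients at the cross-validated penalty

Note that this penalises individual dummy columns; if whole factors should enter or leave the model together, a group lasso (e.g. the grpreg or gglasso packages) is the usual alternative.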
2012 Mar 16
1
multivariate regression and lm()
Hello, I would like to perform a multivariate regression analysis to model the relationship between m responses Y1, ... Ym and a single set of predictor variables X1, ..., Xr. Each response is assumed to follow its own regression model, and the error terms in each model can be correlated. Based on my readings of the R help archives and R documentation, the function lm() should be able to
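lm() does accept a matrix response built with cbind(), returning an object of class "mlm"; a minimal sketch with made-up data (all names illustrative):

# Sketch: one design matrix, m response columns, fitted in a single lm() call
set.seed(1)
dat <- data.frame(X1 = rnorm(50), X2 = rnorm(50))
dat$Y1 <- 1 + 2 * dat$X1 + rnorm(50)
dat$Y2 <- 3 - 1 * dat$X2 + rnorm(50)
fit <- lm(cbind(Y1, Y2) ~ X1 + X2, data = dat)
coef(fit)   # one column of estimates per response
vcov(fit)   # covariances across the responses' coefficients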
2011 Jul 13
1
AR-GARCH with additional variable - estimation problem
Dear list members, I am trying to estimate the parameters of an AR(1)-GARCH(1,1) model. I have one additional dummy variable for the AR(1) part. First I wanted to do it using the garchFit function (everything would then be estimated in one step); however, in the fGarch library I didn't find a way to include an additional variable. That would be the formula, but, as said, I think it is impossible to add
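garchFit() in fGarch indeed has no slot for extra regressors in the mean equation; a commonly suggested alternative is the rugarch package, which does. A hedged sketch with simulated stand-in data (the series y and dummy d are placeholders; the argument names are rugarch's):

# Sketch assuming the rugarch package; y and d stand in for the real series and dummy
library(rugarch)
set.seed(1)
y <- as.numeric(arima.sim(list(ar = 0.4), n = 500))   # toy series, no real GARCH effects
d <- as.numeric(seq_along(y) > 250)                   # toy dummy variable
spec <- ugarchspec(variance.model = list(model = "sGARCH", garchOrder = c(1, 1)),
                   mean.model     = list(armaOrder = c(1, 0),
                                         external.regressors = cbind(d)))  # dummy in the mean equation
fit <- ugarchfit(spec, data = y)   # AR(1)-GARCH(1,1) plus the dummy, estimated in one step
coef(fit)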
2008 Feb 07
1
Don't understand removing constant on 1-way ANOVA
I am playing with a 1-way ANOVA with and without the "-1" option. I have a simple cooked-up example below, but it behaves the same on a more complex real example. From what I can tell: 1) the estimated means of the different levels are correctly estimated either way (although reported as means with the -1 and as contrasts without the -1, as expected) 2) the residuals are
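A small made-up example of what the two parameterisations report: the fitted values and residuals are identical, but the coefficients, R-squared and overall F test are defined differently once the intercept is removed.

# Made-up example: same fit, different parameterisation of the group means
set.seed(1)
grp <- factor(rep(c("a", "b", "c"), each = 4))
y   <- c(rnorm(4, 10), rnorm(4, 12), rnorm(4, 15))
m1 <- lm(y ~ grp)        # intercept = mean of "a"; other coefficients are contrasts with "a"
m2 <- lm(y ~ grp - 1)    # coefficients are the three group means directly
fitted(m1) - fitted(m2)  # identical fits and residuals, so differences are ~0
summary(m1)$r.squared    # R^2 and its F test differ, because the -1 fit measures
summary(m2)$r.squared    # variation about zero rather than about the grand mean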
2005 May 04
3
Multivariate multiple regression
I'd like to model the relationship between m responses Y1, ..., Ym and a single set of predictor variables X1, ..., Xr. Each response is assumed to follow its own regression model, and the error terms in each model can be correlated. My understanding is that although lm() handles vector Y's on the left-hand side of the model formula, it really just fits m separate lm models. What should
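A small sketch of exactly that point: the coefficient estimates from the joint "mlm" fit are identical to m separate lm() fits, but the multivariate tests produced by anova() on the joint fit do use the correlation between the error terms (toy data below, names illustrative):

# Sketch: point estimates match separate fits; anova() on the mlm fit is where the correlation matters
set.seed(1)
dat <- data.frame(X1 = rnorm(50), X2 = rnorm(50))
dat$Y1 <- dat$X1 + rnorm(50)
dat$Y2 <- dat$X2 + rnorm(50)
fit_joint <- lm(cbind(Y1, Y2) ~ X1 + X2, data = dat)
fit_y1    <- lm(Y1 ~ X1 + X2, data = dat)
all.equal(coef(fit_joint)[, "Y1"], coef(fit_y1))   # TRUE: same estimates as the separate fit
anova(fit_joint)   # multivariate tests (Pillai by default) across both responses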
2011 Apr 13
4
is this an ANOVA ?
Hi all, I have a very easy question (I hope). I measured a property of plants growing in three different substrates (A, B and C). The rest of the conditions remained constant. There was very high variation in the results. I want to address whether there is any difference in the response (my measurement) from substrate to substrate.
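This is the textbook one-way ANOVA setting; a minimal sketch with made-up data (the data frame and column names are assumptions):

# Sketch: 'plants' with a measurement column 'value' and a factor 'substrate' (A/B/C)
set.seed(1)
plants <- data.frame(substrate = factor(rep(c("A", "B", "C"), each = 8)),
                     value     = c(rnorm(8, 5), rnorm(8, 6), rnorm(8, 5.5)))
fit <- aov(value ~ substrate, data = plants)
summary(fit)     # overall F test: is there any substrate effect?
TukeyHSD(fit)    # which pairs of substrates differ
kruskal.test(value ~ substrate, data = plants)   # rank-based alternative if the high variation is worrying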
2009 Nov 05
1
help with ols and contrast functions in Design library
Dear All, I'm trying to use the ols function in the Design library (version 2.1.1) of R to estimate parameters of a linear model, and then use the contrast function in the same library to test various contrasts. As a simple example, suppose I have three factors: feature (3 levels), group (2 levels), and patient (3 levels). Patient is coded as a non-unique identifier and is
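The Design package has since been superseded by rms, where the same ols()/contrast() pairing still applies; a hedged sketch with made-up data matching the factors described (all names and levels illustrative):

# Sketch using the rms package (successor of Design); data and level names are made up
library(rms)
set.seed(1)
d <- expand.grid(feature = factor(c("f1", "f2", "f3")),
                 group   = factor(c("g1", "g2")),
                 patient = factor(c("p1", "p2", "p3")))
d$y <- rnorm(nrow(d))
dd <- datadist(d); options(datadist = "dd")        # contrast() needs a datadist object
fit <- ols(y ~ feature * group + patient, data = d)
contrast(fit, list(feature = "f1", group = "g1"),  # f1 vs f2 within group g1
              list(feature = "f2", group = "g1"))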
2010 Aug 05
2
linear model with similar response predictor
Hi, can somebody tell me why R is not able to calculate a linear model written in this way? > lm (seq(1:100)~seq(1:100)) Call: lm(formula = seq(1:100) ~ seq(1:100)) Coefficients: (Intercept) 50.5 Warning messages: 1: In model.matrix.default(mt, mf, contrasts) : the response appeared on the right-hand side and was dropped 2: In model.matrix.default(mt, mf, contrasts) : problem
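The warning is the key: because the identical expression appears on both sides, model.matrix() drops it from the right-hand side and only an intercept is fitted. Giving the two copies distinct names avoids this; a minimal sketch:

# Same numbers, but with distinct names so the response is not dropped from the RHS
d <- data.frame(y = 1:100, x = 1:100)
lm(y ~ x, data = d)    # slope 1, intercept numerically 0, as expected
# (also note seq(1:100) is just 1:100; 1:100 or seq_len(100) is the idiomatic form)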
2010 Jan 08
2
how to get perfect fit of lm if response is constant
Hello. Consider a data.frame df whose response variable is constant, so an analytically perfect fit of a linear model is expected. Fitting a regression line using lm results in residuals, slope and std. errors that are not exactly zero, which is acceptable in some way, but erroneous. But if you use summary.lm it shows unacceptable error propagation in the calculation of the t value and the corresponding
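A small reproduction of the effect, hedged because the exact values are platform-dependent: with a constant response the residuals are only zero up to floating-point error, and summary.lm() then divides one near-zero quantity by another, so the t and p values are numerical noise.

# Reproduction sketch: constant response, so slope and residuals should be exactly 0
df <- data.frame(x = 1:10, y = rep(sqrt(2), 10))   # any constant; sqrt(2) is not exactly representable
fit <- lm(y ~ x, data = df)
coef(fit)                  # slope is typically around 1e-17 rather than exactly 0
summary(fit)$coefficients  # t = (near-zero estimate)/(near-zero SE): numerically meaningless
zapsmall(coef(fit))        # one pragmatic fix: treat coefficients below a tolerance as zero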
2005 Feb 09
2
[Fwd: Re: Fw: Contour plot]
Petr, It works perfectly! But I still have a question; I have fit the following data; x,y,z 1,10,11 2,11,15 3,12,21 4,13,29 5,14,39 6,15,51 7,16,65 8,17,81 9,18,99 10,19,119 >dat.lm <- lm(z~I(x^2)+y, data=dat) >dat.lm Call: lm(formula = z ~ I(x^2) + y, data = dat) Coefficients: (Intercept) I(x^2) y 1.841e-14 1.000e+00 1.000e+00 How do I create the
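The usual recipe is to evaluate the fitted surface on a regular grid with predict() and hand the grid to contour(); a sketch that rebuilds dat from the values quoted above:

# Sketch: evaluate the fitted surface z ~ I(x^2) + y on a grid, then contour it
dat <- data.frame(x = 1:10, y = 10:19,
                  z = c(11, 15, 21, 29, 39, 51, 65, 81, 99, 119))
dat.lm <- lm(z ~ I(x^2) + y, data = dat)
xg <- seq(min(dat$x), max(dat$x), length.out = 50)
yg <- seq(min(dat$y), max(dat$y), length.out = 50)
grid <- expand.grid(x = xg, y = yg)
zg <- matrix(predict(dat.lm, newdata = grid), nrow = length(xg))  # rows follow x, columns follow y
contour(xg, yg, zg, xlab = "x", ylab = "y")
# filled.contour(xg, yg, zg) gives a filled version of the same plot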
2012 Aug 20
1
Combining imputed datasets for analysis using Factor Analysis
Dear R users and developers, I have a dataset containing 34 variables measured in a survey, which has some missing items. I would like to conduct a factor analysis of this data. I tested mi, Amelia, and MissForest as alternative packages in order to impute the missing data. I now have 5 separate datasets with the variables I am interested in factor analysing. In my reading of the package
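Factor analysis has no ready-made pooling rule like Rubin's rules for regression coefficients; two pragmatic options are to run the analysis on each completed dataset and compare loadings, or to pool the correlation matrices first. A hedged sketch, assuming the imputations come from mice (its built-in nhanes data stands in for the 34 survey items):

# Sketch assuming imputations from mice; nhanes is only a small stand-in for the real survey
library(mice)
imp <- mice(nhanes, m = 5, seed = 1, printFlag = FALSE)
completed <- lapply(1:5, function(i) complete(imp, i))             # the 5 imputed datasets
fits <- lapply(completed, function(d) factanal(d, factors = 1))    # option 1: fit each, compare loadings
pooled_cor <- Reduce("+", lapply(completed, cor)) / 5              # option 2: pool the correlation matrices
factanal(covmat = pooled_cor, factors = 1, n.obs = nrow(nhanes))   # analyse the pooled matrix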
2009 Sep 06
2
How to figure the type of a variable?
Hi, I want to know what the returned value of 'lm' contains. 'class' and 'lm' itself do not show that the returned value has the component coefficients, etc. I am wondering what command shows this detailed information. If possible, I also want the lower-level information. For example, I want to show that 'coefficients' is a named list and that it has 2 elements.
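str() and names() show exactly what the fitted object contains; a minimal sketch using a built-in dataset:

# Sketch: inspect the structure of an lm fit
fit <- lm(dist ~ speed, data = cars)
class(fit)               # "lm"
names(fit)               # the components: coefficients, residuals, fitted.values, ...
str(fit$coefficients)    # a named numeric vector of length 2 (not a list)
str(fit, max.level = 1)  # one-level overview of every component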
2007 Jul 24
4
values from a linear model
Dear R users, how can I extract values that are listed in the summary of an lm model but are not directly available among the object's values, such as the standard errors of the estimated parameters? For example, I have a model: mod <- lm(Crd ~ 1 + Week, data=data) and its summary: > summary(mod) Call: lm(formula = Crd ~ 1 + Week, data = data, model = TRUE, y = TRUE) Residuals: Min
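The standard errors (with the t and p values) live in the coefficient table of the summary object rather than in the lm object itself; a minimal sketch using a built-in dataset:

# Sketch: pull the coefficient table out of the summary object
fit <- lm(dist ~ speed, data = cars)
s <- summary(fit)
coef(s)                  # matrix with Estimate, Std. Error, t value, Pr(>|t|)
coef(s)[, "Std. Error"]  # just the standard errors
s$sigma                  # residual standard error
names(s)                 # everything else the summary object holds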
2010 Aug 23
1
Fitting Weibull Model with Levenberg-Marquardt regression method
Hi, I have a problem fitting the following Weibull model to a set of data. The model is this one: a-b*exp(-c*x^d) If I fit the model with CurveExpert I can find a very nice set of coefficients which create a curve very close to my data, but when I use the nls.lm function in R I can't obtain the same result. My data are these: X Y 15 13 50 13 75 9 90 4 With the commercial
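nlsLM() in minpack.lm is the formula interface to the same Levenberg-Marquardt optimizer as nls.lm(). With four observations and four parameters the problem is exactly determined and very sensitive to starting values, which is the usual reason R and a point-and-click fitter disagree. A hedged sketch: the start values below are rough guesses, not CurveExpert's coefficients, and the fit may still fail to converge.

# Hedged sketch: four points, four parameters, so everything depends on the starting guess
library(minpack.lm)
dat <- data.frame(X = c(15, 50, 75, 90), Y = c(13, 13, 9, 4))
fit <- nlsLM(Y ~ a - b * exp(-c * X^d), data = dat,
             start = list(a = 13.1, b = 0.001, c = -1.6, d = 0.38),  # guesses that roughly track the data
             control = nls.lm.control(maxiter = 200))
coef(fit)
# plugging CurveExpert's own coefficients in as 'start' is usually the quickest way to reproduce its fit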
2017 Sep 05
4
Interesting behavior of lm() with small, problematic data sets
I've recently come across the following results reported from the lm() function when applied to a particular type of admittedly difficult data. When working with small data sets (for instance 3 points) that have the same response value for different values of the predictor variable, the resulting slope estimate is a reasonable approximation of the expected 0.0, but the p-value of that slope estimate is a surprising
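A short reproduction of the effect: the slope is numerically near zero, but so is the residual variance, so the t statistic becomes a ratio of two rounding errors and the p-value carries no real information (toy numbers below; exact values are platform-dependent).

# Reproduction sketch: identical responses at three predictor values
x <- c(1, 2, 3)
y <- c(2.1, 2.1, 2.1)
fit <- lm(y ~ x)
coef(fit)                  # slope ~1e-17 (or exactly 0): a fine approximation of the true 0
summary(fit)$coefficients  # SE is also ~1e-17, so the t value and p-value are rounding-error ratios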
2018 Mar 05
2
data analysis for partial two-by-two factorial design
But of course the whole point of additivity is to decompose the combined effect as the sum of individual effects. "Mislead" is a subjective judgment, so no comment. The explanation I provided is standard. I used it for decades when I taught in industry. Cheers, Bert Bert Gunter "The trouble with having an open mind is that people keep coming along and sticking things into
2009 Sep 06
1
How to refer the element in a named list?
Hi, I thought that 'coefficients' is a named list, but I cannot refer to its element by something like r$coefficients$y. I used str() to check r. It says the following. Can somebody let me know what it means? ..- attr(*, "names")= chr [1:2] "(Intercept)" "y" $ Rscript lm.R > x=1:10 > y=1:10 > r=lm(x~y) > class(r) [1] "lm" >
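The catch is that coefficients is a named numeric vector, not a list, so $ indexing inside it does not apply; a minimal sketch using the same toy fit:

# Sketch: coefficients is a named numeric vector, so use [ ] (or coef()) rather than $
x <- 1:10
y <- 1:10
r <- lm(x ~ y)
r$coefficients["y"]      # element of the named vector, selected by name
coef(r)["y"]             # the preferred accessor
is.list(r$coefficients)  # FALSE: it is an atomic vector with a names attribute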
2002 Mar 14
10
Samba and Windows XP
Hello, As we will migrate to Windows XP clients, is there a site, document, etc. where I can find information about tests and results of Windows XP behaviour with Samba? Thanks Orazio
2001 Sep 30
2
non linear models
Dear Members of the Help List, Honestly, I feel a little bit stupid - I would like to do something rather simple: fit a non-linear model to existing data; to be more precise, I wanted to start with simple higher-order polynomials. Unfortunately, I do not quite understand the examples in the help files for the nlm, nls and nlsModel commands. Could anyone please provide a simple example to get me
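For polynomials no nls machinery is needed, because a polynomial is still linear in its coefficients and lm() handles it; nls() is for models that are genuinely nonlinear in the parameters. A minimal sketch of both with made-up data:

# Sketch with made-up data: polynomial fits are linear models; nls is for truly nonlinear forms
set.seed(1)
x <- seq(0, 10, by = 0.5)
y <- 2 + 0.5 * x - 0.3 * x^2 + 0.02 * x^3 + rnorm(length(x), sd = 0.5)
poly_fit <- lm(y ~ poly(x, 3))    # cubic polynomial fitted with lm()
summary(poly_fit)
# a genuinely nonlinear model, fitted by nls() with explicit starting values
y2 <- 5 * exp(-0.3 * x) + rnorm(length(x), sd = 0.1)
nls_fit <- nls(y2 ~ a * exp(-b * x), start = list(a = 4, b = 0.2))
coef(nls_fit)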
2004 Feb 06
1
problem to get coefficient from lm()
Dear all, The following is an example that I ran, hoping to get a linear model. However, I find that lm() cannot give the correct coefficients for the linear model. I hope it's just my own mistake. Please help. TIA. Regards, Jinsong > x [1] 3.760216 3.997288 3.208872 3.985417 3.265704 3.497505 2.923540 3.193937 [9] 3.102787 3.419574 3.169374 2.928510 3.153821 3.100385 3.768770 3.610583