Displaying 20 results from an estimated 22 matches for "colinear".
Did you mean:
collinear
2004 Jun 30
1
linear models and colinear variables...
...ke this:
Bstaph.aureus:Dvan:Sr:U:ICU
There are a good number of hits but there's also a
staggering number of complete misses, due to a
combination of scarce data in that particular niche and
an actual lack of deviation from the categorical mean.
My suspicion is that there's a large degree of
colinearity in some of these variables that serves to
reduce the total effect of either of a nearly colinear
pair to an insignificant level; my hope is that
removing one of a mostly colinear group would allow
the other variables' possibly significant effects to
be measured.
Question 1) Is this legitima...
2006 Jul 05
2
Colinearity Function in R
Is there a colinearity function implemented in R? I
have tried help.search("colinearity") and
help.search("collinearity") and have searched for
"colinearity" and "collinearity" on
http://www.rpad.org/Rpad/Rpad-refcard.pdf but with no
success.
Many thanks in advance,
Peter Laure...
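(Editor's note: there is no single collinearity function in base R, but a quick check can be sketched with base tools alone. The variable selection and the condition-number threshold below are illustrative, not canonical; the `vif()` function in the `car` package is the usual packaged alternative.)

```r
## Sketch: two quick base-R collinearity checks, using the built-in
## mtcars data. The predictor choice and the rule-of-thumb threshold
## are illustrative only.
X <- scale(mtcars[, c("disp", "hp", "wt", "qsec")])

## 1. Pairwise correlations: entries near +/-1 flag collinear pairs.
round(cor(X), 2)

## 2. Condition number of the scaled predictor matrix: values above
## roughly 30 are often taken as a sign of near-collinearity.
kappa(X, exact = TRUE)
```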
2006 Oct 24
2
colinearity?
I'm sorry to all those who are tired of seeing my email appear in need of
help. But, I've never coded in any program before, so this has been a
difficult process for me.
Is there a simple function to test for colinearity in R? I'm running a
logistic regression and a linear regression.
Thanks for the help!
[[alternative HTML version deleted]]
2011 Jan 06
1
Splitting a Vector
Hi all,
I read in a textbook that you can examine a variable that is colinear
with others, and that gives different ANOVA output and explanatory power
when ordered differently in the model formula, by modelling that
explanatory variable against the others colinear with it. Then, using
that information, you split the vector (explanatory variable) in question
into two new vect...
2008 Mar 06
0
Help with colinearity problem in multiple linear regression
...composition, but I can't figure out how this technique would be combined
with intermediate summary results which is required for the parallelism
technique to work.
Does anyone have any pointers how this might be accomplished?
Thanks,
Caleb
-------
# Bigger example
# Introduce artificial colinearity to test case
P <- cbind(USArrests[2] * 2, USArrests)
names(P) <- c("Introduced", "Murder", "Assault", "UrbanPop", "Rape")
# Split the data into partitions to be calculated separately
# In the real case the full data would have been too large and
# e...
2004 Oct 16
3
Cox PH Warning Message
Hi,
Can anybody tell me what the message below means and how to overcome it.
Thanks,
Neil
Warning message:
X matrix deemed to be singular; variable 2 in: coxph(Surv(age_at_death,
death) ~ project$pluralgp + project$yrborn + .........
>
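(Editor's note: the warning usually means that one column of the model matrix — here variable 2 — is an exact or near-exact linear combination of the others, so its coefficient comes back as NA; dropping or recoding that variable makes it go away. A made-up reproduction, with simulated data rather than Neil's:)

```r
## Sketch: reproducing the "X matrix deemed to be singular" warning
## with the survival package. The data here are simulated.
library(survival)
set.seed(1)
d <- data.frame(time   = rexp(50),
                status = rbinom(50, 1, 0.7),
                x1     = rnorm(50))
d$x2 <- 2 * d$x1                      # x2 duplicates x1 exactly
fit <- coxph(Surv(time, status) ~ x1 + x2, data = d)
coef(fit)                             # coefficient for x2 is NA
```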
2010 Jan 07
1
logistic regression based on principal component analysis
Dear all:
I am trying to analyse a dataset which contains one binary response variable and several predictor variables, but a multicollinearity problem exists in my dataset. Some papers suggest that logistic regression on principal components suits such noisy data,
but I have only found that R can do principal component regression, using the "pls" package.
Is there any package that can do the task I need - logistic regression based on...
2012 Nov 06
2
R and SPSS
Hi group:
I have a data set which has a severe colinearity problem. While running linear regression in R and SPSS, I got different models. I am wondering if somebody knows how to make the two programs output the same results. (I guess the way R and SPSS handle singularity is different, which leads to different models.)
Thanks.
2011 May 16
1
Linear Discriminant Analysis error: "Variables appear constant"
...ouping, ...) :
variables 10 38 42 appear to be constant within groups
When I look at the variables listed, they don't appear "constant within the groups" to me. I'm new to LDA and am wondering what this error means... Are my data somehow not in the right format? Should I remove colinear variables? (All variables have been normalized.)
Thanks very much!
Katie
2005 Oct 08
2
keeping interaction terms
...Crawley, its single term has to be added to the multiple model: lrm(N ~ a*b + a + b).
This nearly always leads to high correlation between the interaction term a*b and its single terms a and b. With regard to the law of colinearity, modelling should not include correlated variables with a Spearman index > 0.7. Does this mean that the interaction term has to be discarded, or can the variables stay in the model when correlated? I do not necessarily want to do a PCA on this issue.
Thanks for helpi...
2012 Nov 10
1
colinearity among categorical variables (multinom)
Dear all users,
I'd like to ask you how to make a decision about colinearity among
categorical independent variables
when the model is multinomial logistic regression.
Any help is appreciated,
Niklas
2008 May 28
1
Fixing the coefficient of a regressor in formula
...+ G4 + G5 +G6
The Gs represent B-spline basis functions, so they sum to 1, and I can't
estimate the model as is without the last coefficient coming back as NA,
which makes sense given the perfect collinearity.
Without getting into lengthy details about my code, let me just say that
to avoid the colinearity problem, I do not want to omit G1 from the
regression. Instead, I want to fix the regression coefficient of one of
the regressors, G1, to 1.
I have read the R manual section on formulae but I have not found how to
fix a regression coefficient. Conceptually speaking it seems to me
that...
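(Editor's note: the standard way to pin a coefficient at exactly 1 in an R formula is `offset()`: the offset term enters the linear predictor unestimated. A minimal sketch with simulated data, where x1 plays the role of the poster's G1:)

```r
## Sketch: fixing a regressor's coefficient at exactly 1 via offset().
## Simulated data; x1 stands in for the poster's G1.
set.seed(1)
x1 <- rnorm(100)
x2 <- rnorm(100)
y  <- x1 + 0.5 * x2 + rnorm(100)

## offset(x1) is added to the linear predictor with coefficient 1;
## only the intercept and x2 are estimated.
fit <- lm(y ~ offset(x1) + x2)
coef(fit)        # no entry for x1
```

To fix the coefficient at some other value c, use offset(c * x1) instead.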
2009 Mar 12
2
MANOVA
...ns about MANOVA, which I am still not sure whether I should use.
For example I have a data set like this:
BloodPressure (BP) Weight Height
120 115 165
125 145 198
156 99 176
I know that BloodPressure is correlated with both Weight and Height; however, colinearity exists between Weight and Height. When I use BP = Weight + Height as the model, one of them comes out insignificant. I was trying to use a BP + Weight = Height model, but I am not sure how to use it.
Should I use MANOVA, or do I just have to fit two equations, BP = Weight & Weight = Height?
Any sugges...
2002 Sep 15
7
loess crash
Hi,
I have a data frame with 6563 observations. I can run a regression with
loess using four explanatory variables. If I add a fifth, R crashes. There
are no missing values in the data, and if I run a regression with any four of the
five explanatory variables, it works. It's only when I go from four to five
that it crashes.
This leads me to believe that it is not an obvious problem with the data,
2011 Sep 09
0
Survival Analysis for soccer scoring process
...ng team because such teams are intrinsically more likely to
score.
To control for fixed effects in the estimation, a series of dummy
variables are added.
Two dummy variables are added for each team, one for the home ground and
another for the away ground (with the exception of Manchester City, to
prevent colinearity and to act as the baseline hazard). As there are in
total 25 teams involved in the English Premiership, there are 48 dummy
variables added. Similarly, there are 22 and 25 teams in the German
Bundesliga and Spanish Primera Liga, and therefore I add 42 and 48 dummy
variables respectively.
Fitting the...
2008 Sep 04
1
Stepwise
Hi,
Is there any facility in R to perform a stepwise process on a model,
which will remove any highly-correlated explanatory variables? I am told
there is in SPSS. I have a large number of variables (some correlated),
which I would like to just chuck in to a model and perform stepwise and
see what comes out the other end, to give me an idea perhaps as to which
variables I should focus on.
Thanks
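(Editor's note: base R's `step()` does stepwise AIC selection but does not screen for correlated predictors on its own; one common recipe is to drop one member of each highly correlated pair first, then run `step()`. A sketch on the built-in mtcars data — the 0.9 cutoff is an arbitrary illustration, and `caret::findCorrelation()` packages the same idea:)

```r
## Sketch: drop one variable from each highly correlated pair, then
## run stepwise AIC selection. mtcars and the 0.9 cutoff are just
## for illustration.
cm <- abs(cor(mtcars[, -1]))          # predictors only; mpg is column 1
diag(cm) <- 0
dropped <- character(0)
while (max(cm) > 0.9) {
  worst   <- which(cm == max(cm), arr.ind = TRUE)[1, ]
  v       <- rownames(cm)[worst[1]]   # discard one member of the pair
  dropped <- c(dropped, v)
  cm <- cm[rownames(cm) != v, colnames(cm) != v, drop = FALSE]
}
keep <- setdiff(names(mtcars)[-1], dropped)
fit  <- step(lm(reformulate(keep, response = "mpg"), data = mtcars),
             trace = 0)
```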
2009 Aug 03
0
Deducer 0.1 : An intuitive cross-platform data analysis GUI
...plot
8. Generalized Linear Models
a. Model preview
b. Intuitive model builder
c. Diagnostic plots
d. Component residual and added variable plots
e. Anova (type II and III implementing LR, Wald and F tests)
f. Parameter summary tables and parameter correlations
g. Influence and colinearity diagnostics
h. Post-hoc tests and confidence intervals
with (or without) adjustments for multiple testing.
i. Custom linear hypothesis tests
j. Effect mean summaries (with confidence intervals), and
plots
k. Exports: Residuals, Standardized residuals, Studentized
residuals, P...
2009 Jan 01
0
Computing/Interpreting Odds Ratios for 3-way interactions from lmer
...tions because specific hypotheses hinge on the results.
Two other points that may be relevant: 1) The original design was balanced, but the current results are unbalanced because of data loss (e.g. children failing to respond) and this is not randomly distributed across groups. 2) There is some colinearity between the conditions (corr between subord/when = .44) and between the groups (corr between SLI/MLU =.48). This is somewhat logical given the targets, but is not easily reduced.
The syntax I'm using for the analysis is:
clauseOPCyesI <- lmer(OPCorrect == "past" ~ group*Con...
2004 Oct 12
5
covariate selection?
Hello,
I am hoping someone can help me with the following multivariate issue:
I have a model consisting of about 50 covariates. I would like to
reduce this to about 5 covariates for the reduced model by combining
cofactors that are strongly correlated. Is there a package or function
that would help me with this in R? I appreciate any suggestions.
Thanks,
Ian
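(Editor's note: one packaged route for this is principal components — combine the correlated covariates into a few orthogonal scores and regress on those. A base-R sketch, with mtcars standing in for Ian's 50 covariates; keeping 5 components mirrors his target, it is not a rule:)

```r
## Sketch: collapsing correlated covariates into 5 principal
## components with prcomp(), then regressing on the scores.
## mtcars stands in for the real 50-covariate data.
X  <- scale(mtcars[, -1])             # predictors, standardized
pc <- prcomp(X)
Z  <- pc$x[, 1:5]                     # first 5 orthogonal components
fit <- lm(mtcars$mpg ~ Z)
summary(fit)$r.squared
```

The component scores are uncorrelated by construction, so the collinearity problem disappears; the cost is that the coefficients are no longer expressed in terms of the original covariates.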