similar to: how to extract test for collinearity and constancy used in lda

Displaying 20 results from an estimated 3000 matches similar to: "how to extract test for collinearity and constancy used in lda"

2012 Apr 03
1
how to use condition indexes to test multi-collinearity
Dear Users, I am trying to calculate condition indexes and variance decomposition proportions in order to test for collinearity, using colldiag() in the perturb package. I got a large index and two variables with large variance decomposition proportions, but one of them is the constant (intercept) term. I also checked the VIF for that variable, and the value is small. The result is as follows: Index intercept V1
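A minimal sketch of the check being described, assuming the perturb package is installed; the lm() model and the mtcars predictors are stand-ins for the poster's data:

    ## Condition indexes and variance decomposition proportions via perturb::colldiag()
    library(perturb)

    fit <- lm(mpg ~ disp + hp + wt, data = mtcars)   # placeholder model
    cd  <- colldiag(fit)                             # condition indexes + VDPs
    cd

Belsley's rule of thumb: a condition index above roughly 30, combined with two or more variables carrying variance decomposition proportions above roughly 0.5 on that index, points to a damaging collinear relation.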
2003 Feb 24
1
Mass: lda and collinear variables
hello list, when I use method lda of the MASS package I get a warning: variables are collinear in: lda.default(data[train, ], classes[train]) Is there an easy way to recover from this issue within the MASS package? Or how can I tell how severe this issue is at all? I understand that I shouldn't use lda at all with collinear data and should use "quadratische" (quadratic, i.e. qda?)
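For what it's worth, the warning is easy to reproduce with an artificially duplicated column; this is only a sketch with made-up data, not the poster's setup:

    library(MASS)

    set.seed(1)
    x  <- matrix(rnorm(100 * 3), ncol = 3)
    x  <- cbind(x, x[, 1])                 # fourth column duplicates the first
    cl <- factor(rep(c("a", "b"), each = 50))

    fit <- lda(x, grouping = cl)           # warning: variables are collinear
    head(predict(fit, x)$class)            # a classification rule is still returned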
2002 Jul 15
2
meaning of error message about collinearity
You are using a method that needs to estimate the covariance matrix of all the variables. If you have 80 variables, there are (80+1)*80/2 = 3240 variances and covariances to estimate. How many data points do you think you need to do that? Some people assume the covariance matrix is diagonal (i.e., assuming all the variables are uncorrelated). Even then you still have 80 variances to estimate.
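As a quick check of the arithmetic, a symmetric p x p covariance matrix has p*(p+1)/2 distinct entries:

    p <- 80
    p * (p + 1) / 2    # 3240 variances and covariances to estimate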
2005 Mar 31
0
perturb package for evaluating collinearity
I've uploaded the R package "perturb" to CRAN. Perturb contains two programs for evaluating collinearity. "Colldiag" calculates condition indexes and variance decomposition proportions to detect and track down collinear sets of variables. "Perturb" takes a different approach. It re-estimates the model a specified number of times, adding random noise
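A hedged sketch of the second approach, assuming perturb()'s pvars/prange interface as described in the package documentation; the model and data are stand-ins:

    library(perturb)

    fit <- lm(mpg ~ disp + wt, data = mtcars)          # placeholder model

    ## Refit repeatedly, each time adding small random noise to the listed
    ## predictors, then inspect how much the coefficients move.
    p <- perturb(fit, pvars = c("disp", "wt"), prange = c(1, 1))
    summary(p)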
2013 Feb 10
4
different behavior of $ with string literal vs string variable as argument
Hi everyone, I ran into the issue below while trying to execute a command of the form apply(list.names,1, function(x) F(favorite.list$x) ) where list.names is a character vector containing the names of the elements of favorite.list and F is some function defined on a list element. Namely, the $ operator doesn't treat the string variable 'x' as the string it represents, so that,
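The underlying behaviour is that $ uses its right-hand side as a literal (partially matched) name and does not evaluate it, whereas [[ ]] evaluates its argument; a small illustration:

    favorite.list <- list(alpha = 1:3, beta = letters[1:3])
    x <- "alpha"

    favorite.list$x      # NULL: looks for an element literally named "x"
    favorite.list[[x]]   # 1 2 3: evaluates x, then extracts by that name

    ## so within the function one would write favorite.list[[x]], e.g.
    sapply(names(favorite.list), function(x) length(favorite.list[[x]]))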
2012 Dec 04
3
odd behavior of browser()
Hi everyone, I normally include a call to browser() as I'm working out the kinks in my scripts, and I am always able to step through each line by hitting "Return", but for some reason, in the scripts I'm working on now, hitting "Return" seems to cause execution of *all* the lines in my script. I've restarted R several times in case it was stuck in a bad state for
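For readers unfamiliar with the workflow being described, the usual pattern looks like this; at the Browse[1]> prompt, n (or, by default, a bare Return) runs one expression, c continues, Q quits:

    f <- function(x) {
      browser()        # execution pauses here with a Browse[1]> prompt
      y <- x + 1
      z <- y * 2
      z
    }
    f(1)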
2009 Jul 21
2
Collinearity in Linear Multiple Regression
Dear all, How can I test for collinearity in the predictor data set for multiple linear regression? Thanks, Alex
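Two common starting points, assuming the car package is available for vif(); the model here is only a placeholder:

    library(car)    # for vif()

    fit <- lm(mpg ~ disp + hp + wt, data = mtcars)   # placeholder regression
    vif(fit)                               # values well above 5-10 suggest collinearity
    kappa(scale(model.matrix(fit)[, -1]))  # condition number of the scaled predictors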
2012 Jul 26
0
lda, collinear variables and CV
Dear R-help list, apparently lda from the MASS package can be used in situations with collinear variables. It only produces a warning then but at least it defines a classification rule and produces results. However, I can't find on the help page how exactly it does this. I have a suspicion (it may look at the hyperplane containing the class means, using some kind of default/trivial
2003 Jun 30
1
Novice Questions
I'm writing a program to perform linear regressions to estimate the number of bank teller transactions per hour of various types based upon day of week, time of day, week of month and several prices. I've got about 25,000 records in my dataset, 85 columns of transaction counts (used 1 at a time), about 50 columns of binary indicators (day, week, pay period, hour, branch), and a half dozen
2013 Feb 06
3
how to "multiply" list of matrices by list of vectors
Hi everyone, I'd like to be able to apply lda to each 2D matrix slice of a 3D array, and then use the scalings to obtain the corresponding lda scores. I can use 'apply' to get a list of the lda output for each 2D slice, and can create a list of the resulting scalings, but I'm not sure how to multiply them in a vectorized way. Here's how I made a list of 2D matrices
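One way to do the "multiply each matrix by its own scaling" step, assuming the scalings have been collected into a list parallel to the list of data matrices (all names and sizes here are made up):

    ## mats: list of n x p data matrices; scalings: list of p x d scaling matrices
    mats     <- list(matrix(rnorm(20), 5, 4), matrix(rnorm(20), 5, 4))
    scalings <- list(matrix(rnorm(8),  4, 2), matrix(rnorm(8),  4, 2))

    scores <- Map(`%*%`, mats, scalings)   # element-wise matrix multiplication
    str(scores)                            # a list of 5 x 2 score matrices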
2008 Nov 20
1
Checking collinearity using lmer
I am running a logistic regression model with a random effect using lmer. I am uncertain how to check for collinearity between my parameters. I have already run cor() and linear regression for each combination of parameters, and all R-squared values were < 0.8... but I am analyzing ecological data, so a 0.8 cutoff may be unrealistic. Is there a way to check variance inflation factors or tolerance
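One package-free way to get VIFs is to regress each fixed-effect predictor on the others and use VIF_j = 1/(1 - R^2_j); a sketch on a made-up data frame of predictors:

    ## preds: data frame holding only the numeric fixed-effect predictors
    preds    <- data.frame(x1 = rnorm(100), x2 = rnorm(100))
    preds$x3 <- preds$x1 + rnorm(100, sd = 0.1)    # nearly collinear with x1

    vifs <- sapply(names(preds), function(v) {
      r2 <- summary(lm(reformulate(setdiff(names(preds), v), v), data = preds))$r.squared
      1 / (1 - r2)                                 # VIF_j = 1 / (1 - R^2_j)
    })
    vifs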
2007 Dec 27
1
Lda and Qda
Hi all, I'm working with some data: 54 variables and a column of classes, each observation as one of a possible seven different classes: > var.can3<-lda(x=dados[,c(1:28,30:54)],grouping=dados[,55],CV=TRUE) Warning message: In lda.default(x, grouping, ...) : variables are collinear > summary(var.can3) Length Class Mode class 30000 factor numeric ### why?? I
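On the summary() question: with CV = TRUE, MASS::lda() does not return an object of class "lda" but a plain list holding the leave-one-out results, which is what summary() is describing; for example:

    library(MASS)

    fit.cv <- lda(iris[, 1:4], grouping = iris$Species, CV = TRUE)
    names(fit.cv)                        # "class" and "posterior"
    table(fit.cv$class, iris$Species)    # leave-one-out confusion matrix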
2005 Jun 30
2
Finding out collinearity in regression
Hi, I am trying to find out collinearity in explanatory variables with alias(). I create a data frame: dat <- ds[,sapply(ds,nlevels)>=2] dat$Y <- Response Explanatory variables are factors and the response is a continuous random variable. When I run a regression, I have the following error: fit <- aov( Y ~ . , data = dat) Error in "contrasts<-"(`*tmp*`, value =
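For reference, alias() is applied to a fitted lm/aov object and reports exact linear dependencies among the model terms; a toy illustration with a deliberately redundant numeric predictor (the poster's factors-and-contrasts error is a separate issue):

    d    <- data.frame(x1 = rnorm(30), x2 = rnorm(30))
    d$x3 <- d$x1 + d$x2                  # exact linear dependency
    d$Y  <- rnorm(30)

    fit <- lm(Y ~ x1 + x2 + x3, data = d)
    alias(fit)                           # reports x3 = x1 + x2 (aliased term)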
2003 Jul 23
6
Condition indexes and variance inflation factors
Has anyone programmed condition indexes in R? I know that there is a function for variance inflation factors available in the car package; however, Belsley (1991) Conditioning Diagnostics (Wiley) notes that there are several weaknesses of VIFs: e.g. 1) High VIFs are sufficient but not necessary conditions for collinearity 2) VIFs don't diagnose the number of collinearities and 3) No one has
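If a packaged version is not at hand, condition indexes in the Belsley sense can be computed from the singular values of the column-equilibrated model matrix; a rough sketch (my own illustration, not a drop-in replacement for colldiag):

    fit <- lm(mpg ~ disp + hp + wt, data = mtcars)   # stand-in model

    X <- model.matrix(fit)
    X <- sweep(X, 2, sqrt(colSums(X^2)), "/")        # scale columns to unit length
    d <- svd(X)$d                                    # singular values
    max(d) / d                                       # condition indexes; > ~30 flags trouble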
2012 Nov 15
4
using ifelse to remove NA's from specific columns of a data frame containing strings and numbers
Hi everyone, I have a data frame one of whose columns is a character vector and the rest are numeric, and in debugging a script, I noticed that an ifelse call seems to be coercing the character column to a numeric column, and producing unintended values as a result. Roughly, here's what I tried to do: df: a data frame with, say, the first column as a character column and the second and
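The coercion described happens when the replacement is applied to the whole data frame at once; restricting the ifelse() to the numeric columns avoids it. A small sketch with made-up columns:

    df <- data.frame(id = c("a", "b", "c"),
                     x  = c(1, NA, 3),
                     y  = c(NA, 5, 6),
                     stringsAsFactors = FALSE)

    num     <- sapply(df, is.numeric)    # identify the numeric columns
    df[num] <- lapply(df[num], function(col) ifelse(is.na(col), 0, col))
    df                                   # 'id' stays character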
2004 Jan 21
1
outlier identification: is there a redundancy-invariant substitution for mahalanobis distances?
Dear R-experts, Searching the help archives I found a recommendation to do multivariate outlier identification by mahalanobis distances based on a robustly estimated covariance matrix and compare the resulting distances to a chi^2-distribution with p (number of your variables) degrees of freedom. I understand that compared to euclidean distances this has the advantage of being scale-invariant.
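The recipe referred to is roughly the following, using cov.rob() from MASS for the robust location and scatter estimate (the data and the quantile level are placeholders):

    library(MASS)

    X   <- as.matrix(iris[, 1:4])                 # stand-in data matrix
    rob <- cov.rob(X, method = "mcd")             # robust center and covariance
    d2  <- mahalanobis(X, center = rob$center, cov = rob$cov)

    cutoff <- qchisq(0.975, df = ncol(X))         # chi^2 quantile, p degrees of freedom
    which(d2 > cutoff)                            # candidate multivariate outliers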
2013 May 03
2
how to parallelize 'apply' across multiple cores on a Mac
Hi everyone, I'm trying to use apply (with a call to zoo's rollapply within) on the columns of a 1.5Kx165K matrix, and I'd like to make use of the other cores on my machine to speed it up. (And hopefully also leave more memory free: I find that after I create a big object like this, I have to save my workspace and then close and reopen R to be able to recover memory tied up by R, but
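On the parallel part of the question: on a Mac (or any Unix-alike) the stock fork-based replacement for lapply over columns is parallel::mclapply; a sketch with placeholder sizes and a stand-in for the rollapply work:

    library(parallel)

    m <- matrix(rnorm(1500 * 1000), nrow = 1500)   # placeholder, not the real 1.5K x 165K matrix

    res <- mclapply(seq_len(ncol(m)),
                    function(j) mean(m[, j]),      # stand-in for the per-column rollapply
                    mc.cores = detectCores() - 1)
    res <- unlist(res)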
2009 Mar 13
1
lsfit w/ rank-deficient x
Dear R-devel, It seems that lsfit incorrectly reports coefficients when the input matrix 'x' is rank-deficient, see the example below: ## here values of 'b' and 'c' are incorrectly swapped > x <- cbind(a=rnorm(100), b=0, c=rnorm(100)); y <- rnorm(100); lsfit(x, y)$coef Intercept a b c -0.0227787 0.1042860 -0.1729261 0.0000000 Warning
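For comparison, lm() on the same kind of rank-deficient design drops the aliased column and reports NA for its coefficient rather than shifting values; a reconstruction of the example (random data, so the numbers will differ):

    set.seed(1)
    x <- cbind(a = rnorm(100), b = 0, c = rnorm(100))   # column 'b' makes x rank-deficient
    y <- rnorm(100)

    lsfit(x, y)$coef    # the behaviour questioned in the post
    coef(lm(y ~ x))     # lm() reports NA for the aliased column instead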
2012 Dec 12
2
using 'apply' to apply princomp to an array of datasets
Hi everyone, Suppose I have a 3D array of datasets, where say dimension 1 corresponds to cases, dimension 2 to datasets, and dimension 3 to observations within a dataset. As an example, suppose I do the following: > x <- sample(1:20, 48, replace=TRUE) > datasets <- array(x, dim=c(4,3,2)) Here, for each j=1,2,3, I'd like to think of datasets[,j,] as a single data matrix with
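One way to do what is being described, looping princomp over the second dimension and keeping both the fits and the scores (the dimensions here are small made-up ones):

    set.seed(1)
    datasets <- array(rnorm(4 * 3 * 2), dim = c(4, 3, 2))   # 4 cases, 3 datasets, 2 observations

    fits   <- lapply(seq_len(dim(datasets)[2]),
                     function(j) princomp(datasets[, j, ]))
    scores <- lapply(fits, `[[`, "scores")   # per-dataset principal component scores
    str(scores)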