similar to: Getting bootstrap statistic to work

Displaying 20 results from an estimated 30000 matches similar to: "Getting bootstrap statistic to work"

2002 Sep 24
2
help with bootstrap
Hi there, I'm stuck, but since I just started learning R, this might be a trivial problem. I need to do a bootstrap on the variance among the eigenvalues of a matrix. I can get this variance doing this: >var.eigenvalues=function(x) >var(eigen(cov(x), symmetric = T, only.values = T)$values) but if I try to run: >matrix=read.table("matrix.txt", header=T)
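A minimal sketch of how such a statistic can be wired into boot(); the matrix `dat` below is simulated stand-in data, not the poster's matrix.txt:

```r
library(boot)

# Simulated stand-in for the matrix read from matrix.txt
set.seed(1)
dat <- matrix(rnorm(200), ncol = 4)

# boot() expects the statistic to take the data and an index vector
var.eigenvalues <- function(data, indices) {
  x <- data[indices, , drop = FALSE]   # resampled rows
  var(eigen(cov(x), symmetric = TRUE, only.values = TRUE)$values)
}

b <- boot(dat, statistic = var.eigenvalues, R = 999)
boot.ci(b, type = "perc")
```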
2005 Nov 07
1
Newbie on functions
Hi, I'm trying to write a simple function like case1 <- function (m, cov, Q, R) { theta <- (acos(R/sqrt(Q^3))) beta <- (-2)*sqrt(Q)*cos(theta/3)+m[1]/3 rho1 <- (-2)*sqrt(Q)*cos((theta+2*pi)/3)+m[1]/3 rho2 <- (-2)*sqrt(Q)*cos((theta-2*pi)/3)+m[1]/3 stderrb <- deltamethod( ~(-2)*sqrt(Q)*cos(theta/3)+x1/3,m,cov) stderrr1 <- deltamethod(
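One thing worth noting: msm::deltamethod() only understands variables named x1, x2, ... in its formula, so quantities such as Q and theta have to be substituted in as numeric constants before the call. A minimal sketch of the call shape, with made-up values for m and cov:

```r
library(msm)

# Made-up estimate vector and covariance matrix, just to show the call shape
m   <- c(1.2, 0.8)
cov <- diag(c(0.01, 0.02))

# The formula may only reference x1, x2, ...; anything else (Q, theta, ...)
# must be plugged in as a numeric constant beforehand
se <- deltamethod(~ x1 / (x1 + x2), m, cov)
se
```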
2010 Mar 19
1
Howto get unnormalized eigenvectors?
Hi, I am trying to calculate the angle between the first eigenvectors of different covariance matrices of biological phenotypic traits for different populations. My issue here is that all possibilities to do so seem to normalize the eigenvectors to length 1. Although the help file of eigen() states that using eigen(, symmetric = FALSE, EISPACK = TRUE) skips normalization, this is (I guess) not applicable
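The angle between two eigenvectors does not change when either one is rescaled, so the length-1 normalisation done by eigen() is harmless here. A sketch with simulated covariance matrices:

```r
# Angle (in degrees) between two vectors; invariant to their lengths
angle_deg <- function(a, b) {
  cosang <- sum(a * b) / (sqrt(sum(a^2)) * sqrt(sum(b^2)))
  acos(min(max(cosang, -1), 1)) * 180 / pi   # clamp for numerical safety
}

set.seed(1)
S1 <- cov(matrix(rnorm(200), ncol = 4))   # stand-in covariance matrices
S2 <- cov(matrix(rnorm(200), ncol = 4))

v1 <- eigen(S1, symmetric = TRUE)$vectors[, 1]
v2 <- eigen(S2, symmetric = TRUE)$vectors[, 1]
angle_deg(v1, v2)
```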
2012 Dec 27
2
Bootstrap
Hello, good afternoon. I am trying to bootstrap a model, but I get the following error: "Error in FUN(newX[, i], ...) : unused argument(s) (list(age = c(33, 47, 49, 56, 60, 64, 64, 66, 68, 69, 71, 71, 72, 73, 74, 75, 75, 76, 78, 81, 83, 83, 36, 43, 46, 47, 49, 49, 51, 51, 52, 52, 53, 54, 54, 54, 55, 56, 56, 57, 57, 58, 58, 58, 58, 59, 59, 60, 61, 62, 63, 64, 65, 65, 66, 66,
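That "unused argument(s)" error usually means the statistic handed to boot() does not accept the data set as its first argument. A sketch of the calling convention boot() expects, on simulated data with an assumed `age` column and response `y`:

```r
library(boot)

# Simulated stand-in data with an 'age' column like the one in the error
set.seed(1)
mydata <- data.frame(age = round(rnorm(60, mean = 60, sd = 12)))
mydata$y <- 2 + 0.05 * mydata$age + rnorm(60)

# The statistic must take the data set and an index vector, in that order
stat <- function(data, indices) {
  d <- data[indices, ]
  coef(lm(y ~ age, data = d))
}

boot(mydata, statistic = stat, R = 999)
```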
2000 Sep 29
2
non-ideal behavior in princomp/ not a feature but a bug
... I checked and Brian and I are both right (see bottom for prior mail exchange). Let me explain: ============================================================= 1. Indeed, in principle, princomp allows data matrices which are wider than they are high. Example: > x1 [,1] [,2] [,3] [,4] [1,] 1 1 2 2 [2,] 1 1 2 2 > princomp(x1) Call: princomp(x = x1) Standard deviations:
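In current R, princomp() refuses matrices that are wider than they are tall, while prcomp(), which works from the SVD, handles them. A sketch with the same x1:

```r
# The 2 x 4 example matrix from the message
x1 <- matrix(c(1, 1, 1, 1, 2, 2, 2, 2), nrow = 2)

prcomp(x1)       # works: prcomp() is SVD-based
# princomp(x1)   # in current R this stops: more variables than units
```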
2011 Mar 28
1
maximum likelihood accuracy - comparison with Stata
Hi everyone, I am looking to do some manual maximum likelihood estimation in R. I have done a lot of work in Stata and so I have been using output comparisons to get a handle on what is happening. I estimated a simple linear model in R with lm() and also my own maximum likelihood program. I then compared the output with Stata. Two things jumped out at me. Firstly, in Stata my coefficient
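A minimal sketch of that kind of hand-rolled ML fit next to lm(); differences with Stata typically come down to the variance estimator (ML divides by n, OLS by n - k) and the Hessian used for standard errors:

```r
set.seed(1)
n <- 100
x <- rnorm(n)
y <- 1 + 2 * x + rnorm(n)

# Negative log-likelihood of a Gaussian linear model
negll <- function(par) {
  mu    <- par[1] + par[2] * x
  sigma <- exp(par[3])                     # log-sigma keeps sigma positive
  -sum(dnorm(y, mean = mu, sd = sigma, log = TRUE))
}

fit <- optim(c(0, 0, 0), negll, hessian = TRUE)
fit$par[1:2]       # ML intercept and slope
coef(lm(y ~ x))    # OLS estimates for comparison
```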
2011 Feb 14
4
sem problem - did not converge
Can someone help me? I have tried several things and it never converges # Model library(sem) dados40.cov <- cov(dados40,method="spearman") model.dados40 <- specify.model() F1 -> Item11, lam11, NA F1 -> Item31, lam31, NA F1 -> Item36, lam36, NA F1 -> Item54, lam54, NA F1 -> Item63, lam63, NA F1 -> Item65, lam55, NA F1 -> Item67, lam67, NA F1 ->
2009 Jun 25
2
Error: system is computationally singular: reciprocal condition number
I get this error while computing a partial correlation. *Error in solve.default(Szz) : system is computationally singular: reciprocal condition number = 4.90109e-18* Why is that? Can anyone give me some idea how I can get rid of it? This is the function I use for calculating the partial correlation. pcor.mat <- function(x,y,z,method="p",na.rm=T){ x <- c(x) y <- c(y)
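That error means the matrix being inverted is numerically rank-deficient, usually because some variables are (nearly) collinear. A sketch of diagnosing it with kappa() and working around it with a generalised inverse (dropping the redundant variables is the cleaner fix):

```r
library(MASS)

# A near-singular covariance matrix: one column is almost a copy of another
set.seed(1)
z <- matrix(rnorm(100), ncol = 5)
z <- cbind(z, z[, 1] + 1e-10 * rnorm(20))

Szz <- cov(z)
kappa(Szz)            # enormous condition number: Szz is ill-conditioned
Szz.inv <- ginv(Szz)  # Moore-Penrose pseudo-inverse instead of solve()
```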
2001 Nov 21
2
dw statistic
Hello Uwe First, I want to thank you for spending your time replying to my mail. I'm very impressed with the speed that my question was answered. I'm new at R (about two weeks) and reading your mail made me realize that it was indeed a question of vectors of different lengths. I thought that I could create a function ("carfun") without creating an "x" vector, since
2000 Jun 15
1
prcomp help: is this a typo?
Dear All, The help for prcomp, under "Value" says: sdev: the standard deviation of the principal components (i.e., the eigenvalues of the cov matrix, though the calculation is actually done with the singular values of the data matrix). The way I read it, it implies that the sdev are the eigenvalues, but I think that sdev is actually the square root of the
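A quick check confirms that reading: sdev^2 matches the eigenvalues of the covariance matrix, so sdev itself is the square root of each eigenvalue rather than the eigenvalue.

```r
set.seed(1)
x <- matrix(rnorm(100), ncol = 4)

# sdev^2 recovers the eigenvalues of cov(x), so sdev is their square root
all.equal(prcomp(x)$sdev^2, eigen(cov(x))$values)
```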
2000 Jan 31
1
Feature requests for princomp(.) : Allow cor() specifications
(all in subject). If I want to do a PC analysis in a situation with missing data, I may want to have same flexibility as with "cor(.)", e.g., I may want princomp(x, ..., use.obs = "pairwise.complete") Actually, I may want even more flexibility. Currently, princomp(.) has if (cor) cv <- get("cor", envir = .GlobalEnv)(z) else cv <-
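Short of changing princomp() itself, the correlation matrix can be computed up front with whatever NA handling is wanted and passed in via covmat=; a sketch on simulated data with missing values:

```r
set.seed(1)
x <- matrix(rnorm(200), ncol = 4)
x[sample(length(x), 20)] <- NA                 # sprinkle in missing values

cv <- cor(x, use = "pairwise.complete.obs")    # the flexibility cor() offers
princomp(covmat = cv)                          # PCA on that matrix; no scores
```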
2008 May 16
1
Dimensions of svd V matrix
Hi, I'm trying to do PCA on an n by p wide matrix (n < p), and I'd like to get more principal components than there are rows. However, svd() only returns a V matrix with n columns (instead of p) unless the argument nv=p is set (prcomp calls svd without setting it). Moreover, the eigenvalues returned are always min(n, p) instead of p, even if nv is set: > x <-
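svd() can be asked for the extra right singular vectors, but the number of singular values stays min(n, p) because that is the rank bound; the missing ones are genuinely zero. A sketch:

```r
set.seed(1)
x <- matrix(rnorm(3 * 5), nrow = 3)   # wide: n = 3, p = 5

s <- svd(x, nv = ncol(x))   # request all p right singular vectors
dim(s$v)                    # 5 x 5
length(s$d)                 # still min(n, p) = 3: the rest are zero
```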
2009 Apr 12
3
p-values from bootstrap - what am I not understanding?
Dear stats experts: Me and my little brain must be missing something regarding bootstrapping. I understand how to get a 95%CI and how to hypothesis test using bootstrapping (e.g., reject or not the null). However, I'd also like to get a p-value from it, and to me this seems simple, but it seems no-one does what I would like to do to get a p-value, which suggests I'm not understanding
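One common recipe, sketched below on simulated data for a one-sample mean: shift the bootstrap distribution so it is centred at the null value, then take the proportion of shifted statistics at least as extreme as the observed one. This is only one of several defensible constructions, not the one the thread settled on.

```r
set.seed(1)
x   <- rnorm(30, mean = 0.4)     # simulated sample
obs <- mean(x)                   # observed statistic
h0  <- 0                         # null value for the mean

# Resample, then recentre the bootstrap distribution at the null
boot.stats <- replicate(9999, mean(sample(x, replace = TRUE)))
shifted    <- boot.stats - obs + h0

# Two-sided p-value: how often is the shifted statistic as extreme as obs?
p <- (sum(abs(shifted - h0) >= abs(obs - h0)) + 1) / (length(shifted) + 1)
p
```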
2008 Nov 26
1
Request for Assistance in R with NonMem
Hi I am having some problems running a covariate analysis with my colleague using R with the NonMem program we are using for a graduate school project. R and NonMem run fine without adding in the covariates, but the program is giving us a problem when the covariate analysis is added. We think the problem is with the R code to run the covariate data analysis. We have the control stream, R code
2008 Sep 09
1
Addendum to wishlist bug report #10931 (factanal) (PR#12754)
Hi, on March 10 I filed a wishlist bug report asking for the inclusion of some changes to factanal() and the associated print method. The changes were originally proposed by John Fox in 2005; they make print.factanal() display factor correlations if factanal() is called with rotation =
2003 Apr 11
2
princomp with not non-negative definite correlation matrix
$ R --version R 1.6.1 (2002-11-01). So I would like to perform principal components analysis on a 16X16 correlation matrix, [princomp(covmat=x) where x is the correlation matrix], the problem is princomp complains that it is not non-negative definite. I called eigen() on the correlation matrix and found that one of the eigenvalues is close to zero & negative (-0.001832311). Is there any way
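One way around it, sketched with a small matrix that has the same defect: project the correlation matrix to the nearest positive semi-definite one with Matrix::nearPD() before handing it to princomp().

```r
library(Matrix)

# A 3 x 3 "correlation" matrix with one slightly negative eigenvalue
bad <- matrix(c(1.0, 0.9, 0.7,
                0.9, 1.0, 0.3,
                0.7, 0.3, 1.0), nrow = 3)
eigen(bad, only.values = TRUE)$values       # the last one is negative

# Project to the nearest positive semi-definite correlation matrix
good <- as.matrix(nearPD(bad, corr = TRUE)$mat)
princomp(covmat = good)                     # no longer complains
```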
2005 Jun 24
1
Mahalanobis distances
Dear R community Have just recently got back into R after a long break and have been amazed at how much it has grown, and how active the list is! Thank you so much to all those who contribute to this amazing project. My question: I am trying to calculate Mahalanobis distances for a matrix called "fgmatrix" >dim(fgmatrix) [1] 76 15 >fg.cov <- cov.wt(fgmatrix)
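A sketch of the usual next step, with a simulated stand-in for fgmatrix: mahalanobis() wants the centre and covariance separately, which cov.wt() provides.

```r
# Simulated stand-in for the 76 x 15 matrix
set.seed(1)
fgmatrix <- matrix(rnorm(76 * 15), ncol = 15)

fg.cov <- cov.wt(fgmatrix)                   # list with $center and $cov
d2 <- mahalanobis(fgmatrix, center = fg.cov$center, cov = fg.cov$cov)
head(d2)                                     # squared distances, one per row
```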
2004 Feb 02
1
glm.poisson.disp versus glm.nb
Dear list, This is a question about overdispersion and the ML estimates of the parameters returned by the glm.poisson.disp (L. Scrucca) and glm.nb (Venables and Ripley) functions. Both appear to assume a negative binomial distribution for the response variable. Paul and Banerjee (1998) developed C(alpha) tests for "interaction and main effects, in an unbalanced two-way layout of counts
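For reference, a sketch of the glm.nb() side of that comparison on simulated negative-binomial counts (glm.poisson.disp lives in a contributed package and is not shown here):

```r
library(MASS)

set.seed(1)
d <- data.frame(x = rnorm(200))
d$y <- rnbinom(200, mu = exp(1 + 0.5 * d$x), size = 1.5)

fit <- glm.nb(y ~ x, data = d)
coef(fit)      # regression coefficients
fit$theta      # estimated negative-binomial dispersion parameter
```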
2002 Apr 10
4
Principal Component analysis question
I have a question about princomp(mva) that I hope isn't too newbie. I used the sample data from Table 1.1 in "Manly (1986/1994) Multivariate Statistical Methods: a primer. Chapman and Hall" on sparrow body measurements. I rescaled the data to mean 0 and SD 1, and the covariance matrix is: V1 V2 V3 V4 V5 V1 1.0000000 0.7349642 0.6618119
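Rescaling to mean 0 and SD 1 and then taking the covariance is what princomp(..., cor = TRUE) does internally, so the analysis can also be run straight on the raw measurements; a sketch with stand-in data in place of the Manly sparrow table:

```r
# Stand-in for the sparrow body measurements from Manly
set.seed(1)
sparrows <- matrix(rnorm(49 * 5), ncol = 5)

p <- princomp(sparrows, cor = TRUE)   # correlation-based PCA on raw data

# Its sdev are the square roots of the eigenvalues of the correlation matrix
all.equal(as.numeric(p$sdev), sqrt(eigen(cor(sparrows))$values))
```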