similar to: crossprod vs %*% timing

Displaying 20 results from an estimated 10000 matches similar to: "crossprod vs %*% timing"

2005 Oct 05
2
eliminate t() and %*% using crossprod() and solve(A,b)
Hi, I have a square matrix Ainv of size N-by-N, where N ~ 1000, a rectangular matrix H of size N-by-n, where n ~ 4, and a vector d of length N. I need X = solve(t(H) %*% Ainv %*% H) %*% t(H) %*% Ainv %*% d and H %*% X. It is possible to rewrite X in the recommended crossprod way: X <- solve(quad.form(Ainv, H), crossprod(crossprod(Ainv, H), d)) where quad.form() is a little
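A minimal sketch of the rewrite discussed in this thread, assuming Ainv is symmetric; the data are synthetic stand-ins, and quad.form(Ainv, H) from the post is written here as an explicit crossprod():

set.seed(1)
N <- 1000; n <- 4
M    <- matrix(rnorm(N * N), N, N)
Ainv <- crossprod(M) / N                 # symmetric positive definite stand-in
H    <- matrix(rnorm(N * n), N, n)
d    <- rnorm(N)
AinvH <- Ainv %*% H                      # reused in both factors below
X   <- solve(crossprod(H, AinvH), crossprod(AinvH, d))   # t(H) Ainv H and t(H) Ainv d
fit <- H %*% X                           # the H %*% X the poster also needs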
2002 Mar 15
1
Thought on crossprod
Hi everyone, I do a lot of work with large variance matrices, and I like to use "crossprod" for speed and to keep everything symmetric, i.e. I often compute "crossprod(Q %*% t(A))" for "A %*% Sigma %*% t(A)", where "Sigma" decomposes as "t(Q) %*% Q". I notice in the code that "crossprod", current definition > crossprod function (x,
2006 Nov 21
1
crossprod(x) vs crossprod(x,x)
I found out the other day that crossprod() will take a single matrix argument; crossprod(x) notionally returns crossprod(x,x). The two forms do not return identical matrices: x <- matrix(rnorm(3000000),ncol=3) M1 <- crossprod(x) M2 <- crossprod(x,x) R> max(abs(M1-M2)) [1] 1.932494e-08 But what really surprised me is that crossprod(x) is slower than crossprod(x,x): R>
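A small script that reproduces the comparison described in this thread (timings vary by machine and BLAS):

x  <- matrix(rnorm(3000000), ncol = 3)
M1 <- crossprod(x)               # one-argument form
M2 <- crossprod(x, x)            # two-argument form
max(abs(M1 - M2))                # small but non-zero numerical difference
system.time(for (i in 1:20) crossprod(x))
system.time(for (i in 1:20) crossprod(x, x))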
2010 Mar 27
1
R runs in a usual way, but simulations are not performed
Dear addressees, I need to perform a batch of 10 000 simulations for each of the 4 options considered. (The idea is to obtain the parameter estimates in a heteroskedastic linear regression model - with additive or mixed heteroskedasticity - via the Kenward-Roger small-sample adjusted covariance matrix of disturbances.) For this purpose I wrote an R program which would capture all possible options (true
2005 Jan 27
3
the incredible lightness of crossprod
The following is at least as much out of intellectual curiosity as for practical reasons. On reviewing some code written by novices to R, I came across: crossprod(x, y)[1,1] I thought, "That isn't a very S way of saying that, I wonder what the penalty is for using 'crossprod'." To my surprise the penalty was substantially negative. Handily the client had S-PLUS as
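A sketch of the comparison at issue: crossprod(x, y)[1, 1] against the more idiomatic sum(x * y) for numeric vectors; the relative speed is what surprised the poster and depends on the BLAS in use.

x <- rnorm(1e6)
y <- rnorm(1e6)
system.time(for (i in 1:100) crossprod(x, y)[1, 1])
system.time(for (i in 1:100) sum(x * y))
all.equal(crossprod(x, y)[1, 1], sum(x * y))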
2002 Jul 14
1
crossprod and X %*% t(X)
Hi, the help page for crossprod states that crossprod(A,B) is faster than t(A) %*% B; experimentation certainly bears this out. More alarming is the evidence that crossprod(t(A), B) is faster than A %*% B. On a PII laptop, 128MB memory, win98, R-1.5.0-patched precompiled (no ATLAS): > A <- matrix(rnorm(250000),500,500) > B <- matrix(rnorm(250000),500,500) > for (i in 1:5) {
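The experiment from the post, reconstructed; on a reference BLAS the crossprod() forms typically win, but the picture changes with an optimised BLAS:

A <- matrix(rnorm(250000), 500, 500)
B <- matrix(rnorm(250000), 500, 500)
system.time(for (i in 1:5) t(A) %*% B)
system.time(for (i in 1:5) crossprod(A, B))
system.time(for (i in 1:5) A %*% B)
system.time(for (i in 1:5) crossprod(t(A), B))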
2003 Oct 17
2
Problems with crossprod
Dear R-users, I found a strange problem working with products of two matrices, say: a <- A[i, ] ; crossprod(a) where i is a set of integers selecting rows. When i is empty the result is effectively random: only after some trials does the right answer (a matrix of zeros) appear. --------------- Illustration -------------------- R : Copyright 2003, The R Development Core Team Version 1.8.0
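A short illustration of the zero-row case being reported; in current R this yields a correctly dimensioned matrix of zeros, whereas the 2003 post saw uninitialised values:

A <- matrix(rnorm(20), 5, 4)
i <- integer(0)                  # empty selection of rows
a <- A[i, , drop = FALSE]        # a 0 x 4 matrix
crossprod(a)                     # expected: 4 x 4 matrix of zeros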
2008 Mar 10
1
crossprod is slower than t(AA)%*%BB
Dear R developers, The background for this email is that I was helping a PhD student improve the speed of her R code. I suggested replacing calls like t(AA) %*% BB with crossprod(AA,BB), since I expected this to be faster. The surprising result to me was that this change actually made her code slower. > ## Examples : > > AA <- matrix(rnorm(3000*1000),3000,1000) > BB <-
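The comparison from the thread; whether crossprod() helps depends strongly on the BLAS R is linked against and on the matrix shapes:

AA <- matrix(rnorm(3000 * 1000), 3000, 1000)
BB <- matrix(rnorm(3000 * 1000), 3000, 1000)
system.time(t(AA) %*% BB)
system.time(crossprod(AA, BB))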
2013 Oct 20
5
nlminb() - how do I constrain the parameter vector properly?
Greets, I'm trying to use nlminb() to estimate the parameters of a bivariate normal sample and during one of the iterations it passes a parameter vector to the likelihood function resulting in an invalid covariance matrix that causes dmvnorm() to throw an error. Thus, it seems I need to somehow communicate to nlminb() that the final three parameters in my parameter vector are used to
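One standard way around this (not necessarily the solution adopted in the thread) is to reparameterise so that every unconstrained parameter vector maps to a valid covariance matrix. A sketch for the bivariate normal, using the mvtnorm package and a hypothetical data matrix 'mydata':

library(mvtnorm)
negll <- function(par, x) {
  mu    <- par[1:2]
  sd    <- exp(par[3:4])                 # log-sd: always positive
  rho   <- tanh(par[5])                  # unconstrained value mapped into (-1, 1)
  Sigma <- diag(sd) %*% matrix(c(1, rho, rho, 1), 2, 2) %*% diag(sd)
  -sum(dmvnorm(x, mean = mu, sigma = Sigma, log = TRUE))
}
## fit <- nlminb(start = c(0, 0, 0, 0, 0), objective = negll, x = mydata)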
2003 Oct 30
3
Change in 'solve' for r-patched
The solve function in r-patched has been changed so that it applies a tolerance when using Lapack routines to calculate the inverse of a matrix or to solve a system of linear equations. A tolerance has always been used with the Linpack routines but not with the Lapack routines in versions 1.7.x and 1.8.0. (You can use the optional argument tol = 0 to override this check for computational
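A small illustration of the behaviour described in the announcement; the Hilbert matrix below is numerically singular, so solve() stops with a "computationally singular" error unless tol = 0 is supplied, in which case the returned answer is numerically meaningless:

hilbert <- function(n) outer(1:n, 1:n, function(i, j) 1 / (i + j - 1))
A <- hilbert(15)
inv  <- try(solve(A), silent = TRUE)   # fails the reciprocal-condition-number check
inv0 <- solve(A, tol = 0)              # skips the check; result is unreliable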
2003 Sep 07
3
bug in crossprod? (PR#4092)
The last line of the following code produces a segmentation fault: x <- 1:10 f <- gl(5,2)
2007 Jul 24
1
function optimization: reducing the computing time
Dear useRs, I have written a function that implements a Bayesian method to compare a patient's score on two tasks with that of a small control group, as described in Crawford, J. and Garthwaite, P. (2007). Comparison of a single case to a control or normative sample in neuropsychology: Development of a Bayesian approach. Cognitive Neuropsychology, 24(4):343–372. The function (see
2011 Sep 06
1
repeatable segfault
Hi. macosx 10.6.8 With R-2.13.1 and also revision 56948 I get the following repeatable segfault: wt118:~% R --vanilla --quiet > R.Version() $platform [1] "x86_64-apple-darwin9.8.0" $arch [1] "x86_64" $os [1] "darwin9.8.0" $system [1] "x86_64, darwin9.8.0" $status [1] "" $major [1] "2" $minor [1] "13.1" $year [1]
2013 Mar 05
1
crossprod(): g77 versus gfortran
Hi I've got two builds of R, one using g77 (version 3.4.6) and the other using gfortran (version 4.1.2). The two builds are otherwise identical as far as I can tell. The one which used g77 performs crossprod()s roughly twice as fast as the gfortran one. I'm wondering if this rings a bell with anyone, and if so, are you aware of any configure settings which will improve the performance
2005 Oct 06
3
Singular matrix
Dear All, I have written the following program to find a non-singular (10x10) covariance matrix. Here is the program: nitems <- 10 x <- array(rnorm(5*nitems,3,3), c(5,nitems)) sigma <- t(x)%*%x inverse <- try(solve(sigma), TRUE) while(inherits(inverse, "try-error")) { x <- array(rnorm(5*nitems,3,3), c(5,nitems)) sigma <- t(x)%*%x inverse <-
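Worth noting for this thread: with only 5 rows and 10 columns, t(x) %*% x is 10 x 10 but has rank at most 5, so it is always singular and the while loop can never succeed. A sketch of a version that can succeed draws at least as many rows as columns:

nitems <- 10
x <- matrix(rnorm(5 * nitems * nitems, 3, 3), nrow = 5 * nitems, ncol = nitems)
sigma <- crossprod(x)            # 10 x 10, full rank with probability 1
inverse <- solve(sigma)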
2005 Jun 29
6
x*x*x*... vs x^n
Hi, I have been wondering whether one can speed up calculating small powers of numbers, such as x^8, using multiplication. In addition, one can be a bit clever and calculate x^8 using only 3 multiplies. Look at this: > f1 <- function(x){x*x*x*x*x*x*x*x} > f2 <- function(x){x^8} > f3 <- function(x){x2 <- x*x;x4 <- x2*x2;return(x4*x4)} [so f1() and f2() and f3() are
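The three variants from the post, with a quick timing loop; relative speeds depend on the platform and R version:

f1 <- function(x) x*x*x*x*x*x*x*x
f2 <- function(x) x^8
f3 <- function(x) { x2 <- x * x; x4 <- x2 * x2; x4 * x4 }   # three multiplies
x <- rnorm(1e6)
system.time(for (i in 1:100) f1(x))
system.time(for (i in 1:100) f2(x))
system.time(for (i in 1:100) f3(x))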
2016 Mar 25
2
summary( prcomp(*, tol = .) ) -- and 'rank.'
> On 25 Mar 2016, at 10:41 am, peter dalgaard <pdalgd at gmail.com> wrote: > > As I see it, the display showing the first p << n PCs adding up to 100% of the variance is plainly wrong. > > I suspect it comes about via a mental short-circuit: If we try to control p using a tolerance, then that amounts to saying that the remaining PCs are effectively zero-variance, but
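A small example of the behaviour under discussion, as described in the thread: when tol drops components, summary() reports proportions of variance relative to the retained components only, so they reach 100% even though variance was discarded (illustrative code, not taken from the thread):

set.seed(42)
x <- matrix(rnorm(100 * 10), 100, 10)
summary(prcomp(x))               # proportions computed over all 10 components
summary(prcomp(x, tol = 0.95))   # only the largest components are kept;
                                 # proportions are relative to those alone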
2003 May 08
3
Avoiding loops to spare time and memory
Is it possible to avoid the loop in the following function (or make the function otherwise more efficient) and can someone point me to a possible solution? (It would be great if hours could be reduced to seconds :-). # --------------------------------------------- RanEigen=function(items=x,cases=y,sample=z) { X=matrix(rnorm(cases*items),nrow=cases,byrow=F) S=crossprod(X-rep(1,cases) %*%
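A sketch of one common speed-up for this kind of simulation: centre with scale(), form the covariance with crossprod(), and collect eigenvalues with replicate() instead of growing objects in a loop (the function and argument names below are illustrative, not the poster's):

ran_eigen <- function(items, cases, nsim) {
  t(replicate(nsim, {
    X <- matrix(rnorm(cases * items), nrow = cases)
    S <- crossprod(scale(X, center = TRUE, scale = FALSE)) / (cases - 1)
    eigen(S, symmetric = TRUE, only.values = TRUE)$values
  }))
}
## evals <- ran_eigen(items = 10, cases = 100, nsim = 50)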
2007 Oct 09
1
Multivariate chi-square distribution function
Dear All, Is there any function in R for computing the "multivariate chi-square distribution"? How about the "multivariate gamma distribution"? I would appreciate any comment on this subject. Thank you, Amin Zollanvari PhD student Department of Electrical and Computer Engineering, Texas A&M University, College Station, TX
2007 Aug 13
1
simulate data from multivariate normal with pre-specified correlation matrix
For example, the correlation matrix is 3x3 and looks like 1 0.75 0 0 0 0.75 1 0 0 0 0 0 0 0 0 Can I write the code like this? p<- 3 # number of variables per observation N<- 10 # number of samples # define population correlation matrix sigma sigma<-matrix(0,p,p) #creates a px p matrix of 0 rank<-2 for (i in 1:rank){ for (j in 1:rank){ rho<-0.75
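A sketch of one common approach, following the p <- 3, rho = 0.75 setup in the post (names are illustrative): build sigma explicitly, then draw either with mvtnorm::rmvnorm() or via a Cholesky factor.

p <- 3                                 # variables per observation
N <- 10                                # samples
sigma <- diag(p)
sigma[1, 2] <- sigma[2, 1] <- 0.75     # correlation between the first two variables
## with the mvtnorm package:
## library(mvtnorm); x <- rmvnorm(N, mean = rep(0, p), sigma = sigma)
## or in base R, via the Cholesky factor (sigma must be positive definite):
x <- matrix(rnorm(N * p), N, p) %*% chol(sigma)
cor(x)                                 # roughly reproduces sigma for large N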