Hi,

I was pointed by a request on R-help to the following problem with
ar.ols():

R> set.seed(1)
R> x <- matrix(rnorm(4 * 2), ncol = 2)
R> ar.ols(x, order.max = 1, aic = FALSE, demean = FALSE)
Error in if ((dimension < 1) | (dimension > n)) stop("wrong embedding dimension") :
  argument is of length zero
In addition: Warning message:
In log(det(varE[[m - order.min + 1L]])) : NaNs produced

This happens on my 32-bit Debian (i686-pc-linux-gnu), both in R-release
and R-devel.

The source of the problem is a numerical instability in the computation
of the error variance and the subsequent AIC:

    YH <- A[[m - order.min + 1L]] %*% t(X)
    E <- (Y - YH)
    varE[[m - order.min + 1L]] <- E %*% t(E)/N
    [...]
    aic[m - order.min + 1L] <- n.used * log(det(varE[[m - order.min + 1L]])) +
        2 * nser * (nser * m + intercept)

varE is the cross-product of the errors E and should be positive
semi-definite, so its determinant should be non-negative. Here, however,
det(varE[[1]]) is -6.920697e-17 (on my machine), so taking the log gives
NaN, which yields an AIC of NaN for which the minimum cannot be
determined.

Of course, it does not make much sense to fit such a VAR model, but
either a more meaningful error message or a workaround would be useful.
For example, one could use

    log(max(0, det(varE[[m - order.min + 1L]])))

to avoid the negative-determinant problem.

Best,
Z
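
PS: A minimal standalone sketch of the suggested guard, outside of
ar.ols() (the helper name logdet_nonneg is mine, not anything in stats):
clamping a tiny negative determinant to zero turns the NaN into -Inf,
which which.min() can still handle.

    logdet_nonneg <- function(V) {
        ## round-off on a (numerically) rank-deficient cross-product can make
        ## det(V) come out slightly negative; clamp it to zero so log()
        ## returns -Inf instead of NaN
        log(max(0, det(V)))
    }

R> V <- matrix(c(1e-17, 1e-9, 1e-9, 1e-17), 2)  # det(V) is about -1e-18
R> log(det(V))
[1] NaN
Warning message:
In log(det(V)) : NaNs produced
R> logdet_nonneg(V)
[1] -Inf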