similar to: Generating a stochastic matrix with a specified second dominant eigenvalue

Displaying 20 results from an estimated 5000 matches similar to: "Generating a stochastic matrix with a specified second dominant eigenvalue"

2009 Dec 01
1
eigenvalues of complex matrices
Dear all, I want to compute the eigenvalues of a complex matrix for some statistics. Comparing R to its matlab/octave siblings, I don't get the same eigenvalues when computing them from the exact same matrix. In R I used eigen() and arpack(), which give different eigenvalues from each other. In matlab/octave I used eig() and eigs(), which agree with each other but differ from the R results. For real
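The usual explanation for this is that the eigenvalues of a general matrix are only defined up to ordering (and the eigenvectors up to scaling), so different libraries may report them in a different order without either being wrong. A minimal sketch of a like-for-like comparison, sorting by modulus first (the 3x3 complex matrix below is invented for illustration):

    ## Invented 3 x 3 complex matrix, just for illustration
    A <- matrix(complex(real      = c(1, 0, 2, 3, -1, 0, 0, 4, 1),
                        imaginary = c(0, 1, 0, 0,  2, 0, -1, 0, 0)),
                nrow = 3)

    ## eigen() returns eigenvalues in an order chosen by LAPACK;
    ## sort by modulus before comparing with eig()/eigs() output elsewhere
    ev <- eigen(A, only.values = TRUE)$values
    ev[order(Mod(ev), decreasing = TRUE)]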
2009 Nov 25
1
Re: chol( neg.def.matrix ) WAS: Re: Choleski and Choleski with pivoting of matrix fails
Dear Peter, thank you very much for your answer. My problem is that I need to calculate the following quantity: solve(chol(A)%*%Y). Y is a 3*3 diagonal matrix and A is a 3*3 matrix. Unfortunately one eigenvalue of A is negative. I can still take a square root of A, but when I multiply it by Y the imaginary part of the square root of A is dropped, and I do not get the right answer. I tried
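Since chol() needs a positive definite matrix, a matrix with a negative eigenvalue has no real Cholesky factor; a common workaround is to build a (complex) matrix square root from the eigendecomposition and keep the complex arithmetic all the way through. A sketch under made-up inputs (A and Y below are placeholders, not the poster's data):

    ## Placeholder inputs: symmetric A with one negative eigenvalue, diagonal Y
    A <- matrix(c(2, 1, 0,  1, -1, 0.5,  0, 0.5, 1), nrow = 3)
    Y <- diag(c(1, 2, 3))

    ## Matrix square root via the eigendecomposition; complex when an eigenvalue is negative
    e <- eigen(A)
    sqrtA <- e$vectors %*% diag(sqrt(as.complex(e$values))) %*% solve(e$vectors)

    ## The quantity asked about, keeping the complex part instead of dropping it
    solve(sqrtA %*% Y)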
2006 Jun 24
2
smoothing splines and degrees of freedom
Hi, If I set df=2 in my smooth.spline function, is that equivalent to running a linear regression through my data? It appears that df=# of data points gives the interpolating spline and that df = 2 gives the linear regression, but I just want to confirm this. Thank you, Steven
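One way to check the df = 2 question empirically is to compare the fitted values from smooth.spline() with those from lm() on the same data; with df = 2 the penalty pushes the spline to the least-squares line, so the two should differ only by the numerical tolerance with which df = 2 is matched. A short sketch on simulated data:

    set.seed(1)
    x <- sort(runif(100))
    y <- sin(2 * pi * x) + rnorm(100, sd = 0.3)

    fit.ss <- smooth.spline(x, y, df = 2)
    fit.lm <- lm(y ~ x)

    ## Compare fitted values at the observed x
    max(abs(predict(fit.ss, x)$y - fitted(fit.lm)))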
2010 Jun 03
3
General-purpose GPU computing in statistics (using R)
Hi All, I have been reading about general-purpose GPU (graphics processing unit) computing for computational statistics. I know very little about this, but I read that GPUs currently cannot handle double-precision floating point and also that they are not necessarily IEEE compliant. However, I am not sure what the practical impact of this limitation is likely to be on computational
2004 Dec 03
3
Computing the minimal polynomial or, at least, its degree
Hi, I would like to know whether there exist algorithms to compute the coefficients or, at least, the degree of the minimal polynomial of a square matrix A (over the field of complex numbers)? I don't know whether this would require symbolic computation. If not, has any of the algorithms been implemented in R? Thanks very much, Ravi. P.S. Just for the sake of completeness, a
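One purely numerical route to the degree (not the coefficients) is to stack vec(A^0), vec(A^1), vec(A^2), ... as columns and watch the rank: the degree of the minimal polynomial is the smallest k for which A^k is a linear combination of the lower powers. A rough sketch, with the caveat that the rank decisions rest on a tolerance passed to qr():

    ## Degree of the minimal polynomial of A, up to numerical rank decisions
    minpoly.degree <- function(A, tol = 1e-8) {
      n <- nrow(A)
      P <- diag(n)                       # current power, A^0 = I
      K <- matrix(c(P), ncol = 1)        # columns are vec(A^0), vec(A^1), ...
      for (k in 1:n) {
        P <- P %*% A
        K2 <- cbind(K, c(P))
        if (qr(K2, tol = tol)$rank == qr(K, tol = tol)$rank)
          return(k)                      # A^k depends linearly on lower powers
        K <- K2
      }
      n
    }

    ## Example: minimal polynomial (x - 2)^2, so the degree is 2
    A <- matrix(c(2, 1, 0,  0, 2, 0,  0, 0, 2), nrow = 3, byrow = TRUE)
    minpoly.degree(A)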
2009 Jul 02
1
lokern package
Dear Martin, I have been playing a lot with the glkerns() function in the "lokern" package for "automatic" smoothing of time-series data. This kernel smoothing approach of Gasser and Mueller seems to perform quite well for estimating the function and its derivatives (first and second derivatives). In fact, this is one of the best methods based on my simulation studies for
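For reference, the call pattern is roughly as sketched below; treat the argument and component names (deriv, x.out, est) as my recollection of the lokern interface rather than gospel, and check ?glkerns:

    library(lokern)   # assumed installed

    set.seed(1)
    x <- sort(runif(200, 0, 2 * pi))
    y <- sin(x) + rnorm(200, sd = 0.2)

    fit0 <- glkerns(x, y)               # estimate of the regression function
    fit1 <- glkerns(x, y, deriv = 1)    # estimate of its first derivative
    plot(fit1$x.out, fit1$est, type = "l",
         xlab = "x", ylab = "estimated f'(x)")   # should resemble cos(x)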
2007 Jun 29
4
Dominant eigenvector displayed as third (Marco Visser)
Dear R users & Experts, This is just a curiousity, I was wondering why the dominant eigenvetor and eigenvalue of the following matrix is given as the third. I guess this could complicate automatic selection procedures. 0 0 0 0 0 5 1 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 Please
2009 Jul 02
2
constrained optimisation in R.
I want to estimate parameters by maximum likelihood subject to constraints (constant numbers), for example sum(Ai)=0 and sum(Bi)=0. I have done it without the constraints, but I realised that I have to use the constraints. Without constraints (just a part, not complete): skellamreg_LL=function(parameters,z,design) { n=length(z); mu=parameters[1]; H=parameters[2]; Apar=parameters[3:10];
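Sum-to-zero constraints like these are often handled by reparameterising rather than by a constrained optimiser: estimate only the first k-1 of the Ai freely and set the last one to minus their sum inside the likelihood, so plain optim() or nlminb() can be used. A schematic sketch (the objective and dimensions are made up, not the poster's skellamreg_LL):

    ## Schematic: k "A" effects constrained to sum to zero
    k <- 8
    negLL <- function(par, z) {
      A.free <- par[1:(k - 1)]
      A <- c(A.free, -sum(A.free))   # enforces sum(A) == 0 by construction
      ## ... the real (negative) log-likelihood of z given A would go here ...
      sum((z - A)^2)                 # placeholder objective, for illustration only
    }

    z <- rnorm(k)
    fit <- optim(rep(0, k - 1), negLL, z = z, method = "BFGS")
    A.hat <- c(fit$par, -sum(fit$par))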
2009 Nov 12
2
A combinatorial optimization problem: finding the best permutation of a complex vector
Hi, I have a complex-valued vector X in C^n. Given another complex-valued vector Y in C^n, I want to find a permutation of Y, say, Y*, that minimizes ||X - Y*||, the distance between X and Y*. Note that this problem can be trivially solved for "Real" vectors, since real numbers possess the ordering property. Complex numbers, however, do not possess this property. Hence the
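Minimising ||X - Y*|| over permutations is a linear assignment problem in the matrix of squared pairwise distances, so it can be solved exactly without searching all n! permutations, for example with the Hungarian-algorithm solver in the clue package (assumed installed). A sketch:

    library(clue)   # for solve_LSAP(); assumed installed

    set.seed(1)
    n <- 6
    X <- complex(real = rnorm(n), imaginary = rnorm(n))
    Y <- complex(real = rnorm(n), imaginary = rnorm(n))

    ## cost[i, j] = |X[i] - Y[j]|^2 ; minimise the total cost over permutations
    cost  <- Mod(outer(X, Y, "-"))^2
    perm  <- solve_LSAP(cost)
    Ystar <- Y[as.integer(perm)]
    sqrt(sum(Mod(X - Ystar)^2))   # the minimised distance ||X - Y*||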
2004 Jun 28
3
How to determine the number of dominant eigenvalues in PCA
Dear All, I want to know if there is some easy and reliable way to estimate the number of dominant eigenvalues when applying PCA to a sample covariance matrix. Assume the x-axis is the eigenvalue index (1, 2, ..., n) and the y-axis is the corresponding eigenvalue (a1, a2, ..., an) arranged in descending order, so this x-y plot will be a decreasing curve. Someone mentioned using the elbow (knee)
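The usual quick diagnostics are exactly that scree plot plus the cumulative proportion of variance explained; a threshold (say 90%) or a visual elbow then fixes the number of components, though neither rule is formal. A small sketch on simulated data:

    set.seed(1)
    X <- matrix(rnorm(200 * 10), ncol = 10) %*% diag(c(5, 3, 2, rep(0.5, 7)))

    ev <- eigen(cov(X), symmetric = TRUE)$values
    plot(ev, type = "b", xlab = "component", ylab = "eigenvalue")   # scree plot

    cumsum(ev) / sum(ev)                      # cumulative proportion of variance
    which(cumsum(ev) / sum(ev) >= 0.9)[1]     # e.g. smallest k reaching 90%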
2009 Apr 26
1
Stochastic Gradient Ascent for logistic regression
Hi guys, I am trying to write my own Stochastic Gradient Ascent for logistic regression in R, but it seems that I am having a convergence problem. Am I doing anything wrong, or is the data just off? Here is my code in R - lbw <- read.table("http://www.biostat.jhsph.edu/~ririzarr/Teaching/754/lbw.dat", header=TRUE) attach(lbw) lbw[1:2,] low age lwt race smoke ptl ht ui ftv
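A very common cause of apparent non-convergence here is a step size that does not decay, or covariates on wildly different scales. Below is a minimal, self-contained sketch of stochastic gradient ascent for the logistic log-likelihood on simulated data (not the lbw data, and the step-size schedule is just one reasonable choice):

    set.seed(1)
    n <- 5000
    X <- cbind(1, scale(matrix(rnorm(n * 2), ncol = 2)))   # intercept + 2 scaled covariates
    beta.true <- c(-0.5, 1, -2)
    y <- rbinom(n, 1, plogis(X %*% beta.true))

    beta <- rep(0, ncol(X))
    for (t in 1:(20 * n)) {
      i <- sample(n, 1)
      p <- plogis(sum(X[i, ] * beta))
      beta <- beta + 0.5 / (1 + t / 1000) * (y[i] - p) * X[i, ]   # decaying step size
    }
    beta                                       # should be roughly near beta.true
    coef(glm(y ~ X - 1, family = binomial))    # reference fit for comparison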
2012 Apr 19
3
Solve an ordinary or generalized eigenvalue problem in R?
Folks: I'm trying to port some code from Python over to R, and I'm running into a wall finding R code that can solve a generalized eigenvalue problem following this function model: http://docs.scipy.org/doc/scipy/reference/generated/scipy.linalg.eig.html Any ideas? I don't want to call Python from within R for various reasons; I'd prefer a "native" R solution if one
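There is no direct base-R twin of scipy.linalg.eig(a, b), but when B is invertible the generalized problem A v = lambda B v reduces to an ordinary one, and the contributed geigen package (if installed) offers a QZ-based solver closer to the SciPy interface. A hedged sketch of both routes:

    set.seed(1)
    A <- matrix(rnorm(16), 4, 4)
    B <- crossprod(matrix(rnorm(16), 4, 4)) + diag(4)   # an invertible B

    ## Route 1 (base R): reduce A v = lambda B v to an ordinary eigenproblem
    eigen(solve(B, A))$values

    ## Route 2 (contributed package, assumed installed): QZ-based solver
    ## library(geigen); geigen(A, B)$values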
2012 Mar 09
1
Eigenvalue calculation of sparse matrices
Dear all, I am currently working on the calculation of eigenvalues (and -vectors) of large matrices. Since these are mostly sparse matrices, and I remember some specific functionality in MATLAB for sparse matrices, I started researching how to optimize the calculation of eigenvalues of a sparse matrix. The function eigen itself works with the LAPACK library, which has no special handling for
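For a handful of extreme eigenvalues of a large sparse matrix, the usual route in R is an iterative Arnoldi/Lanczos solver that only needs matrix-vector products, e.g. eigs() from the RSpectra package applied to a Matrix-package sparse matrix, instead of the dense eigen(). A small sketch, assuming both packages are installed:

    library(Matrix)     # sparse matrix classes (dgCMatrix etc.)
    library(RSpectra)   # eigs()/eigs_sym(): iterative eigenvalue solvers

    set.seed(1)
    n <- 2000
    M <- rsparsematrix(n, n, density = 0.001)   # random sparse matrix

    ## 5 eigenvalues of largest magnitude, without ever forming a dense matrix
    eigs(M, k = 5, which = "LM")$values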
2008 Jan 14
1
stochastic growth rate (package popbio)
Dear all, I am running matrix population models using the package "popbio". In a deterministic model {i.e., the transition matrix is defined as A <- matrix(c(0.70, 0.70, 0.35, 0.50), nrow=2, byrow=TRUE)}, the population growth rate can be estimated from the dominant eigenvalue {command "eigen.analysis"}. However, I cannot figure out the way to compute the asymptotic stochastic population
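The asymptotic stochastic growth rate is usually approximated by simulation: draw a transition matrix at random each year, project the population, and average the one-step log growth rates (if I remember the popbio interface correctly, stoch.growth.rate() does much the same from a list of matrices, so treat that name as an assumption). A by-hand sketch with two invented environments:

    ## Two invented 2 x 2 environments, drawn i.i.d. over years
    A1 <- matrix(c(0.70, 0.70, 0.35, 0.50), nrow = 2, byrow = TRUE)
    A2 <- matrix(c(0.40, 0.90, 0.25, 0.45), nrow = 2, byrow = TRUE)
    mats <- list(A1, A2)

    set.seed(1)
    nyr <- 50000
    n <- c(1, 1)
    loglam <- numeric(nyr)
    for (t in 1:nyr) {
      A  <- mats[[sample(length(mats), 1)]]   # this year's environment
      n1 <- A %*% n
      loglam[t] <- log(sum(n1) / sum(n))      # one-step log growth rate
      n <- n1 / sum(n1)                       # rescale to avoid overflow
    }
    mean(loglam)        # log of the asymptotic stochastic growth rate
    exp(mean(loglam))   # the stochastic growth rate itself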
2010 May 31
0
Documentation of biplot for princomp
Hi, I think that the documentation for the biplot function `biplot.princomp' is inconsistent with what it actually does. Here is what the documentation states: pc.biplot If true, use what Gabriel (1971) refers to as a "principal component biplot", with lambda = 1 and observations scaled up by sqrt(n) and variables scaled down by sqrt(n). Then inner products between
2007 Nov 28
1
simulating a 2-parameter integrated Ornstein-Uhlenbeck process?
Hello everyone, I'm trying to simulate a 2-parameter integrated Ornstein-Uhlenbeck (IOU) process, but I'm not sure exactly where to start (which package, which function). The motivation is the paper by Taylor et al. (JASA 1994), "A stochastic model for the analysis of longitudinal AIDS data." The model they suggest consists of a combination of fixed and random effects, a
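Without a dedicated package, one workable approach is to simulate the OU velocity process on a fine grid using its exact Gaussian one-step transition and then cumulate it with the trapezoidal rule; the (alpha, sigma) parametrisation below is the generic dW = -alpha W dt + sigma dB form, not necessarily the exact notation of Taylor et al. A sketch:

    set.seed(1)
    alpha <- 0.5; sigma <- 1           # OU mean-reversion and diffusion parameters
    dt <- 0.01; nstep <- 5000
    tt <- seq(0, by = dt, length.out = nstep + 1)

    ## OU process W via its exact Gaussian one-step transition
    W <- numeric(nstep + 1)
    for (i in 1:nstep) {
      m <- W[i] * exp(-alpha * dt)
      v <- sigma^2 * (1 - exp(-2 * alpha * dt)) / (2 * alpha)
      W[i + 1] <- rnorm(1, mean = m, sd = sqrt(v))
    }

    ## Integrated OU: X(t) = integral of W over [0, t], trapezoidal rule
    X <- c(0, cumsum((W[-1] + W[-length(W)]) / 2 * dt))
    plot(tt, X, type = "l", xlab = "t", ylab = "integrated OU path")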
2003 Apr 15
1
Simulation of Stochastic processes
Hi: I was wondering whether I can find some help for computer simulation of stochastic processes (e.g. Brownian motion), for pedagogical/instructional purposes. Any help would be appreciated. Thanks, Ravi.
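For teaching purposes, standard Brownian motion on [0, T] can be simulated simply as the cumulative sum of independent N(0, dt) increments. A minimal sketch:

    set.seed(1)
    Tend <- 1; n <- 1000; dt <- Tend / n
    tt <- seq(0, Tend, by = dt)
    B <- c(0, cumsum(rnorm(n, mean = 0, sd = sqrt(dt))))
    plot(tt, B, type = "l", xlab = "t", ylab = "B(t)", main = "One Brownian path")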
2012 Apr 27
2
find the eigenvector corresponding to the largest eigenvalue
Hi, If I use the eigen() function to find the eigenvalues of a matrix, how can I find the eigenvector corresponding to the largest eigenvalue? Thanks!
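The eigenvalues and the columns of $vectors returned by eigen() are in the same order, so the eigenvector for the largest eigenvalue can be picked out by indexing on the eigenvalues (Mod() covers the complex case). A short sketch on an arbitrary example matrix:

    A  <- matrix(c(2, 1, 1, 3), nrow = 2)   # any example matrix
    ev <- eigen(A)
    i  <- which.max(Mod(ev$values))         # position of the largest eigenvalue
    ev$values[i]
    ev$vectors[, i]                         # the corresponding eigenvector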
2009 Jul 17
6
Solving two nonlinear equations with two unknowns
Dear R users, I have two nonlinear equations, f1(x1,x2)=0 and f2(x1,x2)=0. I tried using the optim command to minimize f1^2+f2^2 and find x1 and x2, but I found that the optimal solution changes when I change the initial values. How can I solve this? BTW, I also tried grid searching, but I have no information on the ranges of x1 and x2. Any suggestion for solving this question? Thanks, Kate
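Minimising f1^2 + f2^2 can stall at a local minimum that is not a root, which is why the answer moves with the starting value; the usual remedies are a genuine multivariate root finder (e.g. the nleqslv or rootSolve packages, if installed) or many random restarts of the least-squares formulation, keeping only solutions where the objective is essentially zero. A multi-start sketch with base optim() on a made-up pair of equations:

    f1 <- function(x) x[1]^2 + x[2]^2 - 4     # made-up example system
    f2 <- function(x) exp(x[1]) - x[2] - 1
    obj <- function(x) f1(x)^2 + f2(x)^2

    set.seed(1)
    starts <- matrix(runif(2 * 50, -5, 5), ncol = 2)   # 50 random starting points
    fits <- lapply(seq_len(nrow(starts)),
                   function(j) optim(starts[j, ], obj, method = "BFGS"))
    vals <- sapply(fits, `[[`, "value")
    best <- fits[[which.min(vals)]]
    best$par       # candidate root
    best$value     # should be essentially 0 if a genuine solution was found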
2005 Jun 06
3
(Off topic.) Observed Fisher information.
I have been building an R function to calculate the ***observed*** (as opposed to expected) Fisher information matrix for parameter estimates in a rather complicated setting. I thought I had it working, but I am getting a result which is not positive definite. (One negative eigenvalue. Out of 10.) Is it the case that the observed Fisher information must be positive definite --- thereby
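At a strict local maximum of the log-likelihood the observed information (the negative Hessian of the log-likelihood at the estimate) must be positive semi-definite, and positive definite unless the surface is flat in some direction; a negative eigenvalue usually means the optimiser never reached a maximum or the numerical Hessian is inaccurate. One way to cross-check a hand-coded Hessian is the numDeriv package (assumed installed), sketched here on a placeholder normal log-likelihood rather than the poster's complicated setting:

    library(numDeriv)   # numerical gradients and Hessians; assumed installed

    ## Placeholder log-likelihood: normal sample, parameters (mu, log sigma)
    set.seed(1)
    x <- rnorm(50, mean = 1, sd = 2)
    loglik <- function(th) sum(dnorm(x, mean = th[1], sd = exp(th[2]), log = TRUE))

    mle <- optim(c(0, 0), loglik, control = list(fnscale = -1))
    obs.info <- -hessian(loglik, mle$par)        # observed information at the optimum
    eigen(obs.info, symmetric = TRUE)$values     # should all be positive here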