Displaying 20 results from an estimated 26 matches for "orthonorm".
2003 Sep 01
1
Gram-Schmidt orthonormal factorization
Hi:
Does R have a function, like GSORTH in SAS, that performs the Gram-Schmidt
orthonormal factorization of the m × n matrix A, where m is greater than or
equal to n? That is, the GSORTH subroutine in SAS computes the
column-orthonormal m × n matrix P and the upper triangular n × n matrix T such
that A = P*T.
Or is there any other version of the Gram-Schmidt orthonormal factorization?
I searched the h...
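For readers searching the archive: base R's qr() already provides both GSORTH factors. A minimal sketch with a made-up matrix (column signs may differ from classical Gram-Schmidt output):

```r
# Emulating SAS GSORTH (A = P %*% T) with base R's qr().
set.seed(1)
A <- matrix(rnorm(12), nrow = 4, ncol = 3)  # m = 4 >= n = 3

qrA <- qr(A)
P <- qr.Q(qrA)  # column-orthonormal m x n matrix
T <- qr.R(qrA)  # upper triangular n x n matrix

# The factorization reproduces A and P has orthonormal columns:
max(abs(A - P %*% T))             # ~ 0
max(abs(crossprod(P) - diag(3)))  # ~ 0
```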
2004 Feb 23
2
orthonormalization with weights
Hello List,
I would like to orthonormalize the vectors contained in a matrix X, taking into
account row weights (the diagonal matrix D), i.e. I want to obtain Z=XA with
t(Z)%*%D%*%Z=diag(1)
I can do the Gram-Schmidt orthogonalization with subsequent weighted
regressions. I know that in the case of uniform weights, qr can do the
trick. I wo...
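One way to get exactly this constraint (a sketch, not from the thread): scale the rows by the square roots of the weights, apply qr(), and unscale. The result Z is still of the form X %*% A (with A the inverse of the R factor):

```r
# Weighted orthonormalization: find Z = X %*% A with t(Z) %*% D %*% Z = I.
set.seed(2)
X <- matrix(rnorm(20), 5, 4)
w <- runif(5)               # positive row weights (made-up data)
D <- diag(w)

qrWX <- qr(diag(sqrt(w)) %*% X)         # qr of D^(1/2) %*% X
Z <- diag(1 / sqrt(w)) %*% qr.Q(qrWX)   # Z = D^(-1/2) %*% Q = X %*% solve(R)

max(abs(t(Z) %*% D %*% Z - diag(4)))    # ~ 0: weighted orthonormality holds
```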
2008 Nov 03
1
qr() and Gram-Schmidt
...0)
c <- c(2,1,0)
x <- matrix(c(a,b,c),3,3)
##########################
# Gram-Schmidt
##########################
A <- matrix(a,3,1)
q1 <- (1/sqrt(sum(A^2)))*A
B <- b - (q1%*%b)%*%q1
q2 <- (1/sqrt(sum(B^2)))*B
C <- c - (q1%*%c)%*%q1 - (q2%*%c)%*%q2
q3 <- (1/sqrt(sum(C^2)))*C
Orthonormal.basis <- matrix(c(q1,q2,q3),3,3)
> Orthonormal.basis
[,1] [,2] [,3]
[1,] 0.7071068 0.7071068 0
[2,] 0.0000000 0.0000000 1
[3,] 0.7071068 -0.7071068 0
##########################
# QR Factorisation X = QR
##########################
x.qr <- qr(x)
Q...
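The snippet above is truncated, but the expected outcome can be sketched with hypothetical vectors (a, b, c0 below are stand-ins, not the thread's data): qr.Q() agrees with hand-rolled Gram-Schmidt up to the signs of the columns.

```r
# Hand-rolled Gram-Schmidt vs qr.Q(), with made-up vectors.
a <- c(1, 0, 1); b <- c(1, 2, 0); c0 <- c(2, 1, 0)
x <- cbind(a, b, c0)

proj <- function(u, v) sum(u * v) * u   # projection onto unit vector u
q1 <- a / sqrt(sum(a^2))
b2 <- b - proj(q1, b)
q2 <- b2 / sqrt(sum(b2^2))
c2 <- c0 - proj(q1, c0) - proj(q2, c0)
q3 <- c2 / sqrt(sum(c2^2))
GS <- cbind(q1, q2, q3)

Q <- qr.Q(qr(x))
max(abs(abs(GS) - abs(Q)))  # ~ 0: columns agree up to sign
```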
2003 Aug 13
3
A question on orthogonal basis vectors
Hey, R-listers,
I have a question about determining the orthogonal
basis vectors.
In d-dimensional space, if I already know
the first r orthogonal basis vectors, should I be
able to determine the remaining d-r orthogonal basis
vectors automatically?
Or is the answer not unique?
Thanks for your attention.
Fred
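The completion exists but is not unique (any rotation of the remaining d-r vectors works). One convenient choice comes from qr() with complete = TRUE; a sketch with random data:

```r
# Completing r orthonormal vectors to a full orthonormal basis of R^d.
# qr() just returns one of infinitely many valid completions.
set.seed(3)
d <- 4; r <- 2
V <- qr.Q(qr(matrix(rnorm(d * r), d, r)))  # r orthonormal columns (made up)
full <- qr.Q(qr(V), complete = TRUE)       # d x d orthogonal matrix

max(abs(crossprod(full) - diag(d)))        # ~ 0: full orthonormal basis
max(abs(crossprod(V, full[, (r + 1):d])))  # ~ 0: new columns orthogonal to V
```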
2003 Feb 14
2
How to solve A'A=S for A
It is not clear to me that one can. If the singular value decomposition
of A is the triple product P d Q', then the singular value decomposition
of A'A=S is Q d^2 Q'. The information about the orthonormal matrix P is
lost, is it not?
**********************************************************
Cliff Lunneborg, Professor Emeritus, Statistics &
Psychology, University of Washington, Seattle
Visiting: Melbourne, Feb-May 1999, Brisbane, Jun-Aug 1999,
Sydney, Sep-Nov 1999, Perth, Dec 1999-Feb 2000
cli...
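This point can be checked numerically: chol() produces one particular A with t(A) %*% A = S, and left-multiplying by any orthogonal Q yields another, so the factor P is indeed unrecoverable. A sketch with a made-up S:

```r
# A is determined by t(A) %*% A = S only up to a left orthogonal factor.
set.seed(4)
B <- matrix(rnorm(9), 3, 3)
S <- crossprod(B)            # a symmetric positive definite S (made up)
A <- chol(S)                 # one particular A, upper triangular
max(abs(crossprod(A) - S))   # ~ 0
# Any Q with t(Q) %*% Q = I gives another solution Q %*% A.
```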
2007 Feb 13
1
Questions about results from PCAproj for robust principal component analysis
...the standard deviations of the
components in
order by descending value; the squares are the eigenvalues of the
covariance matrix
- the matrix, loadings, has dimension CxC, and the columns are the
eigenvectors of the
covariance matrix, in the same order as the sdev vector; the columns are
orthonormal:
sum(dmpca$loadings[,i]*dmpca$loadings[,j]) = 1 if i == j, ~ 0 if i != j
- the vector, center, of length C, contains the means of the variable
columns in the original
data matrix, in the same order as the original columns
- the vector, scale, of length C, contains the scalings applied to eac...
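The orthonormality check quoted above can be written compactly with crossprod(); a sketch using base R's princomp and the built-in USArrests data (since pcaPP and the poster's dmpca object are not available here):

```r
# Checking that PCA loadings columns are orthonormal in one shot:
# crossprod(L) = t(L) %*% L should be the identity matrix.
pc <- princomp(USArrests)
L <- unclass(pc$loadings)                # C x C matrix of eigenvectors
max(abs(crossprod(L) - diag(ncol(L))))   # ~ 0
```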
2003 Jul 11
1
How to generate regression matrix with correlation matrix
Dear R community:
I want to simulate a regression matrix which is generated from an orthonormal matrix X of dimension 30*10 with different between-column pairwise correlation coefficients generated from uniform distribution U(-1,1).
Thanks in advance!
Rui
2003 Jul 12
1
More clear statement about the question of how to generate regression matrix with correlation matrix
...o and the ridge in a simulation of a linear regression model of 30 observations and 10 regressors Y = beta0 + beta1*x1 + ... + beta10*x10 + epsilon, where epsilon follows a normal distribution with mean mu and standard deviation sigma. Ten regression matrices {X}m, m=1,...,10, are generated from an orthonormal matrix X of dimension 30*10 with different between-column pairwise correlation coefficients {rho}m generated from uniform distribution U(-1, 1).
Thanks in advance.
Rui
2011 Dec 13
2
Inverse matrix using eigendecomposition
...es
E<-eigen(m, sym=TRUE)
Q<-E$vectors
V<-E$values
n<-nrow(m)
##normalize the eigenvectors
for(i in 1:n){
Q[,i]<-Q[,i]/sqrt(sum(Q[,i]^2))
}
##verify dot product of vectors are orthogonal
sum(Q[,1]*Q[,2])
sum(Q[,1]*Q[,3])
sum(Q[,2]*Q[,3])
##Begin creating QDQ^T matrix. Where Q are orthonormal eigenvectors, and D
is a diagonal matrix with 1/eigenvalues on the diagonal. and Q^T is the
transpose of Q.
R<-t(Q)
D<-mat.or.vec(n,n)
for(i in 1:n) {
D[i,i]<-1/V[i]
}
P<-Q*D*R
## P should be the inverse of the matrix m. Check using
solve(m)
## solve(m) does not equal P? Any...
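The likely culprit, sketched with a made-up symmetric positive definite m (the thread's own m is truncated above): `*` is elementwise multiplication in R, so `Q*D*R` is not a matrix product. With %*% the identity holds:

```r
# Inverse via eigendecomposition: m^(-1) = Q %*% diag(1/values) %*% t(Q).
set.seed(5)
m <- crossprod(matrix(rnorm(9), 3, 3)) + diag(3)  # made-up symmetric PD matrix
E <- eigen(m, symmetric = TRUE)
Q <- E$vectors               # already orthonormal, no normalization loop needed
D <- diag(1 / E$values)
P <- Q %*% D %*% t(Q)        # matrix products, not Q * D * t(Q)
max(abs(P - solve(m)))       # ~ 0
```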
2010 Jan 16
2
La.svd of a symmetric matrix
Dear R list users,
the singular value decomposition of a symmetric matrix M is UDV^(T), where U = V.
La.svd(M) gives as output three elements: the diagonal of D and the two orthogonal matrices u and vt (which is already the transpose of v).
I noticed that the transpose of vt is not exactly u. Why is that?
thank you for your attention and your help
Stefano
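A sketch of the likely explanation: singular values must be nonnegative, so for a symmetric M with a negative eigenvalue the corresponding columns of u and t(vt) differ by a sign, while the factorization itself stays exact. A minimal example:

```r
# For symmetric M with a negative eigenvalue, singular values are |lambda|,
# so u cannot equal t(vt) (that would force M to be positive semidefinite).
M <- diag(c(2, -1))
s <- La.svd(M)
max(abs(s$u %*% diag(s$d) %*% s$vt - M))  # ~ 0: factorization is exact
max(abs(s$u - t(s$vt)))                   # clearly nonzero: u != t(vt)
```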
2004 Aug 26
1
Why terms are dropping out of an lm() model
...ental data d, which has
two numeric predictors, p1 and p2, and one numeric response, r. The aim
is to compare polynomial models in p1 and p2 up to third order. I don't
understand why lm() doesn't return coefficients for the p1^3 and p2^3
terms. Similar loss of terms happened when I tried orthonormal
polynomials to third order.
I'm satisfied with the second-order regression, by the way, but I'd
still like to understand why the third-order regression doesn't work
like I'd expect.
Can anyone offer a pointer to help me understand this?
Here's what I'm seeing in R 1.9.1...
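One common cause, sketched with hypothetical data (the poster's d is not shown): if a predictor takes only 3 distinct values, a cubic term in it is exactly collinear with the lower-order terms, and lm() reports its coefficient as NA.

```r
# With p1 in {-1, 0, 1}, p1^3 equals p1 exactly, so the cubic column is
# aliased and lm() drops it (NA coefficient, rank deficiency).
set.seed(6)
d <- data.frame(p1 = rep(c(-1, 0, 1), each = 4))
d$r <- rnorm(nrow(d))                     # made-up response
fit <- lm(r ~ p1 + I(p1^2) + I(p1^3), data = d)
coef(fit)                                 # I(p1^3) comes back as NA
```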
2004 May 06
5
Orthogonal Polynomial Regression Parameter Estimation
Dear all,
Can anyone tell me how I can perform orthogonal
polynomial regression parameter estimation in R?
--------------------------------------------
Here is an "Orthogonal Polynomial" Regression problem
collected from Draper and Smith (1981), page 269. Note
that only the value of alpha0 (the intercept term) and the signs
of each estimate match the result obtained from
coef(orth.fit). What
2008 Jan 06
0
SVD least squares sub-space projection
...e first l columns of V, which gives an (l × l) matrix; I
know that I then have a sub-space (R^L) of the original (R^M) space. I
know that this sub-space basis is optimal in the least squares sense.
The question is: given one 3-dim space generated by 6 vectors (A is a
6×3 matrix) and a 2-dim orthonormal basis defined by taking the first 2
columns of V, how can I then project a new 3-dim vector onto this 2-dim
sub-space just defined?
Thanks in advance.
José Augusto M. de Andrade Jr.
Business Adm. Student
University of Sao Paulo - Brazil
2011 Jul 07
1
Polynomial fitting
Hello,
I'm fairly familiar with R and use it every now and then for math-related
tasks.
I have a simple non-polynomial function that I would like to approximate
with a polynomial. I already looked into poly, but was unable to understand
what to do with it. So my problem is this: I can generate virtually any
number of data points and would like to find the coefficients a1, a2, ... up to a
given
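A sketch of one way to do this with poly() (the target function f below is a made-up stand-in): raw = TRUE makes lm() return ordinary power-basis coefficients a1, a2, ...:

```r
# Least-squares polynomial approximation of a non-polynomial function.
f <- function(x) sin(x)                # hypothetical target function
x <- seq(-1, 1, length.out = 50)       # as many data points as desired
y <- f(x)
fit <- lm(y ~ poly(x, 5, raw = TRUE))  # degree-5 fit; coef(fit) gives a0..a5
max(abs(fitted(fit) - y))              # small on [-1, 1]
```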
2013 Oct 11
3
Gaussian Quadrature for arbitrary PDF
Hi all,
We know that the Hermite polynomials go with the
Gaussian distribution, the Laguerre polynomials with the Exponential
distribution, the Legendre polynomials with the uniform
distribution, and the Jacobi polynomials with the Beta distribution. Does anyone know
which kind of polynomial deals with the log-normal, Student's t, Inverse
Gamma and Fisher's F distributions?
Thank you in advance!
David
2010 Feb 08
3
Hypercube in R
Dear all,
Does anybody have an idea or suggestion on how to construct (plot) a
4-dimensional hypercube in R?
Thanks in advance for any pointers.
Regards, Andrej
2007 Jun 06
1
correspondence analysis
Hello,
I am new to R and I have a question about the difference between
correspondence analysis in R and SPSS.
This is the input table I am working with (4 products and 18 attributes):
> mytable
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18
1 15 11 20 4 14 7 1 2 1 4 12 12 17 19 11 20 9 10
2 19 18 14 14 16 4 14 11 11 15 22 19 22 16 21 19 15 16
3 16 13 10 9 15 4 10 7 11 13 18
2005 Mar 14
1
r: eviews and r // eigen analysis
hi all
I have a question about the eigen analysis found in R and in
EViews.
I used the same data set in the two packages and found different
answers. Which is incorrect?
the data is:
aa ( a correlation matrix)
1 0.9801 0.9801 0.9801 0.9801
0.9801 1 0.9801 0.9801 0.9801
0.9801 0.9801 1 0.9801 0.9801
0.9801 0.9801 0.9801 1 0.9801
0.9801 0.9801 0.9801 0.9801 1
now
> svd(aa)
$d
[1] 4.9204
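For this particular matrix the answer can be checked in closed form: a p × p equicorrelation matrix with off-diagonal rho has eigenvalues 1 + (p-1)*rho once and 1 - rho repeated p-1 times, so any correct package must reproduce these up to ordering. A sketch:

```r
# Eigenvalues of the 5 x 5 equicorrelation matrix with rho = 0.9801:
# 1 + 4*rho = 4.9204 and 1 - rho = 0.0199 (multiplicity 4).
rho <- 0.9801
aa <- matrix(rho, 5, 5); diag(aa) <- 1
ev <- eigen(aa, symmetric = TRUE)$values
round(ev, 4)   # 4.9204 0.0199 0.0199 0.0199 0.0199
```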
2013 Nov 28
2
Find the prediction or the fitted values for an lm model
Hi,
I would like to fit my data with a 4th-order polynomial. I have only
5 data points, so I should get a polynomial that passes exactly through the five
points. Then I would like to compute the "fitted" or "predicted" values on a
relatively large x dataset. How can I do it?
BTW, I thought the model "prodfn" should pass through (0,0), but I just
wonder why the const is
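A sketch of one way to do this (the y values below are made up): fit with lm() and poly(..., raw = TRUE), then call predict() on a denser grid. With 5 distinct x values a degree-4 fit interpolates exactly:

```r
# Exact degree-4 interpolation of 5 points, then prediction on a fine grid.
x <- 1:5
y <- c(0, 1, 0, 1, 0)                  # hypothetical 5 data points
fit <- lm(y ~ poly(x, 4, raw = TRUE))  # raw = TRUE: usual power-basis coefs
max(abs(fitted(fit) - y))              # ~ 0: passes through all five points

newx <- data.frame(x = seq(1, 5, by = 0.1))
pred <- predict(fit, newdata = newx)   # 41 predicted values on the fine grid
```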
2009 Sep 18
2
A stat related question
Can I ask a small stat-related question here?
Suppose I have two predictors for a time series process, and the accuracy of a
predictor is measured by its MSE. My question is: if two predictors give the same
MSE, do they necessarily have to be identical? Can anyone provide a
counterexample?
Thanks.
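A sketch of a counterexample (made-up numbers): equal MSE does not force equal predictors, since MSE is blind to the sign pattern of the errors.

```r
# Two different predictors with identical MSE.
y <- c(1, 2, 3, 4)           # hypothetical series values
p1 <- y + c(1, -1, 1, -1)    # errors +1, -1, +1, -1
p2 <- y + c(-1, 1, -1, 1)    # same error magnitudes, signs flipped
mean((y - p1)^2) == mean((y - p2)^2)  # TRUE, yet p1 != p2
```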