Displaying 20 results from an estimated 3000 matches similar to: "spurious regression in R"
2006 Nov 06
1
question about function "gls" in library "nlme"
Hi:
The gls call I used in my code is the following:
fm <- gls(y ~ x, correlation = corARMA(p = 2))
My question is how to extract the AR(2) parameters from "fm".
The object "fm" is the following. How can I extract the correlation parameters
Phi1 and Phi2 from "fm"? These two parameters are not in the "coef" component of "fm".
Thanks a
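A minimal sketch of one way to get at them, assuming fm was fitted with corARMA(p = 2) exactly as above (the accessor path is an nlme convention, not quoted from the post):
library(nlme)
# coef() on the fitted correlation structure returns the AR parameters on
# their natural scale when unconstrained = FALSE
phi <- coef(fm$modelStruct$corStruct, unconstrained = FALSE)
phi            # the AR(2) parameters reported as Phi1 and Phi2
intervals(fm)  # also lists the correlation parameters, with confidence limits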
2006 Feb 15
1
question about the results given by the Box.test?
Hello, I am using the Ljung-Box test in R to check
whether the residuals of my fitted model are random or not.
I am not sure, though, what the results mean. I have
looked at various sources on the internet and have
come up with contrasting explanations (mainly because
that information deals with different software packages, like
SAS, SPSS, etc.).
I know that my residuals should approximate white
noise (is
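A minimal sketch of the call and of how the p-value is usually read, assuming fit is the fitted model and 10 lags (both choices are illustrative):
res <- residuals(fit)
Box.test(res, lag = 10, type = "Ljung-Box")
# The null hypothesis is that the series is independently distributed (no
# autocorrelation up to the chosen lag). A small p-value is evidence of
# remaining autocorrelation; a large p-value means the test found no
# evidence against white-noise-like residuals.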
2004 Apr 17
3
Box-Ljung p-value -> Test for Independence
Hi all
I'm using the Box-Ljung test (from within R) to test if a time series is
independently distributed.
2 questions:
1) p-value returned by Box-Ljung:
If I want to test whether the time series is independent at, say, the 0.05
sig-level (it means that the prob of erroneously accepting that the
time series is independent is 0.05, right?)
--> then do I consider the time series as "independent"
2008 May 16
2
Box.test degrees of freedom
Dear colleagues,
I am new to R and statistics so please keep that in mind.
I have doubts about the df calculation of the Ljung-Box test (Box.test). The
function seems to always use df = lag = m and not df = m - p - q as suggested in the
Ljung and Box (1978) paper (which is referenced).
Do you agree with this? If so, is there an R package function that computes the
Ljung-Box test with the degrees of
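Box.test() does expose a fitdf argument for exactly this adjustment. A minimal sketch, where m is the chosen lag and p, q are the ARMA orders of the fitted model (all three are placeholders):
# fitdf is subtracted from lag, so the chi-squared reference distribution
# has m - (p + q) degrees of freedom
Box.test(residuals(fit), lag = m, type = "Ljung-Box", fitdf = p + q)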
2006 Jul 01
1
polynomial expansion in R
Hi:
I have two vectors of data, x and y, and I want to get the "polynomial" expansion of (x+y)^p for any integer power p in R. Suppose p=2; then I want a matrix of five vectors, namely x, y, x^2, y^2, and x*y. The coefficients of the polynomial are not needed. I can write it manually if p is small, but I want it for the case of p=10 or even bigger. Is there any function in R that can do that
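A minimal sketch of one way to do it with stats::poly(), which for two input vectors builds all monomials x^i * y^j with 1 <= i + j <= degree (raw = TRUE keeps them unorthogonalised; treat the column naming as an assumption about recent R versions):
p <- 10
M <- poly(x, y, degree = p, raw = TRUE)   # one column per monomial
head(colnames(M))   # names such as "1.0", "0.1", "1.1" give the powers of x and y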
2011 Aug 27
1
Degrees of freedom in the Ljung-Box test
Dear list members,
I have 982 quotations of a given stock index and I want to run a Ljung-Box
test on these data to test for autocorrelation. Later on I will estimate 8
coefficients.
I do not know how many degrees of freedom I should assume in the formula for the
Ljung-Box test. Could anyone tell me, please?
Below is the formula:
Box.test(x, lag = ????, type = c("Ljung-Box"), fitdf = 0)
2002 Mar 08
1
Matrix multiplication problem
Dear List,
I am having trouble with some R code I have written to perform
Redundancy Analysis (RDA) on a matrix of species abundance data (Y) and
a matrix of environmental data (X).
RDA is a constrained form of PCA and can be thought of as a PCA of the
fitted values of a regression of each variable in Y on all variables in
X.
For info, the first use of RDA is in:
Rao, C.R, 1964. The use and
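That description translates almost directly into code. A minimal sketch, assuming Y is a numeric matrix of species abundances and X a data frame of environmental variables (both hypothetical here):
Yc  <- scale(as.matrix(Y), scale = FALSE)    # centre each species column
fit <- lm(Yc ~ ., data = X)                  # regress every column of Y on all of X
rda <- prcomp(fitted(fit), center = FALSE)   # PCA of the fitted values = RDA axes
The vegan package's rda() function implements the same analysis directly, with the usual scalings.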
2009 Feb 24
1
Box.test reference correction (PR#13554)
Full_Name: Peter Solymos
Version: 2.8.1
OS: Windows
Submission from: (NULL) (129.128.141.92)
The help page of the Box.test function (stats) states that the Ljung-Box test
was published in:
Ljung, G. M. and Box, G. E. P. (1978), On a measure of lack of fit in time
series models. Biometrika 65, 553--564.
The page numbers are incorrect. The correct citation should be as follows:
Ljung, G. M.
2006 Nov 22
1
question about the "solve" function in library "Matrix"
Hi:
I have some problems when I use the "solve" function in a loop. In the following code, I have a diagonal matrix "ttt" whose elements change in every iteration of a loop. I defined a "dpoMatrix" class before the loop so I do not need to define this class every time in the loop. The reason is to save some computing time. The code is below. The inverse
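Without the full code it is hard to be specific, but a minimal sketch of the pattern being described, with purely illustrative objects (a fixed positive-definite matrix plus a diagonal that changes every iteration), is:
library(Matrix)
set.seed(1)
n <- 5
A <- crossprod(Matrix(rnorm(n * n), n))   # symmetric positive-definite Matrix, built once
inv <- vector("list", 3)
for (i in 1:3) {
  ttt      <- Diagonal(n, runif(n) + 1)   # diagonal whose entries change each pass
  inv[[i]] <- solve(A + ttt)              # solve() dispatches on the Matrix classes
}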
2011 May 08
1
Hosmer-Lemeshow 'goodness of fit'
I'm trying to do a Hosmer-Lemeshow 'goodness of fit' test on my logistic
regression model.
I found some code here:
http://sas-and-r.blogspot.com/2010/09/example-87-hosmer-and-lemeshow-goodness.html
The R code above is a little complicated for me, but I'm having trouble
with my answer:
Hosmer-Lemeshow: p=0.6163585
le Cessie and Houwelingen test (Design library): p=0.2843620
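For comparison, the Hosmer-Lemeshow statistic is also available pre-packaged. A minimal sketch, assuming model is the fitted glm(..., family = binomial) object (the ResourceSelection package is a suggestion of mine, not something from the linked post; the le Cessie-van Houwelingen test from the old Design library now lives in its successor, rms):
library(ResourceSelection)
hoslem.test(model$y, fitted(model), g = 10)   # observed 0/1 outcomes vs fitted probabilities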
2012 Jun 26
2
Ljung-Box test (Box.test)
I fit a simple linear model y = bX to a data set today, and that produced 24 residuals (I have 24 data points, one for each year from 1984-2007). I would like to test the time-independence of the residuals of my model, and my supervisor recommended the Ljung-Box test. The Box.test function in R takes 4 arguments:
x    a numeric vector or univariate time series.
lag  the statistic
2010 Jan 16
2
predict.glm
Hi,
See below; I am replying to your message at https://stat.ethz.ch/pipermail/r-help/2008-April/160966.html ("[R] predict.glm & newdata", posted on Fri Apr 4 21:02:24 CEST 2008).
You say it ## works fine, but it does not: if you look at the length of yhat2, you will find 100 and not 200 as expected. In fact, predict(reg1, data=x2) gives the same results as predict(reg1).
So I am still looking for
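The underlying issue is that predict.glm() has no 'data' argument, so data = x2 is silently absorbed by '...' and the original model frame is reused. A minimal sketch of the intended call, assuming x2 is a data frame whose column names match the model's covariates:
yhat2 <- predict(reg1, newdata = x2)
length(yhat2)   # one prediction per row of x2 (200 in the example above)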
2001 Apr 27
3
nls question
I have a question about passing arguments to the function f that nlm
minimizes.
I have no problems if I do this:
x <- seq(0, 1, .1)
y <- 1.1 * x + (1 - 1.1) + rnorm(length(x), 0, .1)
fn <- function(p) {
  yhat <- p * x + (1 - p)
  sum((y - yhat)^2)
}
out <- nlm(fn, p = 1.5, hessian = TRUE)
But I would like to define
fn <- function(x, y, p) {
  yhat <- p * x + (1 - p)
  sum((y - yhat)^2)
}
so
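nlm() forwards any extra named arguments to the objective function through its '...', provided the parameter vector stays as the function's first argument. A minimal sketch:
fn <- function(p, x, y) {   # parameter first, data passed explicitly
  yhat <- p * x + (1 - p)
  sum((y - yhat)^2)
}
out <- nlm(fn, p = 1.5, hessian = TRUE, x = x, y = y)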
2012 Mar 08
1
sas retain statement in R or fitting difference equations in NLS
I wish to fit a dynamical model in R and I am running into a problem that
requires some of your wisdom to solve. For SAS users: I am searching for
the equivalent of the retain statement.
For people who want to read a complicated explanation in order to help me:
I have a system of two equations written as difference equations here.
To boil it down: I have a dataframe with three variables y, X1, X2
2010 Feb 13
2
lm function in R
Hello,
I am trying to learn how to perform Multiple Regression Analysis in R. I
decided to take a simple example given in this PDF:
http://www.utdallas.edu/~herve/abdi-prc-pretty.pdf
I created a small CSV called students.csv that contains the following data:
Student id   Memory span (Y)   Age (X1)   Speech rate (X2)
s1           14                4          1
s2           23                4          2
s3           30                7          2
s4           50                7          4
s5           39                10         3
s6           67                10         6
Now
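With so few rows it is easiest to type the data in directly. A minimal sketch of the fit (the variable names are my own labels for the columns described above):
students <- data.frame(
  id = paste0("s", 1:6),
  Y  = c(14, 23, 30, 50, 39, 67),   # memory span
  X1 = c(4, 4, 7, 7, 10, 10),       # age
  X2 = c(1, 2, 2, 4, 3, 6)          # speech rate
)
fit <- lm(Y ~ X1 + X2, data = students)
summary(fit)   # coefficients, R-squared, etc.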
2006 Mar 08
1
Degrees of freedom using Box.test()
After an RSiteSearch("Box.test") I found some discussion regarding the degrees
of freedom in the computation of the Ljung-Box test using Box.test(), but did
not find any posting about the proper degrees of freedom.
Box.test() uses "lag=number" as the degrees of freedom. However, I believe
the correct degrees of freedom should be "number-p-q" where p and q are
2008 Sep 16
2
Hosmer- Lemeshow test
Dear R - help,
I am working on a credit scorecard model. I am using logistic regression to estimate
the model's regression coefficients.
I want to use the Hosmer-Lemeshow test.
In order to understand the use of the R language, I referred to the following URL:
http://www.stat.sc.edu/~hitchcock/diseaseoutbreakRexample704.txt
The related data 'diseaseoutbreak' is available
2007 Apr 25
1
Box Ljung Statistics
Hi all R experts,
I came across the statistic mentioned below in the paper "Stock Index Volatility
Forecasting with High Frequency Data"
by Eugenie Hol, Siem Jan Koopman
http://ideas.repec.org/p/dgr/uvatin/20020068.html
I would like to ask: what is the "Box-Ljung portmanteau statistic based
on N squared autocorrelations"?
Is it the same as the "Box-Ljung statistic" of stats
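If, as the wording suggests, it is the Ljung-Box statistic applied to the squared series (a standard check for remaining ARCH-type dependence), then it is the same stats function, just called on the squares. A minimal sketch, with ret as the return series and 20 lags both being illustrative choices:
Box.test(ret^2, lag = 20, type = "Ljung-Box")   # portmanteau test on squared returns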
2006 Apr 01
1
Nested error structure in nonlinear model
I am trying to fit a nonlinear regression model to data. There are
several predictor variables and 8 parameters. I will write the model as
Y ~ Yhat(theta1,...,theta8)
OK, I can do this using nls() - but "only just" as there are not as many
observations as might be desired.
Now the problem is that we have a factor "Site" and I want to include a
corresponding error
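One way to add a Site-level error term is nlme() from the nlme package, putting Site in the grouping of the random part. A minimal sketch, with a hypothetical two-parameter model standing in for the real eight-parameter Yhat and dat as the data frame:
library(nlme)
fit <- nlme(Y ~ a * exp(-b * x),      # placeholder model form
            data   = dat,             # assumed to contain Y, x and Site
            fixed  = a + b ~ 1,
            random = a ~ 1 | Site,    # Site-specific perturbation of 'a'
            start  = c(a = 1, b = 0.1))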
2013 Apr 23
1
Hosmer Lemeshow test
Hi to everybody. I use the following routine (I found it on the internet)
to compute the Hosmer-Lemeshow test in the framework of logistic regression.
hosmerlemeshow = function(obj, g = 10) {
  # first, check to see if we fed in the right kind of object
  stopifnot(family(obj)$family == "binomial" && family(obj)$link == "logit")
  y = obj$model[[1]]
  # the double bracket