Displaying 14 results from an estimated 14 matches for "lordpreetam".
2013 Apr 25
2
Selecting and then joining data blocks
Hi all,
I have 4 matrices, each having 5 columns and 4 rows, denoted by
B1, B2, B3, B4.
I have generated a vector of 7 indices, say (1, 2, 4, 3, 2, 3, 1), which gives
the indices of the matrices to be chosen and then appended one on top of
the next; in this case, I wish to have the following mega matrix:
B1 over B2 over B4 over B3 over B2 over B3 over B1.
1> How can I achieve this?
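One way to do this in R, assuming B1, B2, B3 and B4 already exist, is to put the
matrices in a list and index the list with the chosen vector before rbind-ing:

blocks <- list(B1, B2, B3, B4)         # collect the four 4 x 5 matrices
idx    <- c(1, 2, 4, 3, 2, 3, 1)       # the vector of block indices
mega   <- do.call(rbind, blocks[idx])  # 28 x 5 result: B1, B2, B4, B3, B2, B3, B1 stacked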
2013 Apr 25
1
Bootstrapping in R
Hi all,
1> I have 3 vectors a, b and c, each of length 25. I want to define a
new data frame z such that z[1, ] = (a[1], b[1], c[1]), z[2, ] = (a[2], b[2], c[2]),
and so on. How do I do it in R?
2> Then I want to draw bootstrap samples from z.
Kindly suggest how I can do this in R.
Thanks,
Preetam
--
Preetam Pal
(+91)-9432212774
M-Stat 2nd Year,
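A minimal sketch, assuming a, b and c are the three length-25 vectors:

z <- data.frame(a, b, c)                        # row i of z is (a[i], b[i], c[i])
z_boot <- z[sample(nrow(z), replace = TRUE), ]  # one bootstrap sample of the rows

# or, for many replicates of a statistic (here the column means), use the boot package
library(boot)
boot(z, statistic = function(d, i) colMeans(d[i, ]), R = 1000)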
2013 Apr 30
1
ADF test --time series
Hi all,
I was running the ADF test in R.
CODE 1:
adf.test(data$LOSS)
Augmented Dickey-Fuller Test
data: data$LOSS
Dickey-Fuller = -1.9864, Lag order = 2, p-value = 0.5775
alternative hypothesis: stationary
CODE 2:
adf.test(diff(diff(data$LOSS)))
Augmented Dickey-Fuller Test
data: diff(diff(data$LOSS))
Dickey-Fuller = -6.9287, Lag order = 2, p-value = 0.01
alternative hypothesis: stationary
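The calls above are from the tseries package. A compact version of the same check
(the single-difference step is an added suggestion, not part of the original post):

library(tseries)
adf.test(data$LOSS)              # p = 0.58 above: the unit-root null is not rejected
adf.test(diff(data$LOSS))        # worth checking one difference before two
adf.test(diff(diff(data$LOSS)))  # p = 0.01 above: the twice-differenced series looks stationary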
2013 Apr 26
1
Regression coefficients
Hi all,
I have run a ridge regression as follows:
reg=lm.ridge(final$l~final$lag1+final$lag2+final$g+final$g+final$u,
lambda=seq(0,10,0.01))
Then I enter select(reg), and it returns:
modified HKB estimator is 19.3409
modified L-W estimator is 36.18617
smallest value of GCV at 10
I think it means that it is
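Assuming reg comes from MASS::lm.ridge as above, the coefficients at the
GCV-minimising penalty can be pulled out like this (a sketch; note that the reported
minimum, lambda = 10, sits at the end of seq(0, 10, 0.01), so the grid may need widening):

library(MASS)
best <- which.min(reg$GCV)   # index of the lambda with smallest GCV
reg$lambda[best]             # the lambda that select(reg) reports
coef(reg)[best, ]            # coefficients (on the original scale) at that lambda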
2013 May 02
2
saving a matrix
Hi all,
In my data analysis,
I have created a random matrix M (of order 500 x 7).
I want to use the same matrix when I start a new session, or suppose I want
to send this matrix to one of my friends (because this matrix is randomly
generated, and I don't want to use any other 500 x 7 matrix randomly generated
by R).
How can I save this matrix and load it again in later sessions?
Appreciate
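Two standard options, assuming the matrix is called M:

saveRDS(M, "M.rds")          # write the single object to a file ...
M <- readRDS("M.rds")        # ... and read it back in a later session

save(M, file = "M.RData")    # alternative: save/load restores M under its own name
load("M.RData")

write.csv(M, "M.csv", row.names = FALSE)  # plain-text copy, convenient for emailing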
2013 May 02
2
ARMA with other regressor variables
Hi,
I want to fit the following model to my data:
Y_t = a + b*Y_(t-1) + c*Y_(t-2) + Z_t + Z_(t-1) + Z_(t-2) + X_t + M_t,
i.e. it is an ARMA(2,2) with some additional regressors X and M.
[The Z_t's are the white noise variables.]
How do I find the estimates of the coefficients in R?
I would also like to know what technique R employs to find the
estimates.
Any help is appreciated.
Thanks,
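A hedged sketch with stats::arima, assuming Y, X and M are numeric vectors of the same
length (arima estimates the MA coefficients freely rather than fixing them at 1, which
is the usual reading of such a model):

fit <- arima(Y, order = c(2, 0, 2), xreg = cbind(X, M))
fit    # AR, MA and regression coefficients with standard errors

# arima() maximises the Gaussian likelihood via a state-space representation and the
# Kalman filter, which answers the question about the estimation technique.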
2013 Apr 29
1
Arma - estimate of variance of white noise variables
Hi all,
Suppose I am fitting an arma(p,q) model to a time series y_t.
So, my model should contain (q+1) white noise variables.
As far as I know, each of them should have the same variance.
How do I get the estimate of this variance by running the arma(y) function
(or is there any other way)?
Appreciate your help.
Thanks,
Preetam
--
Preetam Pal
(+91)-9432212774
M-Stat 2nd Year,
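With stats::arima (an alternative to tseries::arma), the innovation variance is
returned directly; a minimal sketch, assuming y is the series and p, q the orders:

fit <- arima(y, order = c(p, 0, q))
fit$sigma2        # estimated white-noise variance ("sigma^2 estimated as ..." in print(fit))
sqrt(fit$sigma2)  # corresponding standard deviation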
2013 Apr 30
0
Ridge regression
Hi all,
I have run a ridge regression on a data set 'final' as follows:
reg=lm.ridge(final$l~final$lag1+final$lag2+final$g+final$u,
lambda=seq(0,10,0.01))
Then I enter select(reg), and it returns:
modified HKB estimator is 19.3409
modified L-W estimator is 36.18617
smallest value of GCV at 10
I think it
2013 May 03
1
Likelihood
Hi all,
I have run a regression and want to calculate the likelihood of obtaining
the sample.
Is there a way in which I can use R to get this likelihood value?
Appreciate your help on this.
The following are the details:
raw_ols1=lm(data$LOSS~data$GDP+data$HPI+data$UE)
summary(raw_ols1)
Call:
lm(formula = data$LOSS ~ data$GDP + data$HPI + data$UE)
Residuals:
Min 1Q
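For an lm fit, the log-likelihood is available directly; using the model from the post:

logLik(raw_ols1)        # log-likelihood of the sample under the fitted model
exp(logLik(raw_ols1))   # the likelihood itself (typically an extremely small number)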
2013 May 09
0
ARMA(p,q) prediction with pre-determined coefficients
I have the following time series model for prediction purposes
Loss_t = b1*Loss_(t-1) + b2*GDP_t + b3*W_(t-1), where W_t is the
usual white noise variable.
So this is similar to an ARMA(1,1), except that it also contains an extra
predictor, GDP at time t.
I have only 20 observations on each variable, except GDP, for which I have
values up to time 100.
And most importantly, I have also calculated
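With the coefficients already fixed, the forecasts can be built by direct recursion; a
sketch assuming b1, b2, b3 are the pre-determined values, loss holds the 20 observations,
gdp the 100 known values, and unobserved white-noise terms are set to their mean of 0:

n    <- length(loss)            # 20 observed periods
h    <- length(gdp) - n         # 80 steps ahead
pred <- numeric(h)
prev <- loss[n]
w    <- 0                       # last residual if known, otherwise 0
for (t in seq_len(h)) {
  pred[t] <- b1 * prev + b2 * gdp[n + t] + b3 * w
  prev    <- pred[t]
  w       <- 0                  # E[W_t] = 0 beyond the sample
}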
2016 Apr 30
1
Declaring All Variables as Factors in GLM()
Hi guys,
I am running glm(y ~ ., data = history, family = binomial), essentially logistic
regression for credit scoring (y = 0 or 1). The dataset 'history' has 14
variables; a few examples:
history <- read.csv("history.csv", header = TRUE)
1> 'income' = 100, 200, 300 (these are numbers in my dataset; however, the
interpretation is that these are just tags or labels, for every
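A minimal sketch of declaring selected columns (or all predictors) as factors before the
glm() call; cat_vars below is a hypothetical list of the label-like columns, with 'income'
taken from the post:

history <- read.csv("history.csv", header = TRUE)

cat_vars <- c("income")                                  # add the other label-like columns here
history[cat_vars] <- lapply(history[cat_vars], factor)

# or turn every predictor into a factor at once:
# history[setdiff(names(history), "y")] <- lapply(history[setdiff(names(history), "y")], factor)

fit <- glm(y ~ ., data = history, family = binomial)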
2013 Apr 27
1
Selecting ridge regression coefficients for minimum GCV
Hi all,
I have run a ridge regression as follows:
reg=lm.ridge(final$l~final$lag1+final$lag2+final$g+final$u,
lambda=seq(0,10,0.01))
Then I enter select(reg), and it returns:
modified HKB estimator is 19.3409
modified L-W estimator is 36.18617
smallest value of GCV at 10
I think it means that it is advisable to
2013 May 02
1
warnings in ARMA with other regressor variables
Hi all,
I want to fit the following model to my data:
Y_t = a + b*Y_(t-1) + c*Y_(t-2) + Z_t + Z_(t-1) + Z_(t-2) + X_t + M_t,
i.e. it is an ARMA(2,2) with some additional regressors X and M.
[The Z_t's are the white noise variables.]
So, I run the following code:
for (i in 1:rep) {
  index <- sample(4, 15, replace = TRUE)
  final <- do.call(rbind, lapply(index, function(i)
2013 May 04
2
Lasso Regression error
Hi all,
I have a data set containing variables LOSS, GDP, HPI and UE.
(I have attached it in case it is required).
Having renamed the variables as l, g, h and u, I wish to run a lasso
regression with l as the dependent variable and the other 3 as the
independent variables.
data=read.table("data.txt", header=T)
l=data$LOSS
h=data$HPI
u=data$UE
g=data$GDP
matrix=data.frame(l,g,h,u)
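One common way to run the lasso on these variables is the glmnet package; a sketch using
the objects built above (note that calling the data frame 'matrix' shadows base::matrix,
which is worth avoiding):

library(glmnet)
x  <- as.matrix(data.frame(g, h, u))   # glmnet needs a numeric predictor matrix
cv <- cv.glmnet(x, l, alpha = 1)       # alpha = 1 is the lasso; CV chooses the penalty
coef(cv, s = "lambda.min")             # coefficients at the CV-selected lambda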