Hi all, hope someone can help!
I've been searching R and some of the list archives in order to find out how
one can consistently estimate the degrees of freedom of the following
random variable:
Y = a*T(v) + b
where a and b are constants and T(v) has a Student's t distribution with v degrees of freedom.
I found one posting from 2001, but no answer!
I know that this problem can easily be solved numerically using MLE,
but I am interested in the method-of-moments estimates.
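For concreteness, the standard moment conditions for this model (valid assuming v > 4, so that the fourth moment exists) are

  E[Y]         = b
  Var(Y)       = a^2 * v/(v-2)
  excess kurt. = 6/(v-4)

so, writing ybar, s^2 and k for the sample mean, variance and excess kurtosis, the moment estimates are

  v_hat = 4 + 6/k
  a_hat = sqrt(s^2 * (v_hat - 2)/v_hat)
  b_hat = ybar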
I then simulated 10 000 times from the distribution in order to evaluate the efficiency of these estimators.
The parameters used are:
a = 0.1
b = 0
v = 9.5
and the simulation results are:
  mean(a_hat) = 0.10017      sd(a_hat) = 0.0016
  mean(b_hat) = 6.4202e-06   sd(b_hat) = 1.1334e-03
  mean(v_hat) = 9.780        sd(v_hat) = 1.1681
The sample size for each simulated dataset is also 10 000!
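In case it helps, here is a minimal R sketch of the study (mom.est is just my name for a function implementing the kurtosis-based moment equations above, so it assumes v > 4; the run takes a while at these sizes):

mom.est <- function(y) {
  ybar <- mean(y)
  s2   <- var(y)
  # sample excess kurtosis
  k <- mean((y - ybar)^4) / mean((y - ybar)^2)^2 - 3
  v <- 4 + 6/k                  # from excess kurtosis = 6/(v-4)
  a <- sqrt(s2 * (v - 2)/v)     # from Var(Y) = a^2 * v/(v-2)
  c(a = a, b = ybar, v = v)
}

set.seed(1)
n    <- 10000                   # sample size per replication
nsim <- 10000                   # number of replications
est  <- t(replicate(nsim, mom.est(0.1 * rt(n, df = 9.5) + 0)))
apply(est, 2, mean)
apply(est, 2, sd)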
From the above results it seems as if "a" and "b" are being estimated
consistently, but "v" is not (i.e. the mean of the parameter
estimates is not equal to 9.5)! I know that a confidence interval about
the estimate does contain 9.5, but the sd of the "v" estimate seems too
big!
QUESTION:
#########################################################
Is this because the number of simulations is too small, so that the
asymptotic results for "v" do not yet hold?
I will run the simulation 100 000 times and report the results later.
Hopefully someone can shed some light on this problem.
/
allan