Dear All,

I am working with a skewed-t copula in my research, so I needed to write an MLE procedure instead of using a standard fitting routine; I stick to the sn package. On subsamples of the full population that I deal with, everything is fine. However, on the total sample (the difference being the cross-sectional dimension: 30 vs 240) things go wrong - the objective function diverges to infinity. I located the "rotten" line to be

    t1 <- dmst(vector, mu, P, alpha, nu)

where "vector" is the matrix row on which I evaluate my likelihood and the rest is parametrized in the standard way, just as the help pages give it. In large dimensions I get a zero value of the density (which is probably due to numerical underflow). I tried the following dummy example

    t1 <- rmst(1, mu, P, alpha, nu)
    t2 <- dmst(t1, mu, P, alpha, nu)

and t2 remains zero. Can anyone help me with this one?

thanks in advance,
Konrad

--
"We are what we pretend to be, so we must be careful about what we pretend to be"
Kurt Vonnegut Jr., "Mother Night"
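A minimal sketch of the setting described above; the actual parameter values are not given in the message, so the ones below (identity scale matrix, unit skewness, nu = 5) are purely illustrative assumptions:

    library(sn)

    d     <- 240                      # cross-sectional dimension where the problem appears
    mu    <- rep(0, d)                # location vector
    P     <- diag(d)                  # scale matrix (identity, for illustration only)
    alpha <- rep(1, d)                # skewness vector
    nu    <- 5                        # degrees of freedom

    set.seed(1)
    t1 <- rmst(1, mu, P, alpha, nu)            # one draw from the 240-dimensional skew-t
    dmst(t1, mu, P, alpha, nu)                 # density on the ordinary scale: vanishingly
                                               # small, possibly exactly 0 after underflow
    dmst(t1, mu, P, alpha, nu, log = TRUE)     # the log-density stays finite and usable

Depending on the sn version and the actual parameters, the first dmst call may or may not hit exact zero, but the log-scale call avoids the underflow either way.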
Try maximizing the log-likelihood and using the log = TRUE argument to dmst.

(You have told us so little about what you are doing that we can but guess at what you mean by "write an mle procedure": what is wrong with st.mle, for example?)

On Tue, 28 Mar 2006, Konrad Banachewicz wrote:
> [original message quoted in full above]

--
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel: +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK, Fax: +44 1865 272595
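A sketch of what the suggestion above could look like in this setting. The data matrix X, the choice to keep P, alpha and nu fixed and optimise only the location vector, and the use of optim are illustrative assumptions, not part of either post:

    library(sn)

    ## negative log-likelihood of an n x d data matrix X under the multivariate
    ## skew-t: sum the log-densities instead of taking the log of a product
    negloglik <- function(mu, X, P, alpha, nu) {
      -sum(dmst(X, mu, P, alpha, nu, log = TRUE))
    }

    ## hypothetical usage, assuming X, P, alpha, nu are already defined:
    ## fit <- optim(colMeans(X), negloglik, X = X, P = P, alpha = alpha, nu = nu,
    ##              method = "BFGS")

Working entirely on the log scale keeps the per-observation contributions finite, so the 240-dimensional case no longer collapses to log(0) = -Inf.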
On Tue, 28 Mar 2006 11:41:19 +0200, Konrad Banachewicz wrote:
> [original message quoted in full above]

Please supply the ingredients needed to reproduce the problem that you have faced (including, among other things, the values of the parameters mu, P, alpha, nu).

best wishes,
Adelchi Azzalini
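A rough template of such a reproducible report, in case it is useful; every value below is a placeholder to be replaced by the actual ingredients that trigger the zero density:

    library(sn)
    set.seed(1)                       # fix the random draw so others see the same numbers

    d     <- 240                      # actual dimension
    mu    <- rep(0, d)                # actual location
    P     <- diag(d)                  # actual scale matrix (dput(P) for the real one)
    alpha <- rep(1, d)                # actual skewness
    nu    <- 5                        # actual degrees of freedom

    x <- rmst(1, mu, P, alpha, nu)
    dmst(x, mu, P, alpha, nu)         # the value reported as zero
    sessionInfo()                     # R and sn versions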