Displaying 8 results from an estimated 8 matches for "reestimated".
2005 Jan 20
3
Constructing Matrices
Dear List:
I am working to construct a matrix of a particular form. For the most
part, developing the matrix is simple and is built as follows:
vl.mat <- matrix(c(0,0,0,0, 0,64,0,0, 0,0,64,0, 0,0,0,64), ncol = 4)
Now to expand this matrix to be block-diagonal, I do the following:
sample.size <- 100              # number of individual students
I <- diag(sample.size)          # identity matrix of that size
bd.mat <- kronecker(I, vl.mat)  # one vl.mat block per student on the diagonal
This
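A minimal, purely illustrative sketch (not from the original post): checking the construction on a small case, plus a sparse alternative via Matrix::bdiag() that avoids storing the mostly-zero dense result when sample.size grows.
vl.mat <- matrix(c(0,0,0,0, 0,64,0,0, 0,0,64,0, 0,0,0,64), ncol = 4)
small <- kronecker(diag(2), vl.mat)   # two copies of vl.mat on the diagonal
dim(small)                            # 8 x 8
library(Matrix)
bd.sparse <- bdiag(replicate(100, vl.mat, simplify = FALSE))
dim(bd.sparse)                        # 400 x 400, stored sparsely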
2009 Jun 07
1
One rather theoretical question about fitting algorithm
Hi,
What I'm trying to achieve is a very fast algorithm for fitting a logistic
regression model. I have to estimate regression coefficients using
about 10k observations. Once I have the coefficients estimated, 100 new
rows of data become available.... Now I need to reestimate the
coefficients, adding the 100 newly arrived observations and removing the
100 oldest ones.
So, my question is would it be
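A minimal sketch of the rolling-window refit described above, with made-up data and names ('dat', 'fit'); warm-starting glm() at the previous coefficients via its start argument usually cuts the number of IRLS iterations.
window <- 10000
dat <- data.frame(x1 = rnorm(10100), x2 = rnorm(10100))
dat$y <- rbinom(10100, 1, plogis(0.5 * dat$x1 - 0.3 * dat$x2))
fit <- glm(y ~ x1 + x2, family = binomial, data = dat[1:window, ])
# 100 new rows arrive: slide the window and refit, starting from the old estimates
fit2 <- glm(y ~ x1 + x2, family = binomial,
            data = dat[101:(window + 100), ], start = coef(fit))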
2004 Dec 29
3
gls model and matrix operations
Dear List:
I am estimating a GLS model and am having to make some rather unconventional modifications to handle a particular problem I have identified. My aim is to fit a GLS with an AR(1) structure, obtain the variance-covariance matrix (V), modify it as needed given my research problem, and then reestimate the GLS by brute force using matrix operations. All seems to be working almost perfectly,
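A minimal sketch of the "brute force" GLS step described, assuming a design matrix X, response y, and a (possibly modified) error covariance matrix V; the AR(1) toy structure and names are illustrative, not the poster's actual code.
n <- 50
X <- cbind(1, rnorm(n))
rho <- 0.6
V <- rho^abs(outer(1:n, 1:n, "-"))             # AR(1) covariance structure
y <- X %*% c(2, 1) + t(chol(V)) %*% rnorm(n)
Vinv <- solve(V)
beta <- solve(t(X) %*% Vinv %*% X, t(X) %*% Vinv %*% y)  # GLS coefficients
vcov.beta <- solve(t(X) %*% Vinv %*% X)                  # their covariance matrix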
2005 May 25
1
[Fwd: Re: [Fwd: failure delivery]]
I appear to have hit one of the "drop" issues raised in some discussions
a couple of years ago by Frank Harrell. They don't seem to have been
fixed, and I'm under some pressure to get a quick solution for a
forecasting task I'm doing.
I have been modelling some retail sales data, and the days just after
Thanksgiving (US version!) are important. So I created some dummy
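A minimal sketch of one common way to build such post-Thanksgiving dummy variables, assuming daily data in a data frame with a Date column; the names and the two-day window are illustrative, not the poster's actual setup.
sales <- data.frame(day = seq(as.Date("2004-11-01"), as.Date("2004-12-31"), by = "day"))
thanksgiving <- function(year) {    # fourth Thursday of November
  nov <- seq(as.Date(paste0(year, "-11-01")), by = "day", length.out = 30)
  nov[format(nov, "%u") == "4"][4]  # %u: ISO weekday, Thursday == 4
}
tg <- thanksgiving(2004)
sales$post.tg <- as.integer(sales$day %in% (tg + 1:2))   # the two days after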
2011 Oct 01
7
Poor performance of "Optim"
I have been considering using R and optim() to replace my commercial packages,
Gauss and Matlab, but it turns out that optim() does not converge
completely. The same data converge very well in Gauss and Matlab. I see
that many packages are built on optim() and really doubt whether they can
be trusted!
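A minimal, illustrative sketch of settings that often help when optim() seems not to converge: a quasi-Newton method, tighter tolerances, and a larger iteration budget. The toy Rosenbrock objective stands in for the poster's actual problem.
rosenbrock <- function(p) (1 - p[1])^2 + 100 * (p[2] - p[1]^2)^2
fit <- optim(c(-1.2, 1), rosenbrock, method = "BFGS",
             control = list(maxit = 1000, reltol = 1e-12))
fit$convergence   # 0 indicates successful convergence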
2010 Nov 17
0
Reference classes: opinion on OOP design
...d dispatch etc.) as possible. I do want to have
the choice of whether I carry (possibly loads of) data with me in the fields of
my object or whether I compute/get it based on some function whenever I
actually need it. For example, I'm thinking about parameter estimates that
could automatically be reestimated based on rules that take into account a
constantly changing data structure (new observations come in quite
frequently, or something like that).
What do you think of the way I've implemented this? Does this make sense to
you or is something like this done in another (and probably more elegant
;...
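A minimal sketch of the kind of design being asked about, assuming the goal is a reference class whose estimate is recomputed on demand from whatever data is currently stored; the class, field, and method names are illustrative.
Estimator <- setRefClass("Estimator",
  fields = list(data = "numeric"),
  methods = list(
    add = function(x) {
      data <<- c(data, x)   # new observations arrive over time
    },
    estimate = function() {
      mean(data)            # recomputed from the current data on each call
    }
  )
)
e <- Estimator$new(data = rnorm(100))
e$add(rnorm(10))
e$estimate()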
2010 Nov 17
0
WG: Reference classes: opinion on OOP design
...ethod dispatch etc.) as possible. I do want to have
the choice of whether I carry (possibly loads of) data with me in the fields of
my object or whether I compute/get it based on some function whenever I
actually need it. For example, I'm thinking about parameter estimates that
could automatically be reestimated based on rules that take into account a
constantly changing data structure (new observations come in quite
frequently, or something like that).
What do you think of the way I've implemented this? Does this make sense to
you or is something like this done in another (and probably more elegant
;-))...
2004 Sep 05
1
Question to NLME, ML vs. REML
Dear all,
I am planning to use the nlme library for the analysis of experiments in the
semiconductor industry. Currently I am using "lm" but plan to move to "lme"
to handle within-wafer / wafer-to-wafer and lot-to-lot variation correctly.
So far everything is working well, but I have a fundamental question:
NLME offers "maximum likelihood" and "restricted maximum
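A minimal sketch of the ML vs. REML choice in question, using the Orthodont data shipped with nlme; the grouping structure is illustrative, not the wafer/lot structure from the post.
library(nlme)
fit.reml <- lme(distance ~ age, random = ~ 1 | Subject, data = Orthodont,
                method = "REML")  # the default; less biased variance components
fit.ml   <- lme(distance ~ age, random = ~ 1 | Subject, data = Orthodont,
                method = "ML")    # use when comparing models that differ in
                                  # their fixed effects via anova()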