I looked at your data:
> table(x, cluster)
   cluster
x     1  2  3  4  5  6
  0   0 48  0 48 48  0
  1  48  0 48  0  0 48
Your covariate "x" is perfectly predicted by the cluster variable.
If you fit a fixed effects model:
coxph(Surv(time, event) ~ factor(cluster) + x)
then the "x" variable is declared redundant. The same thing happens in the
gamma frailty model when the variance of the random effect is sufficiently
large: your model approaches this limit, and the solution fails. As
mentioned in the manual page, the coxme function is now preferred.
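To see the redundancy concretely, here is a minimal sketch with simulated
stand-in data of the same shape (6 clusters of 48, with "x" constant within
each cluster); the effect size and censoring scheme are made up purely for
illustration:

```r
library(survival)

set.seed(1)
cluster <- rep(1:6, each = 48)
x <- as.integer(cluster %in% c(1, 3, 6))  # x perfectly determined by cluster
time  <- rexp(288, rate = exp(0.5 * x))   # hypothetical event times
event <- rbinom(288, 1, 0.9)              # light random censoring

# With the cluster indicators in the model, the column for x is linearly
# dependent on them, so coxph() drops it with a "singular" warning and
# reports an NA coefficient for x.
fit <- coxph(Surv(time, event) ~ factor(cluster) + x)
coef(fit)["x"]
```

The NA coefficient is the fixed-effects analogue of what the gamma frailty
fit runs into as the frailty variance grows.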
Last, your particular error message is caused by an invalid value for
"sparse". I'll add a check to the program. You likely want "sparse=10"
to force non-sparse computation.
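As a sketch of that fix, the call below uses sparse=10: with only 6 clusters
(fewer than the cutoff of 10) the sparse approximation is switched off. The
data here are simulated stand-ins in which "x" varies within clusters, so
the fit is not confounded and can actually converge:

```r
library(survival)

set.seed(2)
cluster <- rep(1:6, each = 48)
x <- rbinom(288, 1, 0.5)                # hypothetical covariate, varies within clusters
time  <- rexp(288, rate = exp(0.3 * x))
event <- rbinom(288, 1, 0.9)
d <- data.frame(time, event, x, cluster)

# sparse = 10: with 6 clusters (< 10), the frailty terms are coded densely.
mod <- coxph(Surv(time, event) ~ x + frailty.gamma(cluster, sparse = 10),
             data = d)
coef(mod)["x"]
```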
Terry Therneau
On 12/04/2012 05:00 AM, r-help-request at r-project.org wrote:
> Dear all,
>
> I have a data set <http://yaap.it/paste/c11b9fdcfd68d02b#gIVtLrrme3MaiQd9hHy1zcTjRq7VsVQ8eAZ2fol1lUc=> with
> 6 clusters, each containing 48 (possibly censored, in which case
> "event = 0") survival times. The "x" column contains a binary
> explanatory variable. I try to describe that data with a gamma frailty
> model as follows:
>
> library(survival)
>
> mod <- coxph(Surv(time, event) ~
>     x + frailty.gamma(cluster, eps = 1e-10, method = "em", sparse = 0),
>     outer.max = 1000, iter.max = 10000,
>     data = data)
>
> Here is the error message:
>
> Error in if (history[2, 3]< (history[1, 3] + 1)) theta<-
> mean(history[1:2, :
> missing value where TRUE/FALSE needed
>
> Does anyone have an idea on how to debug?
>
> Yours sincerely,
> Marco