Displaying 20 results from an estimated 38 matches for "h_0".
2008 Aug 06
1
Variance-covariance matrix for parameter estimates
Dear All,
I am currently working with the coxph function within the package survival.
I have the model h_ij = h_0(t) exp(b1x1 + b2x2) where the indicator
variables
are as follows:
x1 x2
VPS 0 0
LTG 1 0
TPM 0 1
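Since the subject asks about the variance-covariance matrix of the coxph estimates, a minimal sketch (the data frame `mydata` and the `time`/`status` columns are assumed for illustration, not taken from the post):

```r
library(survival)
# Hypothetical fit of the model h_ij(t) = h_0(t) exp(b1*x1 + b2*x2)
fit <- coxph(Surv(time, status) ~ x1 + x2, data = mydata)
vcov(fit)               # variance-covariance matrix of (b1, b2)
sqrt(diag(vcov(fit)))   # standard errors, matching summary(fit)
```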
2002 Aug 02
1
Cox regression
Hi!
I would like to do Cox regression using the available routines in
the survival package BUT I want to use an arbitrary link function, i.e.
want to use the model
h(t)=h_0(t)r(beta'z)
with arbitrary function r, instead of
h(t)=h_0(t)exp(beta'z)
Grateful for any comment on this,
Dragi
-----------------------------------------------------------
Dragi Anevski, PhD
Mathematical Statistics Chalmers University of Technology
Göteborg University SE-412 9...
2007 Aug 05
0
null hypothesis for two-way anova
Dear R community,
Confused by some of my lab results I ask for the definition of the null
hypothesis of a two-way analysis of variance in R (anova() and aov()).
Starting with the following model
y = a_i + b_j , i in A and j in B
is the tested null hypothesis
H_0: a_i = 0 for all i in A
or
H_0: a_m = a_n for any m and n in A?
Consequently the same questions for interaction effects. Starting with
the model
y = a_i + b_j + f_ij , i in A and j in B
is the tested null hypothesis
H_0: f_ij = 0 for all i in A and j in B
or
H_0: f_ij = f_mn for any i and...
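Under the identifiability constraints R imposes on factor effects, the two formulations are equivalent: a_i = 0 for all i is the same as a_m = a_n for all m and n. A small simulation (all names illustrative) shows what anova()/aov() report:

```r
set.seed(1)
A <- gl(3, 20)        # factor with levels in A
B <- gl(2, 10, 60)    # factor with levels in B
y <- rnorm(60)        # data generated under the null: no A, B, or A:B effect
summary(aov(y ~ A * B))
# The F test for A tests H_0: all a_i equal (equivalently, all zero under the
# constraint); the A:B line tests H_0: all interaction terms f_ij are zero.
```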
2003 Aug 28
2
ks.test()
Dear All
I am trying to replicate a numerical application (not computed on R) from an
article. Using, ks.test() I computed the exact D value shown in the article
but the p-values I obtain are quite different from the one shown in the
article.
The tests are performed on a sample of 37 values (please see "[0] DATA"
below) for truncated Exponential, Pareto and truncated LogNormal
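One common source of such discrepancies is the choice between exact and asymptotic p-values, and, more importantly, whether the distribution parameters were estimated from the same data (in which case the standard ks.test p-values are not valid). A sketch with an illustrative sample, not the article's data:

```r
set.seed(1)
x <- rexp(37)                                 # illustrative sample of size 37
ks.test(x, "pexp", rate = 1, exact = TRUE)    # exact p-value for small n
ks.test(x, "pexp", rate = 1, exact = FALSE)   # asymptotic approximation
# If rate were estimated from x itself, neither p-value would be valid;
# a parametric bootstrap (Lilliefors-type correction) would be needed.
```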
2006 Nov 08
2
interprete wilcox.test results
Dear All,
I am using wilcox.test to compare two samples, data_a and data_b; each sample has 3 replicates, so data_a and data_b are 20*3 matrices. I then used the following to test the null hypothesis that they come from the same distribution:
wilcox.test(x=data_a, y=data_b, alternative="g")
I got a p-value of 1.90806170863311e-09.
When I switched data_a and data_b by doing the following:
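The excerpt is truncated, but swapping the two samples simply flips the one-sided alternative; a minimal sketch with simulated matrices (wilcox.test flattens them to vectors):

```r
set.seed(1)
data_a <- matrix(rnorm(60, mean = 1), 20, 3)
data_b <- matrix(rnorm(60, mean = 0), 20, 3)
p1 <- wilcox.test(x = data_a, y = data_b, alternative = "g")$p.value
p2 <- wilcox.test(x = data_b, y = data_a, alternative = "l")$p.value
all.equal(p1, p2)   # the two calls test the same one-sided hypothesis
```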
2012 May 02
1
coxph reference hazard rate
Hi,
In the following results I interpret exp(coef) as the factor that multiplies
the base hazard rate if the corresponding variable is TRUE. For example,
when the bucket is ks008 and fidelity <= 3, then the rate, compared to the
base rate h_0(t), is h(t) = 0.200 h_0(t). My question is then, to what case
does the base hazard rate correspond to? I would expect the reference to be
the first factor value, i.e. bucket jpc001 with fidelity <= 3, but its
exp(coef) is not one. I verified the contrasts, and the row corresponding to
the first...
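The baseline hazard in coxph corresponds to a subject whose model-matrix columns are all zero, i.e. every factor at its reference level; a hedged sketch (the data frame `d` and variable names follow the post and are not a real dataset):

```r
library(survival)
fit <- coxph(Surv(time, status) ~ bucket + fidelity, data = d)
# h_0(t) corresponds to bucket AND fidelity both at their reference levels;
# basehaz() with centered = FALSE returns the cumulative baseline hazard
# H_0(t) for exactly that case:
H0 <- basehaz(fit, centered = FALSE)
# The default centered = TRUE instead centers at the mean of each covariate,
# which is one reason the apparent reference need not have exp(coef) = 1.
```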
2012 Jul 06
1
How to compute hazard function using coxph.object
...L.COXPH)
# Obtaining the object "surv" from the survfit object
MODEL.COXPH.SURVFIT.SURV <- MODEL.COXPH.SURVFIT$surv
# Computing the hazard function ##
## I am pretty confused here: is the result of this a hazard function (H(t))
## or the baseline cumulative hazard function (H_0(t))? ##
Hazard.Function <- -log(MODEL.COXPH.SURVFIT.SURV)
################################################
################################################
# Baseline hazard function can also be calculated explicitly ##
MODEL.COXPH.BASELINEHAZ <- basehaz(MODEL....
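For context, -log(S(t)) from a survfit object is a cumulative hazard H(t) at the covariate values used for that curve, not the hazard itself; the cumulative baseline hazard H_0(t) comes from basehaz(). A sketch using the bundled lung data rather than the poster's objects:

```r
library(survival)
fit <- coxph(Surv(time, status) ~ age, data = lung)
sf  <- survfit(fit)                          # curve at the mean covariates
H   <- -log(sf$surv)                         # cumulative hazard H(t), not h(t)
bh  <- basehaz(fit)                          # cumulative baseline hazard H_0(t)
h   <- diff(c(0, H)) / diff(c(0, sf$time))   # crude discrete hazard estimate
```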
2002 Mar 26
3
ks.test - continuous vs discrete
I frequently want to test for differences between animal size frequency
distributions. The obvious test (I think) to use is the Kolmogorov-Smirnov
two sample test (provided in R as the function ks.test in package ctest).
The KS test is for continuous variables and this obviously includes length,
weight etc. However, limitations in measuring (e.g length to the nearest
cm/mm, weight to the nearest
2008 Aug 22
0
Re : Help on competing risk package cmprsk with time dependent covariate
...assumed as being time dependent in "fg2" ?
Question 2: both summaries give me the following output, which I don't understand at all; is there a mistake in my script?
Competing risks Model
Test for nonparametric terms
Test for non-significant effects
              sup| hat B(t)/SD(t) |   p-value H_0: B(t)=0
(Intercept)                       0                     0
random                            0                     0
Test for time invariant effects
              supremum test   p-value H_0: B(t)=b t
(Intercept)               0                       0
random                    0                       0...
2006 Oct 23
0
likelihood question not so related to R but probably requires the use of R
I have a question and it's only relation to R is that I probably need R
after I understand what to do.
Both models are delta y_t = Beta + epslion
and suppose I have a null hypothesis and alternative hypothesis
H_0: delta y_t = 0 + epsilon,     epsilon ~ Normal(0, sigmazero^2)
H_1: delta y_t = beta + epsilon,  epsilon ~ Normal(0, sigmabeta^2)
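The comparison described is a likelihood-ratio test of a zero mean against a free mean; a minimal sketch under the stated normal-error assumption (the data are simulated, not from the post):

```r
set.seed(1)
dy <- rnorm(100, mean = 0.3)            # simulated delta y_t
s0 <- sqrt(mean(dy^2))                  # MLE of sigma under H_0 (mean zero)
s1 <- sqrt(mean((dy - mean(dy))^2))     # MLE of sigma under H_1
LR <- 2 * (sum(dnorm(dy, mean(dy), s1, log = TRUE)) -
           sum(dnorm(dy, 0,        s0, log = TRUE)))
pchisq(LR, df = 1, lower.tail = FALSE)  # asymptotic chi-squared(1) p-value
```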
2003 Sep 15
1
question regarding ks.test()
...on?
Am I correct in assuming that smaller D values indicate that they come
from the same distribution? In addition how can I use the p value that
is supplied in the output?
In my code I decide (as described by Conover) by calculating the
1-alpha quantile: if D is greater than this value, H_0 is
rejected. However, I don't calculate a p-value. Is this method
significantly different from the method used in R?
Since I get the same value of D in both methods is there any reason to
prefer one over the other?
Thanks,
-------------------------------------------------------------------
Rajar...
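The two decision rules agree whenever they use the same null distribution of D; a sketch:

```r
set.seed(1)
x   <- rnorm(50)
res <- ks.test(x, "pnorm")
res$p.value < 0.05   # rule 1: reject when the p-value is below alpha
res$statistic        # the same D enters rule 2
# Rule 2 (Conover): reject when D exceeds the 1 - alpha quantile of the null
# distribution of D. The decisions coincide up to the difference between
# exact and asymptotic null distributions of D.
```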
2009 May 31
1
Bug in truncgof package?
...2
0.21 0.22 0.26 0.27 0.28 0.30 0.31 0.32 0.33 0.36 0.38 0.40 0.44 0.49 0.54 0.55
   2    2    1    3    1    2    1    1    1    2    1    2    1    1    2    1
0.56 0.57 0.62 0.70 0.76 0.78 0.96 0.98
   1    2    1    1    1    1    1    1
That is, in 45% of the cases you would reject the hypothesis H_0,
which happens to be true, at the "standard" 5% significance level.
Do you think this behaviour is buggy? If so, given that the maintainer
does not seem to be contactable, what would be the next step to take?
Best regards,
Carlos J. Gil Bellosta
http://www.datanalytics.com
2005 Jun 10
1
Estimate of baseline hazard in survival
Dear All,
I'm having just a little terminology problem, relating the language used in
the Hosmer and Lemeshow text on Applied Survival Analysis to that of the
help that comes with the survival package.
I am trying to back out the values for the baseline hazard, h_0(t_i), for
each event time or observation time.
Now survfit(fit)$surv gives me the value of the survival function,
S(t_i|X_i,B),
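One way to back out the cumulative baseline hazard at each event time, and a crude per-interval baseline hazard, sketched on the bundled lung data (Hosmer and Lemeshow's h_0(t_i) corresponds to the increments):

```r
library(survival)
fit <- coxph(Surv(time, status) ~ age + sex, data = lung)
bh  <- basehaz(fit, centered = FALSE)   # cumulative H_0(t_i) at event times
h0  <- diff(c(0, bh$hazard))            # increments: discrete baseline hazard
head(cbind(time = bh$time, h0))
```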
2006 May 21
3
normality testing with nortest
I don't know from the nortest package, but it should ***always***
be the case that you test hypotheses
H_0: The data have a normal distribution.
vs.
H_a: The data do not have a normal distribution.
So if you get a p-value < 0.05 you can say that
***there is evidence***
(at the 0.05 significance level) that the data are not from a
normal distribution.
If the nortest package does it differentl...
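The asymmetry matters in practice: a small p-value is evidence against normality, but a large one is not evidence for it. A sketch with shapiro.test from base stats (any nortest test is read the same way):

```r
set.seed(1)
shapiro.test(rexp(100))$p.value    # clearly non-normal data: small p, reject H_0
shapiro.test(rnorm(100))$p.value   # normal data: large p, which only means
                                   # "no evidence against H_0", not "H_0 is true"
```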
2010 Nov 13
2
interpretation of coefficients in survreg AND obtaining the hazard function for an individual given a set of predictors
...y h_i(t) =
exp(\beta'x_i)*\lambda*\gamma*t^{\gamma-1}, it doesn't look like to me that
predict(mwa, type='linear') is \beta'x_i.
b) since I need the coefficient intercept from the model to obtain the scale
parameter to obtain the base hazard function as defined in Collett
(h_0(t)=\lambda*\gamma*t^{\gamma-1}), I am concerned that this coefficient
intercept changes depending on the reference level of the factor entered in the
model. The change is very important when I have more than one predictor in the
model.
Any help would be greatly appreciated,
David Biau....
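Under the Weibull model, survreg's AFT parametrization can be converted to Collett's h_0(t) = \lambda*\gamma*t^{\gamma-1}; a sketch on the bundled lung data (the conversion formulas are standard, the fit itself is only illustrative):

```r
library(survival)
fit    <- survreg(Surv(time, status) ~ age, data = lung, dist = "weibull")
gamma  <- 1 / fit$scale                             # Weibull shape
lambda <- exp(-coef(fit)[["(Intercept)"]] * gamma)  # Collett's baseline scale
# PH coefficients from AFT coefficients: beta_PH = -beta_AFT / scale, which
# is also why the intercept (and hence lambda) shifts with the reference level.
beta_ph <- -coef(fit)[-1] / fit$scale
```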
2006 Nov 07
1
gamm(): nested tensor product smooths
...mixed model representation, where X represents the unpenalized part of the spline functions and Z the "wiggly" parts, this would be:
y=X%*%beta+ Z_1%*%b_1+ Z_2%*%b_2
vs
y=X%*%beta+ Z_1%*%b_1+ Z_2%*%b_2 + Z_12 %*% b_12
where b are random effect vectors and the hypothesis to be tested is
H_0: Var(b_12)=0 (<=> b_12_i == 0 for all i)
the problem:
gamm() does not seem to support the use of nested tensor product splines,
does anybody know how to work around this?
example code: (you'll need to source the P-spline constructor from ?p.spline beforehand)
###########
test1<-fun...
2010 Nov 15
1
interpretation of coefficients in survreg AND obtaining the hazard function
...y h_i(t) = exp(\beta'x_i)*\lambda*\gamma*t^{\gamma-1}, it doesn't look
like to me that predict(mwa, type='linear') is \beta'x_i.
b) since I need the coefficient intercept from the model to obtain the scale
parameter to obtain the base hazard function as defined in Collett
(h_0(t)=\lambda*\gamma*t^{\gamma-1}), I am concerned that this coefficient
intercept changes depending on the reference level of the factor entered in
the model. The change is very important when I have more than one predictor
in the model.
Any help would be greatly appreciated,
David Biau.
2010 Nov 16
1
Re : interpretation of coefficients in survreg AND obtaining the hazard function for an individual given a set of predictors
..._i(t) = F_0( exp((t+beta'x_i)/scale) )
So you need to multiply by the scale parameter and change sign to get
the log hazard ratios.
> b) since I need the coefficient intercept from the model to obtain the scale
> parameter to obtain the base hazard function as defined in Collett
> (h_0(t)=\lambda*\gamma*t^{\gamma-1}), I am concerned that this coefficient
> intercept changes depending on the reference level of the factor entered in the
> model. The change is very important when I have more than one predictor in the
> model.
As Terry Therneau pointed out recently in the c...
2003 Feb 17
0
Re: R-help digest, Vol 1 #80 - 14 msgs
...ng for the effect of
> some explanatory variable in arima models.
> I performed three different simulations
> y<-2+arima.sim(n, model=list(ar=ar.p,ma=0),sd=1)
> with different ar parameter, namely 0.7, 0.5, and 0.2. n=100.
>
> Out of 1000 replications performed for each ar, the H_0 (no effect of
> x<-1:n) was rejected 77, 67 and 53 times with ar=0.7, 0.5 and 0.2
> respectively. (sigma was assumed known in calculating the statistic test and
> the nominal significance level is 0.05)
>
> That is, the LRT seems to become somewhat anti-conservative when the ar
>...
2003 Jul 14
1
gam and step
hello,
I am looking for a step() function for GAM's.
In the book Statistical Computing by Crawley, a removal of predictors has
been done "by hand":
model <- gam(y ~s(x1) +s(x2) + s(x3))
summary(model)
model2 <- gam(y ~s(x2) + s(x3)) # removal of the non-significant variable
#then compare these two models to see whether a significant change occurs.
anova(model, model2,
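There is no step() method for gam, but the hand comparison above can be completed with anova(); a sketch using mgcv and simulated data (Crawley's book uses an older gam implementation, so details may differ):

```r
library(mgcv)
set.seed(1)
d <- data.frame(x1 = runif(200), x2 = runif(200), x3 = runif(200))
d$y <- sin(2 * d$x2) + d$x3^2 + rnorm(200, sd = 0.2)   # x1 has no real effect
m1 <- gam(y ~ s(x1) + s(x2) + s(x3), data = d)
m2 <- gam(y ~ s(x2) + s(x3), data = d)      # drop the non-significant term
anova(m2, m1, test = "F")   # approximate F test; a large p-value supports m2
```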