similar to: Making Gehan-Breslow test for Survival data

Displaying 20 results from an estimated 900 matches similar to: "Making Gehan-Breslow test for Survival data"

2008 Apr 21
1
Analysis of Epidemiological Data Using R
Hi everyone, I'm studying the manual titled Analysis of Epidemiological Data Using R and Epicalc, written by Virasakdi Chongsuvivatwong and Edward McNeil, and I can't find the data files they use in some examples; these are the names: Chapter7.Rdata, Chapter8.Rdata, Chapter9.Rdata. Can somebody tell me how I can get these files? Thanks! José Bustos M.
2008 Jul 03
1
R-help Digest, Vol 65, Issue 4
Hi everyone, we are looking for some data sets for working with relative mortality risk, so does someone know where I can find the data.mgus dataset? It uses 1384 records from Minnesota. Thanks! José Bustos M., Master in Applied Statistics Program, University of Concepción
2006 Mar 07
1
breslow estimator for cumulative hazard function
Dear R-users, I am checking the proportional hazards assumption of a Cox model for a given covariate, say Z1, after adjusting for other relevant covariates in the model. To this end, I fitted a Cox model stratified on the discrete values of Z1 and tried to get the Breslow estimator of the baseline cumulative hazard function (H(t)) in each stratum. As far as I know, if the proportionality assumption
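A minimal sketch of pulling the Breslow estimate of the baseline cumulative hazard out of a coxph fit stratified on a discrete covariate; the survival package's lung data and the covariates used here are stand-ins for the poster's Z1 and adjusting variables:

    library(survival)
    # Stratify on the discrete covariate (sex stands in for Z1) while
    # adjusting for the remaining covariates
    fit <- coxph(Surv(time, status) ~ age + ph.ecog + strata(sex), data = lung)
    # Breslow estimate of the baseline cumulative hazard, one curve per stratum
    H0 <- basehaz(fit, centered = FALSE)
    head(H0)   # columns: hazard, time, strata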
2012 Aug 31
2
test Breslow-Day for svytable??
Hi all, I want to know how to perform the Breslow-Day test for homogeneity of odds ratios (OR) stratified for a svytable. This test is obtained with the following code: epi.2by2(dat = daty, method = "case.control", conf.level = 0.95, units = 100, homogeneity = "breslow.day", verbose = TRUE) where "daty" is a table of class svytable, but
2005 Jan 11
2
Breslow Day Test
Breslow-Day test A statistical test for the homogeneity of odds ratios. Homogeneity In systematic reviews, homogeneity refers to the degree to which the results of studies included in a review are similar. "Clinical homogeneity" means that, in studies included in a review, the participants, interventions and outcome measures are similar or comparable.
2013 Apr 15
2
Convert results from print(survfit(formula, ...)) into a matrix or data frame
Hello All, Below is some sample survival analysis code. I'd like to be able to get the results from print(gehan.surv) into a matrix or data frame, so I can manipulate them and then create a table using odfWeave. Trouble is, I'm not quite sure how to make such a conversion using the results from a print method. Is there some simple way of doing this? Thanks, Paul require(survival)
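One way to do this, sketched here with the MASS gehan data standing in for the poster's setup, is to take the table component of summary() instead of capturing the printed output:

    library(survival)
    library(MASS)   # for the gehan data
    gehan.surv <- survfit(Surv(time, cens) ~ treat, data = gehan)
    # The figures shown by print(gehan.surv) are stored in the summary object
    tab <- summary(gehan.surv)$table
    as.data.frame(tab)   # records, events, median, confidence limits, ...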
2005 Oct 18
4
Efficient ways of finding functions and Breslow-Day test for homogeneity of the odds ratio
Dear all, I have been trying to find a function to calculate the Breslow-Day test for homogeneity of the odds ratio in R. I know the test can be performed in SAS but I was wondering if anyone could help me to perform this in R. In addition I have the fullrefman file to search for functions in the basic R packages; does anyone have any suggestions of an efficient way of searching for
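For the searching part, help.search() (and RSiteSearch() for the mailing-list archives) scans installed documentation by keyword; for the test itself, one packaged implementation that appeared after this post is DescTools::BreslowDayTest, sketched below with made-up counts:

    # Search installed packages for relevant functions
    help.search("odds ratio homogeneity")

    # Breslow-Day test on a 2 x 2 x K array of stratified counts
    library(DescTools)
    x <- array(c(10, 5, 8, 12,
                 7, 9, 6, 11), dim = c(2, 2, 2))
    BreslowDayTest(x)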
2002 Dec 17
1
Breslow Day Test
Hello everyone, Does anyone know if I can do the Breslow-Day test for the homogeneity of odds ratios in R? Thanks! - Jacqueline
2010 Nov 16
1
Breslow-Day test
Dear R Users, I'm looking for a package that allows testing the hypothesis of homogeneity of odds ratios in k 2x2 tables. I know that the Breslow-Day test is suitable, but could anybody point me to a package? I found diffR, but as far as I can see this package is for IRT theory. Best, Robert
2008 May 05
1
proportional test on epicalc library vs. Jerrold H. Zar.
Hi everyone, I'm working with the Epicalc library, specifically using the power test for proportions. I don't think this test works as in the book Biostatistical Analysis (4th Edition) by Jerrold H. Zar. In example 23.25 (I attach a picture of it), it's not the same answer. Using the following command doesn't give the same answer. library(epicalc) power.for.2p(0.75, 0.50, 50, 45, alpha =
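As a cross-check against Epicalc, base R's power.prop.test covers the same two-proportion power calculation, though it assumes equal group sizes, so it will not reproduce the unequal n1 = 50, n2 = 45 case exactly:

    # Power for comparing p1 = 0.75 vs p2 = 0.50 with n = 50 per group
    power.prop.test(n = 50, p1 = 0.75, p2 = 0.50, sig.level = 0.05)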
2008 Jul 03
2
Relative Mortality Risk second part
Hi everyone, we are looking for some data sets for working with relative mortality risk, so does someone know where I can find the data.mgus dataset? It uses 1384 records from Minnesota. This data set is used in: Robert A. Kyle, Terry M. Therneau, S. Vincent Rajkumar, Janice R. Offord, Dirk R. Larson, Matthew F. Plevak, and L. Joseph Melton III. A long-term study of prognosis in
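The MGUS follow-up data from the Mayo Clinic study by Kyle and Therneau ships with the survival package as the mgus and mgus2 datasets, so a sketch along these lines should locate it (which of the two matches the 1384-record description is worth checking):

    library(survival)
    data(mgus2)    # MGUS natural-history cohort
    nrow(mgus2)    # check the record count against the paper
    head(mgus2)
    ?mgus2         # documentation cites the source study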
2018 Feb 14
2
Fleming-Harrington weighted log rank test
Hi all, The survdiff() from the survival package has an argument "rho" that implements the Fleming-Harrington weighted log-rank test. But according to several sources, including the "survminer" package (https://cran.r-project.org/web/packages/survminer/vignettes/Specifiying_weights_in_log-rank_comparisons.html), the Fleming-Harrington weighted log-rank test should have 2 parameters
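A short sketch of what survdiff's rho argument does cover: it implements the one-parameter G-rho family, i.e. Fleming-Harrington weights FH(rho, 0), so the general two-parameter FH(p, q) test needs a different tool:

    library(survival)
    # rho = 0: ordinary log-rank test
    survdiff(Surv(time, status) ~ sex, data = lung, rho = 0)
    # rho = 1: Peto-Peto style weighting, i.e. FH(1, 0)
    survdiff(Surv(time, status) ~ sex, data = lung, rho = 1)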
2009 Mar 10
2
simple question beginner
Hi there, I am a beginner in R and I have a basic question. Suppose I run a common procedure such as a t test or a Cox model like the one below: out <- coxph(Surv(tstart, tstop, death1) ~ x1 + x1:log(tstop + 1), data = test1, method = "breslow") which yields the following result: Call: coxph(formula = Surv(tstart, tstop, death1) ~ x1 + x1:log(tstop + 1), data = test1, method =
2018 Feb 15
0
Fleming-Harrington weighted log rank test
> On Feb 13, 2018, at 4:02 PM, array chip via R-help <r-help at r-project.org> wrote: > > Hi all, > > The survdiff() from the survival package has an argument "rho" that implements the Fleming-Harrington weighted log rank test. > > But according to several sources including the "survminer" package
2013 Apr 11
3
odfWeave: Some questions about potential formatting options
Hello All, Learning to use the odfWeave package. I really like the package. It has good documentation, makes some very nice looking tables, and seems to have lots of options for customizing output. There are a few things I'd like to do that don't seem to be covered in the documentation though. So I'm not sure if they're possible or not. Here's a list of some things I'd
2018 Feb 15
1
Fleming-Harrington weighted log rank test
> On Feb 14, 2018, at 5:26 PM, David Winsemius <dwinsemius at comcast.net> wrote: > >> >> On Feb 13, 2018, at 4:02 PM, array chip via R-help <r-help at r-project.org> wrote: >> >> Hi all, >> >> The survdiff() from the survival package has an argument "rho" that implements the Fleming-Harrington weighted log rank test. >>
2009 Aug 19
2
Problem with predict.coxph
We occasionally use the coxph function in the survival library to fit multinomial logit models (the Breslow method produces the same likelihood function as the multinomial logit). We then use the predict function to create summary results for various combinations of covariates. For example:
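The poster's own example is cut off above; purely as a generic illustration (the lung data and covariates here are not theirs), predict.coxph can return the linear predictor or relative risk for new covariate combinations:

    library(survival)
    fit <- coxph(Surv(time, status) ~ age + sex, data = lung, method = "breslow")
    nd <- expand.grid(age = c(50, 70), sex = c(1, 2))   # hypothetical combinations
    predict(fit, newdata = nd, type = "lp")     # linear predictor
    predict(fit, newdata = nd, type = "risk")   # relative risk vs the reference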
2009 Feb 25
3
survival::survfit,plot.survfit
I am confused when trying the function survfit. My question is: what does the survival curve given by plot.survfit mean? Is it the survival curve with different covariates at different points, or just the baseline survival curve? For example, I run the following code and get the survival curve: #### library(survival) fit <- coxph(Surv(futime, fustat) ~ resid.ds + rx + ecog.ps, data = ovarian)
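A sketch of the distinction in question, using the same ovarian model: the default survfit curve from a coxph fit is the predicted curve for a single reference covariate pattern (by default built from covariate means), not a per-subject curve, while newdata gives the curve for a covariate pattern you choose:

    library(survival)
    fit <- coxph(Surv(futime, fustat) ~ resid.ds + rx + ecog.ps, data = ovarian)
    # Default: predicted curve for the reference covariate pattern
    plot(survfit(fit))
    # Curve for an explicitly chosen covariate pattern
    nd <- data.frame(resid.ds = 1, rx = 1, ecog.ps = 1)
    lines(survfit(fit, newdata = nd), col = "red")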
2006 Oct 03
2
maybe use voronoi.findrejectsites?
Hi all members, please, I need your help. I'm now working with Voronoi polygons in a projected area, but I need to cut off the leftover polygons; in other words, I need to clip the polygons to the working area. The R help says to use the command voronoi.findrejectsites, but for this command I need to supply the boundary numbers, and in any case this command does not do the clipping. Can you help me? Thank you for helping me! José Bustos
2007 Aug 06
1
(Censboot, Z-score, Cox) How to use Z-score as the statistic within censboot?
Dear R Help list, My question is regarding extracting the standard error or Z-score from a cph or coxph call. My Cox model is: modz <- cph(Surv(TSURV, STATUS) ~ RAGE + DAGE + REG_WTIME_M + CLD_ISCH + POLY_VS, data = kidneyT, method = "breslow", x = TRUE, y = TRUE) I've used names(modz) but can't see anything that will let me extract the Z scores for each coefficient or the standard errors in the same
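For a coxph fit, the standard errors and Z scores sit in the coefficient matrix of summary(); for an rms::cph fit, the variance-covariance matrix is available through vcov(). A sketch with the survival package's lung data standing in for the poster's kidneyT:

    library(survival)
    fit <- coxph(Surv(time, status) ~ age + sex, data = lung, method = "breslow")
    ctab <- summary(fit)$coefficients   # coef, exp(coef), se(coef), z, Pr(>|z|)
    ctab[, "se(coef)"]                  # standard errors
    ctab[, "z"]                         # Z scores
    # For a cph fit, standard errors can be recovered as sqrt(diag(vcov(modz)))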