similar to: Post-hoc t-tests in 2-way repeated measure ANOVA

Displaying 20 results from an estimated 1000 matches similar to: "Post-hoc t-tests in 2-way repeated measure ANOVA"

2004 Aug 08
1
(REPOST) Simple main effects in 2-way repeated measure ANOVA
Hi all, I am running a 2-way repeated-measures ANOVA with 1 between-subjects factor (Group = treatment, control) and 1 within-subject factor (Time of measurement: time1, time2). I extract the results of the ANOVA with: summary(aov(effect ~ Group*Time + Error(Subj/Time), data=mydata)) Now, this must clearly be a dumb question, but how can I quickly extract in R all the post-hoc t-tests for the
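One possible approach, as a rough sketch: fix the Error() term (it is a function call, not an argument) and follow up with Bonferroni-adjusted pairwise comparisons. The toy data below are made up, and the unpaired pairwise.t.test() ignores the within-subject pairing, so treat it as illustrative only.

    set.seed(1)
    mydata <- expand.grid(Subj = factor(1:20), Time = factor(c("time1", "time2")))
    mydata$Group <- factor(ifelse(as.integer(mydata$Subj) <= 10, "treatment", "control"))
    mydata$effect <- rnorm(nrow(mydata)) +
      (mydata$Group == "treatment") * (mydata$Time == "time2")
    # repeated-measures ANOVA: note Error(Subj/Time), not Error=Subj/Time
    fit <- aov(effect ~ Group * Time + Error(Subj/Time), data = mydata)
    summary(fit)
    # all pairwise t-tests across the Group x Time cells, Bonferroni-adjusted
    with(mydata, pairwise.t.test(effect, interaction(Group, Time),
                                 p.adjust.method = "bonferroni"))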
2006 Feb 15
2
arrays of lists in R ("cell arrays" in Matlab)
Dear all, I would like to have some data in the form of a 2-dimensional array (matrix) of lists, so that I can easily find the desired list object by indexing the structure by rows and columns. In Matlab there exists a data type called "cell array": a matrix of "cells", which are composite objects very similar to R lists. I know that in R you can create 1-dimensional
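One rough sketch of a Matlab-style cell array in R is a matrix whose elements form a list; the cell contents below are arbitrary examples:

    # a 2 x 3 matrix of "cells"; each cell can hold an arbitrary R object
    m <- matrix(vector("list", 2 * 3), nrow = 2, ncol = 3)
    m[[1, 2]] <- list(name = "a", values = 1:5)
    m[[2, 3]] <- data.frame(x = 1:3, y = c("u", "v", "w"))
    str(m[[1, 2]])   # index a single cell with [[row, col]]
    m[1, ]           # a whole row comes back as a list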
2004 Oct 15
1
power in a specific frequency band
Dear R users I have a really simple question (hoping for a really simple answer :-): Having estimated the spectral density of a time series "x" (heart rate data) with: x.pgram <- spectrum(x,method="pgram") I would like to compute the power in a specific energy band. Assuming that frequency(x)=4 (Hz), and that I am interested in the band between f1 and f2, is the
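A rough sketch of one way to approximate the power in a band from the $freq and $spec components that spectrum() returns; the toy series, sampling rate and band limits are made up:

    set.seed(1)
    x <- ts(rnorm(512), frequency = 4)             # toy series "sampled" at 4 Hz
    sp <- spectrum(x, method = "pgram", plot = FALSE)
    f1 <- 0.04; f2 <- 0.15                         # hypothetical band limits (Hz)
    in_band <- sp$freq >= f1 & sp$freq <= f2
    band_power <- sum(sp$spec[in_band]) * diff(sp$freq[1:2])  # spec x bin width
    band_power

A trapezoidal rule over the same (freq, spec) pairs (see the sketch under "integration of a discrete function" below) would be marginally more accurate.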
2007 Jan 19
4
Newbie question: Statistical functions (e.g., mean, sd) in a "transform" statement?
Greetings listeRs - Given a data frame such as
times
      time1    time2     time3    time4
1 70.408543 48.92378  7.399605 95.93050
2 17.231940 27.48530 82.962916 10.20619
3 20.279220 10.33575 66.209290 30.71846
4        NA 53.31993 12.398237 35.65782
5  9.295965       NA 48.929201       NA
6 63.966518 42.16304  1.777342       NA
one can use "transform" to
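The usual sticking point, sketched below: inside transform() a call such as mean(time1) returns a single column-wise value (recycled down the rows), so per-row summaries need row-wise functions like rowMeans(). The toy data roughly mirror the frame above:

    times <- data.frame(time1 = c(70.41, 17.23, 20.28, NA, 9.30, 63.97),
                        time2 = c(48.92, 27.49, 10.34, 53.32, NA, 42.16),
                        time3 = c(7.40, 82.96, 66.21, 12.40, 48.93, 1.78),
                        time4 = c(95.93, 10.21, 30.72, 35.66, NA, NA))
    # per-row mean of the four time columns
    transform(times, row_mean = rowMeans(times, na.rm = TRUE))
    # a column-wise statistic is simply recycled to every row
    transform(times, time1_c = time1 - mean(time1, na.rm = TRUE))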
2010 Nov 15
3
merge two dataset and replace missing by 0
Hi r users, I have two data sets (X1, X2). For example, time1<-c( 0, 8, 15, 22, 43, 64, 85, 106, 127, 148, 169, 190 ,211 ) outpue1<-c(171 ,164 ,150 ,141 ,109 , 73 , 47 ,26 ,15 ,12 ,6 ,2 ,1 ) X1<-cbind(time1,outpue1) time2<-c( 0 ,8 ,15 , 22 ,43 , 64 ,85 ,106 ,148) output2<-c( 5 ,5 ,4 ,5 ,5 ,4 ,1 ,2 , 1 ) X2<-cbind(time2,output2) I want to
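A minimal sketch: build data frames, merge() with all = TRUE (a full outer join on the time column), then overwrite the NAs; the vectors are shortened versions of the ones in the question.

    X1 <- data.frame(time = c(0, 8, 15, 22, 43, 64),
                     output1 = c(171, 164, 150, 141, 109, 73))
    X2 <- data.frame(time = c(0, 8, 15, 43),
                     output2 = c(5, 5, 4, 5))
    m <- merge(X1, X2, by = "time", all = TRUE)   # keep every time point
    m[is.na(m)] <- 0                              # replace missing with 0
    m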
2008 Apr 09
2
fuzzy merge
Hi, I would like to merge two data frames. It is just that I want the merging to be done with some kind of a fuzzy criterion. Let me explain. My first data frame looks like this:
ID1 time1            dt
1   2008-01-02 13:11 10
2   2008-01-02 14:20 20
3
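A rough nearest-neighbour sketch, assuming "fuzzy" means pairing each time1 with the closest time2 lying within some tolerance; the second data frame, its column names and the 30-minute tolerance are all invented for illustration:

    df1 <- data.frame(ID1 = 1:3,
                      time1 = as.POSIXct(c("2008-01-02 13:11", "2008-01-02 14:20",
                                           "2008-01-02 15:05")),
                      dt = c(10, 20, 30))
    df2 <- data.frame(ID2 = 1:2,
                      time2 = as.POSIXct(c("2008-01-02 13:15", "2008-01-02 15:00")),
                      val = c("a", "b"))
    tol <- 30 * 60                                  # hypothetical tolerance: 30 minutes
    idx <- sapply(seq_len(nrow(df1)), function(i) {
      d <- abs(as.numeric(difftime(df2$time2, df1$time1[i], units = "secs")))
      if (min(d) <= tol) which.min(d) else NA
    })
    fuzzy <- cbind(df1, df2[idx, ])                 # unmatched rows come back as NA
    fuzzy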
2010 Sep 02
2
date
Hello all, I have 2 strings representing the start and end values of a date and time. For example, time1 <- c("21/04/2005","23/05/2005","11/04/2005") time2 <- c("15/07/2009", "03/06/2008", "15/10/2005") as.difftime(time1,time2) Time differences in secs [1] NA NA NA attr(,"tzone") [1] "" How can I
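The NAs appear because as.difftime() is not meant for this (its second argument is a format string, and it does not parse "dd/mm/yyyy" dates); converting to Date first is one simple fix, sketched below:

    time1 <- c("21/04/2005", "23/05/2005", "11/04/2005")
    time2 <- c("15/07/2009", "03/06/2008", "15/10/2005")
    d1 <- as.Date(time1, format = "%d/%m/%Y")
    d2 <- as.Date(time2, format = "%d/%m/%Y")
    difftime(d2, d1, units = "days")   # element-wise differences in days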
2010 Jan 16
2
Extracting only Unique Rows based on only 1 Column
To Whoever Is Interested, I have spent several days searching the web, help files, the R wiki and the archives of this mailing list for a solution to this problem, but nonetheless I apologize in advance if I have missed something obvious. The problem is this: I have a 5-column data frame with about 4.2 million rows, and want to create a new (and hopefully much smaller) data frame that
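A minimal sketch with duplicated(); the toy frame and the key-column name "ID" stand in for the real 4.2-million-row data:

    big <- data.frame(ID = c(1, 1, 2, 3, 3, 3), value = rnorm(6))
    small <- big[!duplicated(big$ID), ]   # keeps the first row seen for each ID
    small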
2007 Dec 16
4
improving a bar graph
Hello, Below is the code for a basic bar graph. I was seeking advice regarding the following: (a) For each time period there are values from 16 people. How can I change the colour value so that each person has a different colour, which recurs across each of the three graphs/time periods? (b) I have seen much more sophisticated examples using lattice (e.g. each person has a separate
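A rough sketch for point (a) only: with beside = TRUE, barplot() recycles col within each group, so a vector of 16 colours gives one colour per person that repeats across the time periods (the values below are made up):

    set.seed(1)
    vals <- matrix(runif(16 * 3, 1, 10), nrow = 16,
                   dimnames = list(paste0("P", 1:16), c("T1", "T2", "T3")))
    cols <- rainbow(16)                      # one colour per person
    barplot(vals, beside = TRUE, col = cols,
            legend.text = rownames(vals),
            args.legend = list(cex = 0.5, ncol = 2))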
2002 Apr 20
2
integration of a discrete function
Dear R list I am looking for a function in R that computes the integration of a discrete curve, such as a power spectrum, in a specified interval (in my case, that would be 'power in a certain frequency band'). I found only functions, such as 'integrate', that perform adaptive quadrature on analytic functions, and not on a curve specified as a set of (x,y) pairs. I have the
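Base R has no ready-made trapz(), but the trapezoidal rule over the stored (x, y) pairs is only a few lines; a sketch restricted to a band [f1, f2], checked against a known integral:

    trapz_band <- function(x, y, f1, f2) {
      keep <- x >= f1 & x <= f2
      xs <- x[keep]; ys <- y[keep]
      sum(diff(xs) * (head(ys, -1) + tail(ys, -1)) / 2)   # trapezoidal rule
    }
    xx <- seq(0, pi, length.out = 200)
    trapz_band(xx, sin(xx), 0, pi)   # integral of sin on [0, pi] is 2

Contributed packages (for example caTools) also provide a ready-made trapz().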
2011 Jul 19
1
Measuring and comparing .C and .Call overhead
Further pursuing my curiosity to measure the efficiency of the R/C++ interface, I conducted a simple matrix-vector multiplication test using the .C and .Call functions in R. In each case, I measured the execution time in R, as well as inside the C++ function. Subtracting the two, I came up with a measure of the overhead associated with each call. I assume that this overhead would be non-existent if the entire
2009 Sep 11
1
What determines the unit of POSIXct differences?
Dear All, what determines if a difference between POSIXct objects gets expressed in days or seconds? In the following example, it's sometimes seconds, sometimes days. as.POSIXct('2009-09-01') - as.POSIXct(NA) Time difference of NA secs c(as.POSIXct('2009-09-01'), as.POSIXct(NA)) - c(as.POSIXct('2009-09-01'), as.POSIXct('2009-08-31')) Time differences in
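The `-` method for date-times picks the units automatically from the size of the difference (seconds, minutes, hours, days or weeks), and an all-NA difference is reported in seconds, as in the example above; difftime() lets you fix the units explicitly. A short sketch:

    t1 <- as.POSIXct("2009-09-01")
    t2 <- as.POSIXct("2009-08-31")
    t1 - t2                                    # units chosen automatically
    difftime(t1, t2, units = "secs")           # always seconds
    as.numeric(difftime(t1, t2, units = "days"))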
2011 Sep 26
3
survival analysis: interval censored data
hello: my data looks like:
time1  time2  event  catagoria
2004   2006   1      C
2004   2005   0      C
2005   2010   1      E
2007   2009   1      C
2006   2007   0      E
2008   2010   0      C
2008   2010   1      E
... and the census interval is 1 year. I have tried this
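A rough sketch with survival::survreg() on a type = "interval2" Surv object, under the assumption that event == 1 means the event occurred somewhere in (time1, time2] and event == 0 means still event-free at time2 (right-censored); that coding may not match the original study:

    library(survival)
    d <- data.frame(time1 = c(2004, 2004, 2005, 2007, 2006, 2008, 2008),
                    time2 = c(2006, 2005, 2010, 2009, 2007, 2010, 2010),
                    event = c(1, 0, 1, 1, 0, 0, 1),
                    catagoria = c("C", "C", "E", "C", "E", "C", "E"))
    # interval2 coding: (lower, upper) if interval-censored, (lower, NA) if right-censored
    d$lower <- ifelse(d$event == 1, d$time1, d$time2)
    d$upper <- ifelse(d$event == 1, d$time2, NA)
    fit <- survreg(Surv(lower, upper, type = "interval2") ~ catagoria,
                   data = d, dist = "weibull")
    summary(fit)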
2004 Jul 04
2
Random intercept model with time-dependent covariates, results different from SAS
Dear list-members I am new to R and a statistics beginner. I really like the ease with which I can extract and manipulate data in R, and would like to use it primarily. I've been learning by checking analyses that have already been run in SAS. In an experiment, Y is a response variable, group a 2-level between-subject factor, and time a 5-level within-subject factor. 2
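A rough sketch of a random-intercept fit with nlme::lme() on simulated long-format data; the variable names follow the description and the data are made up. Differences from SAS PROC MIXED often come down to the assumed covariance structure, contrast coding and denominator degrees of freedom rather than to the estimates themselves.

    library(nlme)
    set.seed(1)
    d <- expand.grid(subject = factor(1:20), time = factor(1:5))
    d$group <- factor(ifelse(as.integer(d$subject) <= 10, "A", "B"))
    d$Y <- 2 + 0.5 * as.integer(d$time) + (d$group == "B") + rnorm(nrow(d))
    fit <- lme(Y ~ group * time, random = ~ 1 | subject, data = d)
    summary(fit)
    anova(fit)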
2012 Nov 30
1
help on "stacking" matrices up
Dear All, # I have the following code Dose<-1000 Tinf <-0.5 INTERVAL <-8 TIME8 <-matrix(c((0*INTERVAL):(1*INTERVAL))) TIME7 <-matrix(c((0*INTERVAL):(2*INTERVAL))) TIME6 <-matrix(c((0*INTERVAL):(3*INTERVAL))) TIME5 <-matrix(c((0*INTERVAL):(4*INTERVAL))) TIME4 <-matrix(c((0*INTERVAL):(5*INTERVAL))) TIME3 <-matrix(c((0*INTERVAL):(6*INTERVAL))) TIME2
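One reading of "stacking up", as a rough sketch: build the TIME matrices in a list (which also avoids the copy-and-paste) and put them on top of one another with do.call(rbind, ...):

    INTERVAL <- 8
    # the k-th element runs 0:(k * INTERVAL), matching TIME8, TIME7, ..., TIME1 in turn
    times <- lapply(1:8, function(k) matrix(0:(k * INTERVAL)))
    stacked <- do.call(rbind, times)   # one long one-column matrix
    dim(stacked)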
2007 Dec 31
3
Survival analysis with no events in one treatment group
I'm trying to fit a Cox proportional hazards model to some hospital admission data. About 25% of the patients have had at least one admission, and of these, 40% have had two admissions within the 12 month period of the study. Each patient has had one of 4 treatments, and one of the treatment groups has had no admissions for the period. I used:
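A small made-up illustration of what goes wrong: with no events in one arm the partial likelihood is monotone in that coefficient, so coxph() reports an extreme estimate with a huge standard error (and typically a "beta may be infinite" warning). Firth-type penalized Cox fits, for example via the coxphf package, are one common remedy.

    library(survival)
    set.seed(1)
    d <- data.frame(time = rexp(40, rate = 0.1),
                    status = c(rbinom(30, 1, 0.5), rep(0, 10)),  # group D has no events
                    treatment = factor(rep(c("A", "B", "C", "D"), each = 10)))
    fit <- coxph(Surv(time, status) ~ treatment, data = d)
    summary(fit)   # look at the treatmentD row: huge |coef| and SE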
2013 Mar 21
1
All unique combinations
Dear all, I would like to have all unique combinations in the following matrix TimeIndex <- rbind(c(1,"Week_of_21_07-29_03"), c(2,"Thursday_21_03"), c(3,"Friday_22_03"), c(4,"Saturday_23_03"), c(5,"Sunday_24_03"), c(6,"Monday_25_03"), c(7,"Tuesday_26_03"),
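Read as "all unique unordered pairs of the labels", combn() gives them directly; a short sketch using the first few rows of the matrix above:

    TimeIndex <- rbind(c(1, "Week_of_21_07-29_03"),
                       c(2, "Thursday_21_03"),
                       c(3, "Friday_22_03"),
                       c(4, "Saturday_23_03"))
    pairs <- t(combn(TimeIndex[, 2], 2))   # one unique pair per row
    pairs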
2008 Oct 28
3
Dose Profile
Hi Everyone, I have data in a long format e.g. there is one row per patient but each follow-up appointment is included in the row. So, a snippet of the data looks like this:
TrialNo  Drug Sex  Rand Adate1     Date1 Dose1 Time1 Adate2     Date2 Dose2 Time2
B1001029 LTG  M   15719 30/04/2003 15825   150   106 29/08/2003 15946   200   227
B1117003 LTG  M   15734 30/04/2003 15825   200    91 03/09/2003 15951   250   217
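A rough sketch of getting one row per visit with base reshape(); the two-row data frame reuses the snippet above and only the repeated columns are varied:

    d <- data.frame(TrialNo = c("B1001029", "B1117003"),
                    Drug = "LTG", Sex = "M", Rand = c(15719, 15734),
                    Adate1 = c("30/04/2003", "30/04/2003"),
                    Date1 = c(15825, 15825), Dose1 = c(150, 200), Time1 = c(106, 91),
                    Adate2 = c("29/08/2003", "03/09/2003"),
                    Date2 = c(15946, 15951), Dose2 = c(200, 250), Time2 = c(227, 217))
    long <- reshape(d, direction = "long",
                    varying = list(c("Adate1", "Adate2"), c("Date1", "Date2"),
                                   c("Dose1", "Dose2"), c("Time1", "Time2")),
                    v.names = c("Adate", "Date", "Dose", "Time"),
                    timevar = "visit", idvar = "TrialNo")
    long[order(long$TrialNo, long$visit), ]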
2011 Nov 11
1
How to compute time interval?
time1 = 2008-03-09 time2 = 2010-9-10 How to compute how many years between time1 and time2? Thanks! best
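A minimal sketch: parse the values as Date and divide the day difference by 365.25 (an approximation that ignores exact leap-year placement):

    time1 <- as.Date("2008-03-09")
    time2 <- as.Date("2010-09-10")
    as.numeric(difftime(time2, time1, units = "days")) / 365.25   # roughly 2.5 years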
2009 Jan 27
2
Can I create a timeDate object using only year and week of the year values?
For a model I am working on, I have samples organized by year and week of the year. For this model, the data (year and week) comes from the basic sample data, but I require a value representing the amount of time since the sample was taken (actually, for the purpose of the model, it is sufficient to use the number of weeks from the middle of the sample week to the present). What I have found so
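One rough workaround, since the %U/%W week specifiers are generally not usable for input conversion in base R: anchor on 1 January, add whole weeks to approximate the middle of the sample week, and take a difftime to today. The year/week values below are hypothetical:

    year <- 2009; week <- 4                # hypothetical sample values
    mid_week <- as.Date(paste0(year, "-01-01")) + (week - 1) * 7 + 3
    # weeks elapsed from (roughly) the middle of that week until now
    as.numeric(difftime(Sys.Date(), mid_week, units = "days")) / 7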