Man, R has a steep learning curve (but I suppose you all know this). I have very little programming knowledge, so when I search for answers to my questions, I struggle to make sense of a lot of the pages.

I have a spreadsheet that I've read into R using read.csv (also attached). It looks like this, except there are 1600+ entries:

> Sunday
          SunDate SunTime SunScore
1   5/9/2010 0:00    0:00      127
2  6/12/2011 0:00    0:00      125
3  6/15/2008 0:04    0:04       98
4   8/3/2008 0:07    0:07      118
5  7/24/2011 0:07    0:07      122
6  5/25/2008 0:09    0:09      104
7  5/20/2012 0:11    0:11      124
8 10/18/2009 0:12    0:12      121
9  3/14/2010 0:12    0:12      117
10  1/2/2011 0:12    0:12      131

SunDate and SunTime are both factors. To change them to a class I can work with, I use the following:

Sunday$SunTime <- as.POSIXlt(Sunday$SunTime, tz = "", format = "%H:%M")
Sunday$SunDate <- as.POSIXlt(Sunday$SunDate, tz = "", format = "%m/%d/%Y %H:%M")

Now str(Sunday) yields:

'data.frame': 1644 obs. of 3 variables:
 $ SunDate : POSIXlt, format: "2010-05-09 00:00:00" "2011-06-12 00:00:00" ...
 $ SunTime : POSIXlt, format: "2012-06-18 00:00:00" "2012-06-18 00:00:00" ...
 $ SunScore: int 127 125 98 118 122 104 124 121 117 131 ...

I think everything in Sunday is now in the right form for what I want to do, but I don't know how to do it:

1. How can I get the mean score by hour? For example, I want the mean score of all the entries between 0:00 and 0:59, then 1:00 and 1:59, and so on.

2. Is it possible to create a histogram, by hour, of every score over a certain point? For example, a histogram of all scores above 140 by the hour in which they occurred.

These last few might not be possible (at least with R), but I'll ask anyway. I've got another data set similar to the one above, except it has 12,000 entries over four years. If I do the same conversions as above to turn Date and Time into POSIXlt, is it possible to do the following:

1. The data was recorded at irregular intervals; the gap between recorded points can be anywhere from 1 hour up to 7. When no data was recorded between two points, is it possible to insert the missing hours along with an estimate (average) of the score for each of those hours? This is sort of a prerequisite for the next two.

2. If an entry has a score above a certain point, is it possible to determine how long the score stayed above that point, and then the mean duration over all the times this occurred? For example:

01/01/11 01:00 AM  101
01/01/11 02:21 AM  142
01/01/11 03:36 AM  156
01/01/11 04:19 AM  130
01/01/11 05:12 AM  146
01/01/11 06:49 AM  116
01/01/11 07:09 AM  111

There are two spans where the score is above 140: the two and three o'clock hours, and the five o'clock hour. So the mean duration would be 1.5 hours. Is it possible for R to do this over a much larger time period?

3. If a score reaches a certain point, is it possible for R to determine the average time between that and when the score reaches another point? For example:

01/01/11 01:01 AM  101
01/01/11 02:21 AM  121
01/01/11 03:14 AM  134
01/01/11 04:11 AM  149
01/01/11 05:05 AM  119
01/01/11 06:14 AM  121
01/01/11 07:19 AM  127
01/01/11 08:45 AM  134
01/01/11 09:11 AM  142
01/01/11 10:10 AM  131

The score goes above 120 during the 2 AM hour and doesn't go above 140 until the 4 AM hour. Then it goes above 120 again in the 6 AM hour but doesn't go above 140 until the 9 AM hour. So the average time to go from 120 to 140 is 2.5 hours. Can R do this over a much larger time frame?
If anyone knows an easy way to do any of these (particularly the first part), I'd greatly appreciate it. If some of them are possible but aren't simple commands and would require more in-depth programming knowledge and a real time commitment, could someone at least tell me what sort of thing to look up?
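A minimal sketch of the first two questions (hourly means and a histogram of the high scores), assuming the POSIXlt conversions above worked; the 140 cut-off is the example from the question:

# Hour of day for each observation (0-23), taken from the converted SunTime
hr <- as.integer(format(Sunday$SunTime, "%H"))

# 1. Mean score for each hour of the day
tapply(Sunday$SunScore, hr, mean)

# 2. Histogram of the hours in which scores above 140 occurred
hist(hr[Sunday$SunScore > 140], breaks = 0:24, right = FALSE,
     xlab = "Hour of day", main = "Scores above 140")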
Hi and welcome to the R-help list.

It would be much better for readers to get your data in a more easily used format. There is a function called dput() that will output your data in a way that R can read easily. We don't need to see all the data, but perhaps a hundred lines of it would be nice.

Try this, where your data frame is called "mydata":

# just copy the line below and paste it into R
head(mydata, 100)

# Now copy the output and paste it into your word processor as a reply to
# the list, and we will have decent data to work with.

John Kane
Kingston ON Canada
Arrgh, yes, I did mean dput(head(mydata, 100)). Thanks for catching it.

John Kane
Kingston ON Canada

> -----Original Message-----
> From: michael.weylandt at gmail.com
> Sent: Fri, 22 Jun 2012 14:25:30 -0500
> To: jrkrideau at inbox.com
> Subject: Re: [R] Questions about doing analysis based on time
>
> On Fri, Jun 22, 2012 at 2:18 PM, John Kane <jrkrideau at inbox.com> wrote:
>> Try this where your file is called "mydata"
>> # just copy the line below and paste into R
>> head(mydata, 100)
>
> I think you mean dput(head(mydata, 100))
>
> OP: Once you put this up I'll give a fuller reply, but for now I'd suggest
> you try to put your data in a proper time series class (zoo/xts, if I
> might give a personal-ish plug), which will make all these calculations
> much easier.
>
> Best,
> Michael
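A rough sketch of the zoo route, aimed at the "fill in the unrecorded hours" and "time above a threshold" questions. It assumes SunDate carries the full date-time, with one distinct timestamp per row, and uses 140 as the example threshold; names other than zoo's own functions are made up:

library(zoo)
z <- zoo(Sunday$SunScore, as.POSIXct(Sunday$SunDate))

# Interpolate the score onto a regular hourly grid spanning the data
# (the grid starts at the first recorded observation)
grid    <- seq(start(z), end(z), by = "1 hour")
zhourly <- na.approx(z, xout = grid)

# With hourly spacing, a spell above 140 is just a run of consecutive hours
r <- rle(coredata(zhourly) > 140)
r$lengths[r$values]         # length, in hours, of each spell above 140
mean(r$lengths[r$values])   # mean spell length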
Thanks to everyone for their help so far. It's been greatly appreciated. I have a new, but similar problem: I have data that I have broken down by hour (median/mean for each hour). I would like to break it down further, by each half hour (0:00-0:29, 0:30-0:59, 1:00-1:29, 1:30-1:59, etc). I thought cut.dates() from the chron package would be able to do it, but I can't find anything in the chron package documention. I'm fairly certain that if I can figure out how to break down the data by half hour that I can do all the other analysis just fine (median/mean for each half hour, etc) Here is a small sample of the data:> dput(head(SundayData, 100))structure(list(SunDate = structure(c(1273377600, 1307851200, 1213502640, 1217736420, 1311480420, 1211688540, 1337487060, 1255839120, 1268543520, 1293945120, 1280635980, 1309061640, 1322975640, 1297574280, 1221970740, 1253420340, 1218946800, 1329024060, 1290316920, 1224994980, 1218342420, 1269750420, 1257658080, 1322371680, 1214108940, 1312086540, 1260077400, 1228023060, 1315110660, 1281241920, 1272774960, 1224995820, 1275194220, 1246768860, 1302410460, 1234071780, 1305434580, 1232257500, 1243140300, 1284871500, 1247373960, 1265521560, 1273985160, 1310273160, 1226209620, 1270356420, 1330235280, 1222577400, 1310878200, 1324187400, 1242535860, 1336279860, 1283057520, 1291528320, 1324187580, 1330840380, 1298786100, 1307854500, 1236491880, 1298786280, 1233468180, 1280034240, 1230444300, 1213506360, 1251608760, 1215320820, 1304226420, 1320556080, 1299391740, 1286687580, 1296972780, 1296972780, 1321164780, 1260684960, 1315113420, 1287292680, 1292134800, 1303017600, 1307251200, 1278825720, 1238304180, 1212902640, 1231655100, 1254029100, 1311485100, 1295159160, 1220160420, 1297578540, 1300599000, 1241933640, 1225604100, 1269149880, 1283665140, 1244958120, 1245562980, 1289716980, 1235890020, 1282456080, 1279432140, 1279432140), class = c("POSIXct", "POSIXt"), tzone = ""), SunTime = structure(c(1L, 1L, 2L, 3L, 3L, 4L, 5L, 6L, 6L, 6L, 7L, 8L, 8L, 9L, 10L, 10L, 11L, 12L, 13L, 14L, 15L, 15L, 16L, 16L, 17L, 17L, 18L, 19L, 19L, 20L, 21L, 22L, 22L, 23L, 23L, 24L, 24L, 25L, 25L, 25L, 26L, 26L, 26L, 26L, 27L, 27L, 28L, 29L, 29L, 29L, 30L, 30L, 31L, 31L, 32L, 32L, 33L, 33L, 34L, 34L, 35L, 36L, 37L, 38L, 38L, 39L, 39L, 40L, 41L, 42L, 42L, 42L, 42L, 43L, 44L, 45L, 46L, 46L, 46L, 47L, 48L, 49L, 50L, 50L, 50L, 51L, 52L, 53L, 54L, 55L, 56L, 57L, 58L, 59L, 60L, 60L, 61L, 62L, 63L, 63L), .Label = c("0:00", "0:04", "0:07", "0:09", "0:11", "0:12", "0:13", "0:14", "0:18", "0:19", "0:20", "0:21", "0:22", "0:23", "0:27", "0:28", "0:29", "0:30", "0:31", "0:32", "0:36", "0:37", "0:41", "0:43", "0:45", "0:46", "0:47", "0:48", "0:50", "0:51", "0:52", "0:53", "0:55", "0:58", "1:03", "1:04", "1:05", "1:06", "1:07", "1:08", "1:09", "1:13", "1:16", "1:17", "1:18", "1:20", "1:22", "1:23", "1:24", "1:25", "1:26", "1:27", "1:29", "1:30", "1:34", "1:35", "1:38", "1:39", "1:42", "1:43", "1:47", "1:48", "1:49", "1:52", "1:54", "1:55", "1:56", "1:57", "1:59", "10:00", "10:04", "10:07", "10:08", "10:09", "10:10", "10:11", "10:12", "10:14", "10:15", "10:16", "10:18", "10:20", "10:22", "10:23", "10:24", "10:25", "10:26", "10:27", "10:28", "10:30", "10:31", "10:32", "10:33", "10:34", "10:35", "10:36", "10:37", "10:38", "10:39", "10:40", "10:41", "10:43", "10:44", "10:45", "10:47", "10:48", "10:49", "10:50", "10:51", "10:53", "10:54", "10:55", "10:56", "10:58", "10:59", "11:01", "11:02", "11:05", "11:06", "11:07", "11:09", "11:10", "11:12", "11:14", "11:15", "11:16", "11:20", "11:21", "11:22", "11:23", 
"11:24", "11:26", "11:27", "11:29", "11:30", "11:31", "11:33", "11:34", "11:35", "11:36", "11:37", "11:38", "11:39", "11:40", "11:43", "11:44", "11:46", "11:47", "11:49", "11:52", "11:56", "11:58", "11:59", "12:00", "12:01", "12:02", "12:03", "12:04", "12:05", "12:06", "12:07", "12:08", "12:09", "12:10", "12:11", "12:13", "12:14", "12:15", "12:17", "12:19", "12:21", "12:22", "12:24", "12:25", "12:26", "12:27", "12:28", "12:30", "12:31", "12:32", "12:34", "12:36", "12:37", "12:38", "12:39", "12:41", "12:45", "12:46", "12:47", "12:48", "12:49", "12:50", "12:51", "12:53", "12:54", "12:55", "12:56", "12:57", "12:58", "12:59", "13:00", "13:01", "13:02", "13:03", "13:04", "13:05", "13:06", "13:07", "13:08", "13:09", "13:10", "13:12", "13:13", "13:14", "13:15", "13:17", "13:18", "13:19", "13:20", "13:21", "13:23", "13:25", "13:26", "13:27", "13:30", "13:31", "13:32", "13:34", "13:35", "13:36", "13:38", "13:39", "13:40", "13:44", "13:45", "13:46", "13:47", "13:48", "13:49", "13:50", "13:51", "13:52", "13:53", "13:54", "13:55", "13:57", "13:58", "13:59", "14:00", "14:01", "14:02", "14:04", "14:05", "14:06", "14:07", "14:08", "14:11", "14:12", "14:13", "14:14", "14:15", "14:16", "14:17", "14:18", "14:20", "14:21", "14:22", "14:23", "14:25", "14:26", "14:28", "14:29", "14:30", "14:31", "14:32", "14:34", "14:35", "14:36", "14:37", "14:38", "14:40", "14:41", "14:42", "14:43", "14:45", "14:46", "14:47", "14:49", "14:50", "14:51", "14:52", "14:53", "14:54", "14:56", "14:57", "14:58", "14:59", "15:01", "15:02", "15:03", "15:04", "15:05", "15:07", "15:11", "15:12", "15:13", "15:14", "15:15", "15:17", "15:18", "15:19", "15:20", "15:22", "15:23", "15:24", "15:25", "15:26", "15:28", "15:29", "15:30", "15:31", "15:33", "15:34", "15:35", "15:36", "15:37", "15:38", "15:40", "15:41", "15:42", "15:43", "15:44", "15:45", "15:46", "15:47", "15:48", "15:50", "15:51", "15:53", "15:54", "15:56", "15:57", "15:59", "16:01", "16:02", "16:04", "16:05", "16:06", "16:07", "16:09", "16:13", "16:14", "16:15", "16:16", "16:18", "16:19", "16:20", "16:21", "16:23", "16:24", "16:25", "16:26", "16:29", "16:30", "16:31", "16:32", "16:34", "16:35", "16:36", "16:37", "16:38", "16:40", "16:42", "16:43", "16:44", "16:45", "16:46", "16:47", "16:48", "16:49", "16:52", "16:55", "16:56", "16:57", "16:58", "16:59", "17:03", "17:04", "17:05", "17:06", "17:07", "17:11", "17:12", "17:13", "17:14", "17:15", "17:17", "17:18", "17:19", "17:20", "17:22", "17:24", "17:25", "17:27", "17:29", "17:32", "17:34", "17:35", "17:36", "17:37", "17:38", "17:40", "17:41", "17:45", "17:47", "17:49", "17:50", "17:52", "17:53", "17:55", "17:56", "17:57", "17:58", "17:59", "18:00", "18:01", "18:02", "18:04", "18:05", "18:06", "18:08", "18:09", "18:10", "18:11", "18:12", "18:13", "18:14", "18:16", "18:17", "18:18", "18:19", "18:20", "18:21", "18:22", "18:23", "18:24", "18:26", "18:27", "18:30", "18:31", "18:32", "18:33", "18:35", "18:36", "18:37", "18:38", "18:40", "18:41", "18:42", "18:43", "18:44", "18:49", "18:54", "18:55", "18:56", "18:58", "18:59", "19:00", "19:01", "19:02", "19:04", "19:05", "19:07", "19:08", "19:09", "19:10", "19:11", "19:12", "19:13", "19:14", "19:15", "19:16", "19:17", "19:18", "19:19", "19:20", "19:21", "19:22", "19:23", "19:24", "19:25", "19:26", "19:27", "19:28", "19:29", "19:30", "19:31", "19:32", "19:34", "19:35", "19:36", "19:37", "19:38", "19:40", "19:41", "19:42", "19:43", "19:44", "19:46", "19:47", "19:48", "19:49", "19:50", "19:51", "19:52", "19:53", "19:54", "19:55", "19:56", "19:57", "19:58", "2:01", "2:02", "2:07", "2:10", 
"2:11", "2:13", "2:14", "2:15", "2:16", "2:17", "2:18", "2:19", "2:20", "2:22", "2:23", "2:24", "2:25", "2:26", "2:28", "2:29", "2:31", "2:33", "2:34", "2:35", "2:38", "2:39", "2:42", "2:44", "2:45", "2:46", "2:51", "2:55", "2:56", "2:58", "2:59", "20:00", "20:02", "20:04", "20:07", "20:08", "20:10", "20:11", "20:12", "20:13", "20:14", "20:15", "20:17", "20:19", "20:20", "20:21", "20:22", "20:23", "20:24", "20:25", "20:27", "20:28", "20:29", "20:31", "20:32", "20:33", "20:34", "20:35", "20:38", "20:39", "20:40", "20:41", "20:42", "20:43", "20:45", "20:46", "20:49", "20:52", "20:53", "20:54", "20:55", "20:56", "20:57", "20:59", "21:00", "21:01", "21:02", "21:03", "21:04", "21:05", "21:07", "21:08", "21:09", "21:10", "21:12", "21:14", "21:15", "21:20", "21:21", "21:22", "21:23", "21:24", "21:25", "21:26", "21:27", "21:28", "21:30", "21:31", "21:32", "21:34", "21:36", "21:37", "21:38", "21:39", "21:42", "21:45", "21:46", "21:48", "21:49", "21:50", "21:51", "21:53", "21:54", "21:55", "21:56", "21:57", "21:58", "21:59", "22:01", "22:02", "22:04", "22:05", "22:06", "22:08", "22:09", "22:10", "22:11", "22:12", "22:13", "22:14", "22:16", "22:17", "22:18", "22:19", "22:21", "22:22", "22:23", "22:24", "22:25", "22:26", "22:27", "22:28", "22:29", "22:30", "22:31", "22:32", "22:34", "22:35", "22:36", "22:37", "22:39", "22:40", "22:41", "22:42", "22:45", "22:48", "22:49", "22:53", "22:54", "22:56", "22:59", "23:00", "23:01", "23:02", "23:04", "23:05", "23:06", "23:08", "23:09", "23:10", "23:11", "23:14", "23:15", "23:17", "23:18", "23:19", "23:21", "23:22", "23:26", "23:27", "23:28", "23:30", "23:32", "23:33", "23:34", "23:36", "23:37", "23:38", "23:40", "23:41", "23:42", "23:43", "23:44", "23:45", "23:46", "23:47", "23:48", "23:49", "23:50", "23:52", "23:53", "23:54", "23:55", "23:58", "23:59", "3:02", "3:04", "3:05", "3:06", "3:07", "3:10", "3:11", "3:12", "3:15", "3:19", "3:20", "3:22", "3:24", "3:25", "3:27", "3:28", "3:30", "3:31", "3:32", "3:33", "3:34", "3:35", "3:36", "3:39", "3:43", "3:44", "3:45", "3:50", "3:51", "3:52", "3:53", "3:55", "3:56", "3:57", "3:58", "3:59", "4:00", "4:01", "4:02", "4:06", "4:07", "4:09", "4:16", "4:21", "4:26", "4:28", "4:29", "4:31", "4:32", "4:36", "4:37", "4:39", "4:40", "4:44", "4:46", "4:47", "4:52", "4:53", "4:54", "4:56", "4:57", "4:59", "5:03", "5:05", "5:06", "5:08", "5:14", "5:16", "5:17", "5:18", "5:19", "5:21", "5:23", "5:27", "5:29", "5:30", "5:32", "5:34", "5:36", "5:37", "5:38", "5:39", "5:40", "5:42", "5:43", "5:45", "5:46", "5:47", "5:48", "5:50", "5:52", "5:54", "5:56", "5:59", "6:00", "6:01", "6:06", "6:07", "6:08", "6:10", "6:11", "6:14", "6:15", "6:16", "6:17", "6:19", "6:20", "6:22", "6:24", "6:25", "6:27", "6:29", "6:30", "6:31", "6:32", "6:33", "6:35", "6:39", "6:40", "6:42", "6:43", "6:44", "6:46", "6:47", "6:48", "6:49", "6:50", "6:52", "6:54", "6:56", "6:57", "6:58", "6:59", "7:00", "7:02", "7:03", "7:04", "7:06", "7:08", "7:09", "7:13", "7:18", "7:20", "7:23", "7:25", "7:26", "7:28", "7:29", "7:33", "7:34", "7:35", "7:36", "7:38", "7:39", "7:40", "7:41", "7:43", "7:44", "7:45", "7:46", "7:49", "7:50", "7:52", "7:53", "7:54", "7:55", "7:56", "7:58", "7:59", "8:01", "8:02", "8:04", "8:05", "8:08", "8:11", "8:12", "8:13", "8:16", "8:19", "8:21", "8:24", "8:25", "8:28", "8:29", "8:30", "8:31", "8:34", "8:35", "8:36", "8:38", "8:39", "8:40", "8:41", "8:43", "8:45", "8:47", "8:48", "8:49", "8:50", "8:51", "8:52", "8:53", "9:00", "9:02", "9:03", "9:04", "9:06", "9:07", "9:08", "9:10", "9:13", "9:15", "9:16", "9:19", "9:20", "9:22", "9:24", 
"9:25", "9:26", "9:28", "9:29", "9:33", "9:34", "9:35", "9:36", "9:37", "9:40", "9:41", "9:42", "9:45", "9:46", "9:47", "9:48", "9:54", "9:56", "9:58", "9:59" ), class = "factor"), SunScore = c(127L, 125L, 98L, 118L, 122L, 104L, 124L, 121L, 117L, 131L, 142L, 102L, 136L, 200L, 88L, 142L, 84L, 23L, 103L, 167L, 116L, 168L, 151L, 105L, 95L, 180L, 55L, 105L, 144L, 155L, 174L, 141L, 153L, 130L, 112L, 116L, 147L, 143L, 165L, 183L, 161L, 140L, 134L, 113L, 151L, 138L, 200L, 116L, 146L, 200L, 152L, 151L, 117L, 156L, 82L, 68L, 200L, 129L, 81L, 114L, 144L, 110L, 132L, 53L, 42L, 117L, 65L, 127L, 106L, 129L, 159L, 159L, 110L, 120L, 114L, 129L, 54L, 81L, 154L, 139L, 132L, 69L, 32L, 60L, 74L, 82L, 135L, 161L, 106L, 114L, 71L, 60L, 120L, 81L, 121L, 59L, 130L, 128L, 200L, 155L)), .Names = c("SunDate", "SunTime", "SunScore" ), row.names = c(NA, 100L), class = "data.frame") -- View this message in context: http://r.789695.n4.nabble.com/Questions-about-doing-analysis-based-on-time-tp4634230p4635982.html Sent from the R help mailing list archive at Nabble.com.
Another new question: I want to subset the data based on whether or not a data point was recorded on a holiday. The is.holiday() function from the chron package seems perfect for this. However, when I try it, the following happens (I'm also using the timeDate package):

> holidays <- ChristmasDay(2008:2011)
> holidays <- as.POSIXct(holidays, format = "%Y-%m-%d")
> subset(OverallData, is.holiday(OverallData$Date))
[1] Date  Score
<0 rows> (or 0-length row.names)

I'm not sure what I'm supposed to do to make this work.

Also, regarding the suggested half-hour function:

half.hours <- function(x){
  s <- strsplit(as.character(x), ":")
  H <- as.integer(sapply(s, `[`, 1))
  M <- as.integer(sapply(s, `[`, 2))
  h <- (H*60 + M)/60
  floor(h/0.5)*0.5
}
half.hours(dat$SunTime)

When I tried this, everything came out as NA. But I used the following code instead and it worked:

halfhours <- function(x){
  H <- hours(x)
  M <- minutes(x)
  ifelse(M < 30, H, H + 0.5)
}

Thanks again for your help!
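For the holiday subset, a base-R sketch that sidesteps chron::is.holiday(): build a vector of holiday dates and compare plain calendar dates. ChristmasDay() and the OverallData/Date names are from the post above; the remaining names (xmas, on.holiday, and so on) are made up for the example:

library(timeDate)
xmas <- as.Date(format(ChristmasDay(2008:2011)))    # "2008-12-25", ..., "2011-12-25"

on.holiday   <- as.Date(OverallData$Date) %in% xmas
holiday.data <- OverallData[on.holiday, ]            # points recorded on a holiday
other.data   <- OverallData[!on.holiday, ]           # everything else

And for the half-hour bins, one more option that avoids splitting strings is to take the hour and minute straight from a POSIXlt version of the full timestamp (a sketch using the SundayData names from the dput earlier in the thread; the hour-plus-0.5 bin labels are just a convention):

lt  <- as.POSIXlt(SundayData$SunDate)
bin <- lt$hour + ifelse(lt$min < 30, 0, 0.5)    # 0, 0.5, 1, 1.5, ..., 23.5

tapply(SundayData$SunScore, bin, mean)          # mean score per half hour
tapply(SundayData$SunScore, bin, median)        # median score per half hour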