I have hundreds of CSV files coming in from another program that have
a text field representing the date and time combined. I need to strip
the time and keep the date. How can I do that?

In the example below, on the first line I need to keep the 6/10/2009,
turning it into a date that R recognizes, but I need to throw away the
time portion completely.

> read.csv("C:\\D1\\F1-V1.csv", header=FALSE)[,c(1,7)]
                     V1   V7
1 6/10/2009 10:04:00 AM   91
2  6/15/2009 9:47:00 AM -279
3  6/15/2009 9:47:00 AM  861
4  6/22/2009 9:47:00 AM  771
5  6/22/2009 4:01:00 PM -179
6  6/24/2009 2:53:00 PM   61
7   7/2/2009 9:47:00 AM  491
8   7/6/2009 9:47:00 AM   81
9 7/13/2009 10:04:00 AM 1681

Thanks,
Mark
Mark - Here are a few possibilities:

> dts = c('6/10/2009 10:04:00 AM','6/15/2009 9:47:00 AM','6/15/2009 9:47:00 AM')
> as.Date(sapply(strsplit(dts,' '),'[',1),'%m/%d/%Y')
[1] "2009-06-10" "2009-06-15" "2009-06-15"
> as.Date(sub('(\\d+/\\d+/\\d+) .*','\\1',dts),'%m/%d/%Y')
[1] "2009-06-10" "2009-06-15" "2009-06-15"
> as.Date(sub('\\d+:\\d+:\\d+ [AP]M','',dts),'%m/%d/%Y')
[1] "2009-06-10" "2009-06-15" "2009-06-15"
> as.Date(as.POSIXct(dts,format='%m/%d/%Y %I:%M:%S %p'))
[1] "2009-06-10" "2009-06-15" "2009-06-15"

(Note %I, the 12-hour clock field, in the last example: the %p AM/PM
marker only takes effect together with %I, not %H.)

					- Phil Spector
					 Statistical Computing Facility
					 Department of Statistics
					 UC Berkeley
					 spector at stat.berkeley.edu

On Tue, 8 Feb 2011, Mark Knecht wrote:

> I have hundreds of CSV files coming in from another program that have
> a text field representing the date & time combined together. I need to
> strip the time and keep the date. How could I do that?
> [rest of original message snipped]

______________________________________________
R-help at r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
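Since the question mentions hundreds of incoming files, here is a hedged sketch (not from the original thread) of how one of the conversions above could be applied file by file and the results combined. The file layout and column positions (1 and 7) are assumptions taken from the example in the original post; the demo writes a temporary file rather than reading from the real "C:\\D1" directory.

```r
# Sketch: read each CSV, keep columns 1 and 7, convert column 1 to Date.
# Demonstrated with a temporary directory instead of the real "C:\\D1".
dir <- file.path(tempdir(), "csvdemo")
dir.create(dir, showWarnings = FALSE)
writeLines(c("6/10/2009 10:04:00 AM,,,,,,91",
             "6/22/2009 4:01:00 PM,,,,,,-179"),
           file.path(dir, "F1-V1.csv"))

read_one <- function(f) {
  d <- read.csv(f, header = FALSE)[, c(1, 7)]
  d$V1 <- as.Date(d$V1, format = "%m/%d/%Y")  # trailing time text is ignored
  d
}

files <- list.files(dir, pattern = "\\.csv$", full.names = TRUE)
combined <- do.call(rbind, lapply(files, read_one))
combined
```

With many files, `lapply()` over `list.files()` avoids writing an explicit loop, and `do.call(rbind, ...)` stacks the per-file data frames into one.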
Try as.Date() with a suitable format (it only knows about internationally
standard formats), e.g. maybe you mean

> as.Date("6/10/2009 10:04:00 AM", format="%m/%d/%Y")
[1] "2009-06-10"

On Tue, 8 Feb 2011, Mark Knecht wrote:

> I have hundreds of CSV files coming in from another program that have
> a text field representing the date & time combined together. I need to
> strip the time and keep the date. How could I do that?
> [rest of original message snipped]

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
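As a small demonstration of the point above (an editorial addition, not from the original thread): as.Date() stops reading once the format string is satisfied, so the trailing time text is simply ignored, and the conversion vectorises over a whole column at once.

```r
# as.Date() reads only as far as the format requires; the time and
# AM/PM text after the date is ignored.
dts <- c("6/10/2009 10:04:00 AM", "6/22/2009 4:01:00 PM")
as.Date(dts, format = "%m/%d/%Y")
# [1] "2009-06-10" "2009-06-22"
```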
On Tue, Feb 8, 2011 at 11:30 AM, Prof Brian Ripley
<ripley at stats.ox.ac.uk> wrote:

> Try as.Date() with a suitable format (it only knows about internationally
> standard formats), e.g. maybe you mean
>
>> as.Date("6/10/2009 10:04:00 AM", format="%m/%d/%Y")
>
> [1] "2009-06-10"

Thank you!

- Mark