Tom Kraljevic
2014-Apr-26 04:23 UTC
[Rd] Please make Pre-3.1 read.csv (type.convert) behavior available
Hi,

We at 0xdata use Java and R together, and the new behavior for read.csv has made R unable to read the output of Java's Double.toString(). This, needless to say, is disruptive for us. (Actually, it was downright shocking.)

+1 for restoring old behavior.

Thanks,
Tom
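[A minimal sketch of the symptom described above, assuming R 3.1.0's default type.convert() heuristic; the values are hypothetical examples of the sort Java's Double.toString() can emit, not taken from the original report:]

    txt <- "x\n0.30000000000000004\n0.6000000000000001"  # 17-significant-digit strings
    df  <- read.csv(text = txt)
    class(df$x)   # may be "factor" under R 3.1.0's new heuristic; "numeric" under R 3.0.x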
Duncan Murdoch
2014-Apr-26 11:28 UTC
[Rd] Please make Pre-3.1 read.csv (type.convert) behavior available
On 26/04/2014, 12:23 AM, Tom Kraljevic wrote:
> Hi,
>
> We at 0xdata use Java and R together, and the new behavior for read.csv has
> made R unable to read the output of Java's Double.toString().

It may be less convenient, but it's certainly not "unable". Use colClasses.

> This, needless to say, is disruptive for us. (Actually, it was downright shocking.)

It wouldn't have been a shock if you had tested pre-release versions. Commercial users of R should be contributing to its development, and that's a really easy way to do so.

Duncan Murdoch

> +1 for restoring old behavior.
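[For readers hitting the same issue, a minimal sketch of the colClasses workaround Duncan points to; the file name "doubles.csv" and column name "x" are hypothetical:]

    ## Explicit column classes bypass type.convert()'s guessing entirely.
    df <- read.csv("doubles.csv", colClasses = c(x = "numeric"))

    ## Or force every column to numeric:
    df <- read.csv("doubles.csv", colClasses = "numeric")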
Andrew Piskorski
2014-Apr-27 15:31 UTC
[Rd] Please make Pre-3.1 read.csv (type.convert) behavior available
On Fri, Apr 25, 2014 at 09:23:23PM -0700, Tom Kraljevic wrote:
> This, needless to say, is disruptive for us. (Actually, it was downright shocking.)

It WAS somewhat shocking. I trust the R core team to get things right, and (AFAICT) they nearly always do. This was an exception, and shocking mostly in that it was so obviously wrong to completely discard all possibility of backwards compatibility. The old type.convert() functionality worked fine and was very useful, so the *obviously* right thing to do would be to at least retain the old behavior as a (non-default) option.

Reproducing the old behavior in user R code is not simple. For anybody else stuck with this, you can do it (probably inefficiently) with the two functions below. Create your own version of read.table() that calls the dtk.type.convert() below instead of the stock type.convert(). It's not pretty, but that will do it.

dtk.type.convert <- function(xx, ..., ignore.signif.p = TRUE) {
  # Add backwards compatibility to R 3.1's "new feature":
  if (ignore.signif.p && all(dtk.can.be.numeric(xx, ignore.na.p = TRUE))) {
    if (all(is.na(xx))) type.convert(xx, ...)
    else methods::as(xx, "numeric")
  } else type.convert(xx, ...)
}

dtk.can.be.numeric <- function(xx, ignore.na.p = TRUE) {
  # Test whether a value can be converted to numeric without becoming NA.
  # AKA, can this value be usefully represented as numeric?
  # Optionally ignore NAs already present in the incoming data.
  old.warn <- options(warn = -1) ; on.exit(options(old.warn))
  aa <- !is.na(as.numeric(xx))
  if (ignore.na.p) (is.na(xx) | aa) else aa
}

-- 
Andrew Piskorski <atp at piskorski.com>
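[One way to use the two functions above without copying all of read.table(): read every column as character and run the compatibility converter afterwards. A sketch, with a hypothetical file name:]

    ## Read everything as character, then convert column by column.
    raw <- read.csv("doubles.csv", colClasses = "character")
    df  <- as.data.frame(lapply(raw, dtk.type.convert, as.is = TRUE),
                         stringsAsFactors = FALSE)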