Is the problem reading the file in, or processing it after it has been
read in?
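
One quick way to tell the two apart (a rough sketch; "mydatafile.csv"
is a placeholder for the real path):

  library(data.table)

  ## Step 1: read only. If this alone exhausts memory, the file itself
  ## is too big to hold at once and chunking/streaming is needed.
  dat <- fread("mydatafile.csv")
  print(object.size(dat), units = "GB")

  ## Step 2: convert in place. := modifies the column by reference, and
  ## as.IDate() stores dates as integers, avoiding the full-object copy
  ## that an assignment like dat$date1 <- ... can trigger.
  dat[, date1 := as.IDate(date1)]
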
Bert
On Fri, Nov 8, 2024 at 5:13 PM Jeff Newmiller via R-help <
r-help at r-project.org> wrote:
> Can you tell us what is wrong with the "chunked" package, which
> comes up when you Google "r read large file in chunks"?
>
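> A minimal sketch of that approach, assuming a CSV input and the
> chunked package's read_csv_chunkwise()/write_chunkwise() interface
> (the file names here are placeholders):
>
>   library(chunked)
>   library(dplyr)
>
>   ## Stream 1M rows at a time; only one chunk is held in memory.
>   read_csv_chunkwise("mydatafile.csv", chunk_size = 1e6) %>%
>     ## Each chunk is an ordinary data frame, so ymd() applies per chunk.
>     mutate(date1 = lubridate::ymd(date1)) %>%
>     ## Write the converted rows back out chunk by chunk.
>     write_chunkwise("mydatafile_dates.csv")
>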
> On November 8, 2024 4:58:18 PM PST, Val <valkremk at gmail.com>
> wrote:
> >Hi All,
> >
> >I am reading a data file (> 1B rows) and doing some date formatting,
> >like:
> >
> >  library(data.table)  # for fread()
> >  library(lubridate)   # for ymd()
> >  dat <- fread(mydatafile)                # mydatafile: path to the file
> >  dat$date1 <- as.Date(ymd(dat$date1))    # ymd() already returns Date
> >
> >However, I am getting an error message saying:
> >  Error: cons memory exhausted (limit reached?)
> >
> >The script was working when the number of rows was around 650M.
> >
> >Is there another way to handle a big data set in R?
> >
> >
> >Thank you.
> >
>
> --
> Sent from my phone. Please excuse my brevity.
>