And this one is only from last week. Please read the posting guide carefully.
Cheers
Joris
---------- Forwarded message ----------
From: Joris Meys <jorismeys at gmail.com>
Date: Sat, Jun 5, 2010 at 11:04 PM
Subject: Re: [R] What is the largest in memory data object you've worked with in R?
To: Nathan Stephens <nwstephens at gmail.com>
Cc: r-help <r-help at r-project.org>
You have to take some things into account:
- The maximum memory set for R might not be the maximum memory available
on the machine.
- R needs memory for more than just the dataset: matrix manipulations
frequently require double the amount of memory taken by the dataset.
- Memory allocation is important when dealing with large datasets;
there is plenty of information about that.
- R has some packages to get around memory problems with big datasets.
Read this discussion, for example:
http://tolstoy.newcastle.edu.au/R/help/05/05/4507.html
This page by Matthew Keller is a good summary too:
http://www.matthewckeller.com/html/memory.html
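To illustrate the point about matrix manipulations doubling memory use, here is a small sketch (not from the original thread; the object size is illustrative) using base R's object.size() and gc():

```r
# Sketch: inspect how much memory an object takes and what R has
# allocated overall. The matrix dimensions here are illustrative.
x <- matrix(rnorm(1e6), ncol = 100)    # 1e6 doubles, roughly 8 MB
print(object.size(x), units = "Mb")    # memory used by x itself
# Operations that copy x, e.g. scale(x) or x * 2, temporarily need
# about double that, which is where large datasets bite.
gc()                                   # report (and trigger) garbage collection
```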
Cheers
Joris
On Sat, Jun 5, 2010 at 12:32 AM, Nathan Stephens <nwstephens at gmail.com> wrote:
> For me, I've found that I can easily work with 1 GB datasets. This
> includes linear models and aggregations. Working with 5 GB becomes
> cumbersome. Anything over that, and R croaks. I'm using a dual quad-core
> Dell with 48 GB of RAM.
>
> I'm wondering if there is anyone out there running jobs in the 100 GB
> range. If so, what does your hardware look like?
>
> --Nathan
>
> [[alternative HTML version deleted]]
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
On Mon, Jun 14, 2010 at 12:07 PM, Meenakshi <meenakshichidambaram at gmail.com> wrote:
> Hi,
>
> I want to import a 1.5 GB CSV file into R, but I get the following error:
>
> 'cannot allocate vector of size 12.4'
>
> How can I read a large CSV file in R?
>
> Can anyone help me?
>
>
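A common way to handle a file like this is to read it in chunks, so that only part of it is in memory at any time. The sketch below (not from the original thread) demonstrates the idiom on a small temporary file; the chunk size and per-chunk processing are placeholders, and for a real 1.5 GB file you would also pass colClasses to read.csv so column types are not guessed:

```r
# Sketch: chunked CSV reading with base R, demonstrated on a small
# temporary file. chunk_size and the processing step are placeholders.
path <- tempfile(fileext = ".csv")
write.csv(data.frame(x = 1:1000, y = rnorm(1000)), path, row.names = FALSE)

chunk_size <- 250
con <- file(path, open = "r")
first <- read.csv(con, nrows = chunk_size)   # reads header + first chunk
col_names <- names(first)
total <- nrow(first)                         # process the first chunk here
repeat {
  chunk <- tryCatch(
    read.csv(con, header = FALSE, nrows = chunk_size, col.names = col_names),
    error = function(e) NULL)                # read.csv errors at end of file
  if (is.null(chunk) || nrow(chunk) == 0) break
  total <- total + nrow(chunk)               # process each chunk, then discard it
}
close(con)
total   # every row was seen, but at most chunk_size rows were held at once
```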
--
Joris Meys
Statistical consultant
Ghent University
Faculty of Bioscience Engineering
Department of Applied mathematics, biometrics and process control
tel : +32 9 264 59 87
Joris.Meys at Ugent.be
-------------------------------
Disclaimer : http://helpdesk.ugent.be/e-maildisclaimer.php