Hi all,

I have a 3.1 GB dataset (11 columns, with lots of data in int and string).
If I use read.table it takes very long, and it seems that my RAM is not big
enough (overloaded). I have 3.2 GB RAM and 7 GB swap, 64-bit Ubuntu.

Is there a good solution for reading large data into R? I have seen people
suggest the bigmemory and ff packages, but they seem very complicated and I
don't know how to start with them.

I have tried to use bigmemory, but I got some errors and gave up.

Can someone give me a simple example of how to use ff or bigmemory, or
maybe a better solution?

Thank you in advance,

Edwin
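A minimal sketch of the approaches the question is about, for readers
landing on this thread. The file name "data.txt", the tab separator, and
the five-integer/six-string column layout are assumptions, not details
from the post; adjust them to the actual data.

## read.table itself gets much faster when it is told the column types
## up front instead of having to guess them while it reads.
cc <- c(rep("integer", 5), rep("character", 6))  # assumed layout
x <- read.table("data.txt", header = TRUE, sep = "\t",
                colClasses = cc, comment.char = "",
                stringsAsFactors = FALSE)

## ff keeps the data in a file on disk and pages chunks into RAM as
## needed; character columns are stored as factors in an ffdf.
library(ff)
xf <- read.table.ffdf(file = "data.txt", header = TRUE, sep = "\t")

## bigmemory stores a file-backed matrix of a single type, so the
## string columns would have to be recoded as integers first.
library(bigmemory)
xb <- read.big.matrix("data.txt", header = TRUE, sep = "\t",
                      type = "integer",
                      backingfile = "data.bin",
                      descriptorfile = "data.desc")

The payoff with the file-backed versions is that later sessions can
re-attach the on-disk object instead of re-parsing 3.1 GB of text.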
Type

?memory

into R and that will explain what to do...

S

----- Original Message -----
From: "Edwin Sendjaja" <edwin7 at web.de>
To: <r-help at r-project.org>
Sent: Tuesday, January 06, 2009 11:41 AM
Subject: [R] Large Dataset

> [original question snipped; see above]
Hi Simon,

Thanks for your reply. I have read ?Memory, but I don't understand how to
use it, and I am not sure it can solve my problem. Can you give me more
detail?

Thanks,

Edwin

> [quoted thread snipped]
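For anyone else unsure what the pointer refers to, these are the relevant
base-R help topics and functions; x below stands for any object:

?Memory            # overview of how R manages memory
?"Memory-limits"   # platform-specific limits on object size
object.size(x)     # memory consumed by the single object x
gc()               # force a garbage collection and report usage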
When I do it on a Mac installation I get:

Help for the topic "memory" was not found.

Is that a Linux-specific topic? Or perhaps you meant to type:

?Memory

which does produce useful information.

--
David Winsemius

> sessionInfo()
R version 2.8.0 Patched (2008-11-14 r46932)
i386-apple-darwin9.5.0

locale:
en_US.UTF-8/en_US.UTF-8/C/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] grid      stats     graphics  grDevices utils     datasets  methods   base

other attached packages:
[1] vcd_1.2-1        colorspace_1.0-0 MASS_7.2-45      rattle_2.4.0

loaded via a namespace (and not attached):
[1] tools_2.8.0

On Jan 6, 2009, at 6:43 AM, Simon Pickett wrote:

> [quoted thread snipped]
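Help topics are case-sensitive, which is what went wrong here; a search
finds the page regardless of case:

help.search("memory")   # search installed documentation
??memory                # shorthand for the same search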
I think he meant:

?Memory

Edwin

> [quoted thread snipped]
Only a couple of weeks ago I had to deal with this. Adjust the memory
limit as follows, although you might not want 4000; that is quite high...

memory.limit(size = 4000)

Simon.

----- Original Message -----
From: "Edwin Sendjaja" <edwin7 at web.de>
To: "Simon Pickett" <simon.pickett at bto.org>
Cc: <r-help at r-project.org>
Sent: Tuesday, January 06, 2009 12:24 PM
Subject: Re: [R] Large Dataset

> [quoted thread snipped]
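A caveat worth attaching here: memory.limit() is Windows-specific, so on
the poster's 64-bit Ubuntu it is not the fix. The sketch below shows the
Windows usage; on Linux the per-process ceiling is governed by the
operating system instead (see, e.g., ulimit -v in the shell).

memory.limit()             # Windows: report the current limit in MB
memory.limit(size = 4000)  # Windows: raise the limit to about 4 GB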
Yup, it was a typo. But I always try capitals if lower case doesn't work.
Sorry.

----- Original Message -----
From: "David Winsemius" <dwinsemius at comcast.net>
To: "Simon Pickett" <simon.pickett at bto.org>
Cc: "Edwin Sendjaja" <edwin7 at web.de>; <r-help at r-project.org>
Sent: Tuesday, January 06, 2009 12:40 PM
Subject: Re: [R] Large Dataset

> [quoted thread snipped]