Displaying 4 results from an estimated 4 matches for "example_13".

2009 Jul 14
2
How to import BIG csv files with separate "map"?
Hi all, I am having problems importing a VERY large dataset into R. I have looked into the package ff, which seems to suit me, but from all the examples I have seen it either requires manual creation of the database or needs a read.table kind of step. Being survey data, the file is big (roughly 20,000 x 50,000, about 1.2 GB in plain text), and the memory I have …
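
For reference, ff can do the chunked import itself, without a hand-built database; a minimal sketch, assuming the data sit in a file "survey.csv" with a header row (the file name and chunk size are illustrative):

    ## Chunked import with ff: read.csv.ffdf() pulls the file in blocks
    ## (here 10,000 rows at a time) into an on-disk ffdf object, so the
    ## full 1.2 GB never has to fit in RAM at once.
    library(ff)
    surv <- read.csv.ffdf(file = "survey.csv", header = TRUE,
                          first.rows = 10000, next.rows = 10000)
    dim(surv)   ## rows x columns; the data stay on disk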
2010 Mar 27
7
large dataset
Hi, I have a question: I am not able to import a CSV file containing a big dataset (100,000 records). Does anyone know how many records R can handle without giving problems? What I am facing when I try to import the file is that R generates more than 100,000 records and is very slow... Thanks a lot!
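
R itself handles far more than 100,000 records if they fit in memory; the slowdown usually comes from read.csv() guessing column types. A minimal sketch, assuming a file "big.csv" with three columns (the file name, column names, and classes are illustrative):

    ## Declaring colClasses and nrows up front lets read.csv() allocate
    ## once and skip per-column type guessing, which is the usual cause
    ## of slow imports at this size.
    dat <- read.csv("big.csv", nrows = 100000, comment.char = "",
                    colClasses = c(id = "integer", value = "numeric",
                                   label = "character"))
    str(dat)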
2010 Jan 19
2
Memory usage in read.csv()
I'm sure this has gotten some attention before, but I have two CSV files generated from vmstat and free that are roughly 6-8 MB (about 80,000 lines) each. When I try to use read.csv(), R allocates all available memory (about 4.9 GB) when loading the files, which is over 300 times the size of the raw data. Here are the scripts used to generate the CSV files as well as the R code: Scripts (run …
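
A common workaround, sketched below, is to tell read.csv() the column classes so it skips the intermediate copies made while guessing types (this assumes the vmstat file is all numeric; the file name is illustrative):

    ## With colClasses fixed, read.csv() avoids the transient copies
    ## that can inflate memory use far beyond the on-disk size.
    v <- read.csv("vmstat.csv", colClasses = "numeric",
                  nrows = 80000, comment.char = "")
    print(object.size(v), units = "MB")   ## compare with the ~6-8 MB file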
2011 Sep 14
4
Reading large, non-tabular files
Dear R-help, I have a very large ASCII data file, of which I only want to read in selected lines (e.g. one fourth of the lines); which lines to read depends on the lines' content. So far, I have found two approaches for doing this in R: 1) read the file line by line using a repeat loop and save the result in a temporary file or a variable, and 2) read the entire file and filter/reshape it using …
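
Approach 1) can be written with a connection so the whole file never sits in memory at once; a minimal sketch, assuming the wanted lines start with "DATA" (the file name and pattern are illustrative):

    ## Read the file through a connection in 10,000-line blocks and keep
    ## only the lines whose content matches; memory use stays bounded by
    ## the block size plus the accumulated matches.
    con <- file("big.txt", open = "r")
    keep <- character(0)
    repeat {
      chunk <- readLines(con, n = 10000)
      if (length(chunk) == 0) break
      keep <- c(keep, chunk[grepl("^DATA", chunk)])
    }
    close(con)
    length(keep)   ## number of selected lines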