Displaying 4 results from an estimated 4 matches for "362mb".
2006 Oct 28 - ALARM!!!! Re: regarding large csv file import
...'colClasses' so that read.csv does not have to guess at the types.
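A minimal sketch of that call, assuming a file of 300 purely numeric
columns (the file name and column count are placeholders, not from the
original post):

  ## Declaring every column up front lets read.csv skip its type-guessing pass.
  dat <- read.csv("bigfile.csv", colClasses = rep("numeric", 300))

  ## Or, for a mix of column types, list the classes explicitly:
  ## dat <- read.csv("bigfile.csv",
  ##                 colClasses = c("character", rep("numeric", 299)))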
You will probably need more memory, depending on the type of data. If I
assume the data are numeric and that it takes about 6 characters to
specify each number, then you have approximately 45M numbers in the file;
since R stores each numeric value as an 8-byte double, that single object
will take up about 362MB. You should have at least 3X the size of the
largest object available for any processing, since copies will have to be
made.
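A quick back-of-the-envelope check in R itself (45M is the estimate
above; 8 bytes per double is how R stores numeric values):

  n <- 45e6           # approximate count of values in the file
  n * 8 / 1e6         # roughly 360 MB for the single object holding them
  3 * n * 8 / 1e6     # 3X working-space rule of thumb: over 1 GB of headroom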
I would suggest partitioning the file and processing it in parts. You can
also put it in a database and 'sample' the rows that you want to process....
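Two hedged sketches of those suggestions, in R. The file name, chunk
size, and table name are placeholders; the second sketch assumes the
RSQLite package, whose dbWriteTable can import a csv file directly.

  ## Sketch 1: read and process the file in fixed-size chunks over an
  ## open connection, so only one chunk is in memory at a time.
  con <- file("bigfile.csv", open = "r")
  header <- strsplit(readLines(con, n = 1), ",")[[1]]  # column names (assumes an unquoted header)
  repeat {
    chunk <- tryCatch(
      read.csv(con, header = FALSE, col.names = header,
               colClasses = "numeric", nrows = 100000),
      error = function(e) NULL)          # read.csv errors once the file is exhausted
    if (is.null(chunk)) break
    ## ... per-chunk processing goes here ...
    if (nrow(chunk) < 100000) break      # last, short chunk
  }
  close(con)

  ## Sketch 2: load the file once into SQLite, then pull a random sample
  ## of rows instead of holding everything in R at once.
  library(RSQLite)
  db <- dbConnect(SQLite(), dbname = "bigfile.sqlite")
  dbWriteTable(db, "big", "bigfile.csv", header = TRUE)
  samp <- dbGetQuery(db, "SELECT * FROM big ORDER BY RANDOM() LIMIT 100000")
  dbDisconnect(db)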