znmeb@aracnet.com
2001-Apr-30 05:14 UTC
[Rd] Rgui 1.2.3 for Windows crashes reading huge CSV file (PR#927)
Full_Name: M. Edward Borasky
Version: 1.2.3 (2001-04-26)
OS: Windows 2000
Submission from: (NULL) (206.98.121.224)

I have a Celeron with 192 MB of RAM. My paging file is 288 MB, which is the
Windows recommended size. Here's the starting screen:

--------------------------------------------------------------------------------
R : Copyright 2001, The R Development Core Team
Version 1.2.3  (2001-04-26)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type `license()' or `licence()' for distribution details.

R is a collaborative project with many contributors.
Type `contributors()' for more information.

Type `demo()' for some demos, `help()' for on-line help, or
`help.start()' for a HTML browser interface to help.
Type `q()' to quit R.

[Previously saved workspace restored]

> memory.size()
[1] 8504720
> memory.limit()
[1] 200720384
> ls()
character(0)
--------------------------------------------------------------------------------

Now I try to read a huge (60 megabytes; 688,500 rows by 10 columns) CSV file
with "read.csv". Rgui grinds on it for some time before finally crashing. The
error message is:

  The exception unknown software exception (0xc00000fd) occurred in the
  application at location 0x02672dcf.

I watched this in the Task Manager and it peaked around 160 MB. Here are the
first few lines of the input file (stock price data for all NYSE stocks for
the past year) if you want to try this with some fake data -- just add
688,499 more rows of this stuff :-)

Symbol,Date,Cvf,IsStock,Open,High,Low,Close,Volume,OpenInt
"WLP",20000428,2,1,75.25,75.25,72.5,73.75,4151,0
"WLP",20000501,2,1,74,74.5,72.120002746582,72.4300003051758,2229,0
"WLP",20000502,2,1,72.370002746582,72.370002746582,71,71.870002746582,3629,0

I haven't tried this with the Linux version yet to see whether it handles the
situation more gracefully. I'll do that sometime and attach the results to
this bug report.
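[Editor's note: for anyone trying to reproduce or work around this, here is a
hedged sketch in R. The file name "nyse.csv", the column classes, the row
count, and the chunk size are illustrative assumptions based on the sample
rows above, not from the original report; colClasses and nrows are standard
read.csv/read.table arguments in later versions of R and may not exist in
1.2.3.]

```r
## Generate a small fake file in the reported format (assumed name
## "nyse.csv"; use n ~ 688500 to approximate the real file's size).
n <- 1050
fake <- data.frame(
  Symbol  = rep("WLP", n),
  Date    = rep(20000428L, n),
  Cvf     = 2L, IsStock = 1L,
  Open    = 75.25, High = 75.25, Low = 72.5, Close = 73.75,
  Volume  = 4151L, OpenInt = 0L
)
write.csv(fake, "nyse.csv", row.names = FALSE, quote = 1)  # quote Symbol only

## Declaring the column classes and row count up front lets read.csv
## preallocate its vectors instead of growing and re-typing them, which
## lowers peak memory use on a large file.
classes <- c("character", "integer", "integer", "integer",
             "numeric", "numeric", "numeric", "numeric",
             "integer", "integer")
prices <- read.csv("nyse.csv", colClasses = classes, nrows = n)

## Alternatively, read in fixed-size chunks from an open connection,
## processing and discarding each chunk before reading the next.
con <- file("nyse.csv", "r")
header <- readLines(con, n = 1)
repeat {
  chunk <- read.csv(con, header = FALSE, colClasses = classes, nrows = 100)
  ## ... process chunk here ...
  if (nrow(chunk) < 100) break   # short chunk means end of file
}
close(con)
```

The chunked loop never holds more than one chunk in memory, so it should stay
well under the roughly 160 MB peak reported above even for the full file.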
This isn't a major problem for me; it's easy for me to break the file up into
individual stocks, or to substitute weekly data for the daily data, which
will reduce the file to 12 megabytes or so.

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-devel mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe" (in the "body", not the subject!)
To: r-devel-request@stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._