I have a web app built for reporting, with over 150,000 rows in the database and anywhere between 3 and 8 clauses run against it every time someone creates a report. Reports return anywhere from a couple hundred rows to over 100,000 if I am exporting to Excel. Between 200 and 500 reports are created a day.

I was wondering how much RAM and what hardware requirements you think such a system would need? Correct me if I am wrong, but I think the slowness is due to the amount of RAM allocated to MySQL. Apache needs to be restarted every couple of days.

Thanks for your input.

Your Friend,

John

--
John Kopanas
john-Iau1QiYlxLpBDgjK7y7TUQ@public.gmane.org
http://www.kopanas.com
http://www.cusec.net
http://www.soen.info

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups "Ruby on Rails: Talk" group.
To post to this group, send email to rubyonrails-talk-/JYPxA39Uh5TLH3MbocFFw@public.gmane.org
To unsubscribe from this group, send email to rubyonrails-talk-unsubscribe-/JYPxA39Uh5TLH3MbocFFw@public.gmane.org
For more options, visit this group at http://groups.google.com/group/rubyonrails-talk?hl=en
-~----------~----~----~----~------~----~------~--~---
> Reports return anywhere from a couple hundred rows to over 100,000
> if I am exporting it to Excel.

You need to paginate and write this in chunks. Fetching 100K rows is going to take an enormous amount of RAM. Then instantiating them as objects is going to take another elephant.

The simple way to do this is just to use offset and limit. Fetch 1K objects at a time, process, write to disk, grab the next 1K, and so forth.
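The offset/limit loop described above can be sketched in plain Ruby. This is a minimal sketch, not Rails API: the `datasource` lambda stands in for an ActiveRecord `find` call with `:limit`/`:offset` conditions, and here is stubbed with an in-memory array so the batching logic itself can be seen end to end.

```ruby
# Generic batching loop: pull rows batch_size at a time until the
# datasource returns an empty batch.
def each_batch(datasource, batch_size = 1_000)
  offset = 0
  loop do
    rows = datasource.call(batch_size, offset)
    break if rows.empty?
    yield rows
    offset += batch_size
  end
end

# Stubbed "table" of 2,500 rows standing in for the real query
# (e.g. SELECT ... LIMIT ? OFFSET ? against the reports table).
table = (1..2_500).to_a
fetch = lambda { |limit, offset| table[offset, limit] || [] }

total   = 0
batches = 0
each_batch(fetch, 1_000) do |rows|
  # In a real export you would write each batch to disk here,
  # so at most one batch of objects is live at a time.
  total   += rows.size
  batches += 1
end
```

In the real app the lambda body would be the `Report.find` call and the block body a CSV write; the point is that memory use is bounded by the batch size, not the result size.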
On 5/15/07, DHH <david.heinemeier-Re5JQEeQqe8AvxtiuMwx3w@public.gmane.org> wrote:
> The simple way to do this is just to use offset and limit.
> Fetch 1k objects at a time, process, write to disk, grab the next 1k and
> so forth.

... as documented by Jamis Buck in:
http://weblog.jamisbuck.org/2007/4/6/faking-cursors-in-activerecord

Alain Ravet
--------
http://blog.ravet.com
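The "faking cursors" variant in the linked post improves on a raw OFFSET: instead of making MySQL scan past an ever-growing offset, each batch is keyed off the last id seen (`WHERE id > ? ORDER BY id LIMIT ?`). A rough sketch, again with a stub in place of the real query (the lambda and field names are illustrative, not the post's actual code):

```ruby
# Cursor-style batching: resume each query from the last id fetched,
# so the database never has to skip rows.
def each_by_id(datasource, batch_size = 1_000)
  last_id = 0
  loop do
    batch = datasource.call(last_id, batch_size)
    break if batch.empty?
    yield batch
    last_id = batch.last[:id]
  end
end

# Stub standing in for:
#   SELECT * FROM reports WHERE id > ? ORDER BY id LIMIT ?
table = (1..2_500).map { |i| { :id => i } }
fetch = lambda do |last_id, limit|
  table.select { |r| r[:id] > last_id }.first(limit)
end

count = 0
each_by_id(fetch, 1_000) { |batch| count += batch.size }
```

For a 150K-row table the difference is modest, but on larger tables id-keyed batches keep every query cheap, whereas `OFFSET 90000` forces the server to walk 90,000 rows before returning anything.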