I am having big problems with an application handling big datasets through ActiveRecord. I need to handle about 10,000 records, and the memory usage of my Mongrel process reaches up to 700 MB. The memory is not freed when the controller terminates.

The controller just renders a big dataset to XML and outputs the data to a Flex frontend. I've tried rxml templates, the .to_xml() method, and generating the XML from an .rhtml template (not very neat, but nearly three times faster than rxml).

I really don't understand whether this is normal or I am missing something. Reading through the mailing list, I'm afraid there isn't much to do, since this is a problem with Rails (or Ruby) not releasing memory to the OS.

I can rewrite my code to paginate the dataset during data browsing, but when a report needs to be printed I need to extract the whole dataset.

Any suggestions? I'm currently running monit to kill oversized Mongrel processes, but that's not a solution. Must I go back to PHP in this situation? I really hope not.

Thanks for any suggestion.

Massimo Santoli
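For what it's worth, the pagination I have in mind looks roughly like this (an untested sketch with a made-up `Record` model, using Rails 1.x-era `find` options), so that only one batch of ActiveRecord objects is live at a time:

```ruby
# Untested sketch: extract the dataset in fixed-size batches instead of
# instantiating all 10,000 ActiveRecord objects at once.
# "Record" is a made-up model name; substitute the real one.
xml = %(<?xml version="1.0" encoding="UTF-8"?>\n<records>\n)
batch_size = 500
offset = 0
loop do
  batch = Record.find(:all, :order => 'id',
                      :limit => batch_size, :offset => offset)
  break if batch.empty?
  batch.each { |r| xml << r.to_xml(:skip_instruct => true) }
  offset += batch_size
end
xml << '</records>'
```

The full report string still has to fit in memory, of course, but the ActiveRecord overhead is bounded by the batch size instead of the table size.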
Have a look at BleakHouse, which is built for tracking down exactly this kind of memory growth:

http://blog.evanweaver.com/files/doc/fauna/bleak_house/files/README.html

On 8/24/07, msantoli <msantoli-Re5JQEeQqe8AvxtiuMwx3w@public.gmane.org> wrote:
> I am having big problems with an application handling big datasets
> through activerecord. I need to handle about 10000 records and the
> memory usage of my mongrel process reaches up to 700MB. [...]

--
Cheers!
- Pratik
http://m.onkey.org
This isn't necessarily a leak; many scripting languages only begrudgingly give internal memory back to the OS. The idea is that you'll get better performance by keeping the memory allocated, as it's likely it'll be needed again, i.e. for the next request.

A leak, IMHO, would be where the memory in use goes up with each and every request. And even then, it's not necessarily a leak *if* it's the way the language is supposed to work. :-)

--
-- Tom Mornini, Co-CEO
-- Engine Yard, Ruby on Rails Hosting
-- Support, Scalability, Reliability
-- (866) 518-YARD (9273) x201

On Aug 24, 2007, at 2:42 PM, msantoli wrote:
> I am having big problems with an application handling big datasets
> through activerecord. I need to handle about 10000 records and the
> memory usage of my mongrel process reaches up to 700MB. [...]
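A rough way to apply Tom's test: log the process's resident set size after every request and watch whether it plateaus or keeps climbing. An untested sketch, assuming a Unix-like `ps`; `log_memory_usage` is a made-up helper name:

```ruby
# Untested sketch: log the Mongrel process's resident set size (RSS) after
# each request. A curve that plateaus suggests the allocator is just holding
# on to memory; growth on every single request suggests a genuine leak.
# Assumes a Unix-like system where `ps -o rss= -p PID` prints RSS in KB.
class ApplicationController < ActionController::Base
  after_filter :log_memory_usage

  private

  # Made-up helper name; shells out to ps, so treat it as a debugging aid.
  def log_memory_usage
    rss_kb = `ps -o rss= -p #{Process.pid}`.to_i
    logger.info "RSS after #{controller_name}##{action_name}: #{rss_kb} KB"
  end
end
```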
bmunat-Re5JQEeQqe8AvxtiuMwx3w@public.gmane.org
2007-Aug-25 03:57 UTC
Re: Rails does it leak or not?
I would not try to instantiate 10,000 ActiveRecord objects. Hell, I would try to avoid instantiating 10,000 of *any* object.

If you are just taking data from a database and poking fields into the appropriate spots in an XML schema, I would try to get the HTTP socket output as close to the database as possible. As a first step, try skipping ActiveRecord and querying the db directly, using the connection that you can get from ActiveRecord. Take that result (which is an array of hashes, IIRC) and use that in your template; see the sketch below. I don't know how much that's going to save you, really. Ideally you would find some way to stream the database result into the template and out the HTTP socket in one step, but that's not easy to do in Rails...

On the other hand, whatchya whining about? I'm still keeping an eye on a fairly small Java app that has a fairly high number of steady users, and the Java process on that machine stays pretty steady at 1.2 GB of RAM. :-)
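Something like this (untested sketch; the table and column names are made up, and `select_all` returns rows as plain hashes of strings):

```ruby
# Untested sketch: skip ActiveRecord instantiation and fetch raw rows
# through the underlying connection. Table and column names are made up.
rows = ActiveRecord::Base.connection.select_all(
  'SELECT id, name, amount FROM records ORDER BY id'
)

# Build the XML from plain hashes instead of ActiveRecord objects.
# Builder ships with Rails; require it explicitly outside of Rails.
require 'builder'

xml = Builder::XmlMarkup.new(:indent => 2)
xml.instruct!
xml.records do
  rows.each do |row|
    xml.record(:id => row['id']) do
      xml.name   row['name']
      xml.amount row['amount']
    end
  end
end
xml.target!  # the finished XML string
```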
b

On Aug 24, 3:34 pm, Tom Mornini <tmorn...-W/9V78bTXriB+jHODAdFcQ@public.gmane.org> wrote:
> This isn't necessarily a leak, and many scripting languages only
> begrudgingly give internal memory back to the OS. [...]