Hi,

I'm not sure if this is a Postgres issue or not, but I'm using Postgres 8.1. I have (in my functional tests) the following code:

  upload = fixture_file_upload('/files/podcast.mp3', 'audio/mpeg')
  post :create, :product => valid_product, :media => { :image => upload }

And then in the controller:

  blob = Blob.create :data => uploaded_file.read

When /files/podcast.mp3 is 32 megabytes and I run the test, memory spikes hugely. My laptop quickly began swapping and became unusable. top showed the memory usage of the Ruby process at about 400 MB and growing.

Is there something stupid that I'm doing?

Joe
On Apr 30, 2006, at 6:04 PM, Joe Van Dyk wrote:

> Hi,
>
> I'm not sure if this is a Postgres issue or not, but I'm using
> Postgres 8.1. I have (in my functional tests) the following code:
>
>   upload = fixture_file_upload('/files/podcast.mp3', 'audio/mpeg')
>   post :create, :product => valid_product, :media => { :image => upload }
>
> And then in the controller:
>   blob = Blob.create :data => uploaded_file.read
>
> When /files/podcast.mp3 is 32 megabytes and I run the test, memory
> spikes hugely. My laptop quickly began swapping and became unusable.
> top showed the memory usage of the Ruby process at about 400 MB
> and growing.
>
> Is there something stupid that I'm doing?

Are you using a blob field type rather than the PostgreSQL large objects?

-Robby

Robby Russell
Founder & Executive Director
PLANET ARGON, LLC
Ruby on Rails Development, Consulting & Hosting
www.planetargon.com
www.robbyonrails.com
+1 503 445 2457
+1 877 55 ARGON [toll free]
+1 815 642 4968 [fax]
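For context on the distinction Robby is drawing: a bytea column carries the whole value through the client library on every read and write, while a PostgreSQL large object lives in pg_largeobject and is referenced by OID. A rough sketch of the two, using the method names of the modern pg gem rather than the 2006-era postgres driver; the table, column, and database names are invented for illustration:

  require "pg"

  conn = PG.connect(:dbname => "myapp_development")   # hypothetical database

  # bytea column: the entire file becomes one Ruby string and travels
  # through the driver as a single parameter.
  data = File.binread("/files/podcast.mp3")
  conn.exec_params("INSERT INTO blobs (data) VALUES ($1)",
                   [{ :value => data, :format => 1 }])   # format 1 = binary

  # Large object: the client streams the file into pg_largeobject and the
  # row only has to store the returned OID.
  oid = conn.lo_import("/files/podcast.mp3")
  conn.exec_params("INSERT INTO blobs (data_oid) VALUES ($1)", [oid])

Rails' :binary migration type gives you the bytea flavour, which is the route the next message confirms.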
On 4/30/06, Robby Russell <robby.lists@planetargon.com> wrote:
> On Apr 30, 2006, at 6:04 PM, Joe Van Dyk wrote:
> > [...]
>
> Are you using a blob field type rather than the PostgreSQL large objects?

Probably. Whatever using the :binary thingy in migrations gives me.

Joe
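For readers unsure what the ":binary thingy" expands to, a minimal migration sketch in the Rails 1.x style of the time follows; the model and table names are invented for illustration. Under the PostgreSQL adapter, :binary becomes a bytea column:

  class CreateBlobs < ActiveRecord::Migration
    def self.up
      create_table :blobs do |t|
        t.column :data, :binary   # "data bytea" under the PostgreSQL adapter
      end
    end

    def self.down
      drop_table :blobs
    end
  end

Because bytea values are read and written whole, and were typically escaped into the SQL text by the adapters of that era, saving a 32 MB row costs several times 32 MB of transient memory.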
On 4/30/06, Joe Van Dyk <joevandyk@gmail.com> wrote:
> On 4/30/06, Robby Russell <robby.lists@planetargon.com> wrote:
> > On Apr 30, 2006, at 6:04 PM, Joe Van Dyk wrote:
> > > [...]
> >
> > Are you using a blob field type rather than the PostgreSQL large objects?
>
> Probably. Whatever using the :binary thingy in migrations gives me.

Even a 10 megabyte file makes the Rails process consume a hell of a lot of memory. And then postmaster consumes quite a lot as well.
On Apr 30, 2006, at 6:48 PM, Joe Van Dyk wrote:

> Even a 10 megabyte file makes the Rails process consume a hell of a
> lot of memory. And then postmaster consumes quite a lot as well.

Which PostgreSQL adapter are you using? postgres or postgres-pr?

It'd be my guess that postgres-pr isn't doing its garbage collecting or something...

-Robby

Robby Russell
Founder & Executive Director
PLANET ARGON, LLC
Ruby on Rails Development, Consulting & Hosting
www.planetargon.com
www.robbyonrails.com
+1 503 445 2457
+1 877 55 ARGON [toll free]
+1 815 642 4968 [fax]
On 4/30/06, Robby Russell <robby.lists@planetargon.com> wrote:
> On Apr 30, 2006, at 6:48 PM, Joe Van Dyk wrote:
>
> > Even a 10 megabyte file makes the Rails process consume a hell of a
> > lot of memory. And then postmaster consumes quite a lot as well.
>
> Which PostgreSQL adapter are you using? postgres or postgres-pr?
>
> It'd be my guess that postgres-pr isn't doing its garbage collecting
> or something...

postgres. On Linux.
> -----Original Message-----
> From: rails-bounces@lists.rubyonrails.org
> [mailto:rails-bounces@lists.rubyonrails.org] On Behalf Of Joe Van Dyk
> Sent: Sunday, April 30, 2006 7:04 PM
> To: rails@lists.rubyonrails.org
> Subject: [Rails] large file storing in postgres sucks?
>
> Hi,
>
> I'm not sure if this is a Postgres issue or not, but I'm using
> Postgres 8.1. I have (in my functional tests) the following code:
>
>   upload = fixture_file_upload('/files/podcast.mp3', 'audio/mpeg')
>   post :create, :product => valid_product, :media => { :image => upload }
>
> And then in the controller:
>   blob = Blob.create :data => uploaded_file.read
>
> When /files/podcast.mp3 is 32 megabytes and I run the test, memory
> spikes hugely. My laptop quickly began swapping and became unusable.
> top showed the memory usage of the Ruby process at about 400 MB
> and growing.
>
> Is there something stupid that I'm doing?
>
> Joe

I don't think I would stick mp3 files directly into a database, regardless of vendor. I think I'd just let the mp3 files sit on the filesystem somewhere, and have an entry in the database that points to the appropriate file. Use controller methods for uploading and downloading.

Just my .02.

Regards,

Dan
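A rough sketch of the filesystem-plus-pointer approach Dan describes, in the Rails style of the time. The MediaFile model, its filename and content_type columns, the params keys, and the storage directory are all assumptions made for the example, not code from the thread:

  class MediaController < ApplicationController
    STORAGE_DIR = File.join(RAILS_ROOT, "storage")   # assumed location

    def upload
      uploaded = params[:media][:image]
      record   = MediaFile.create(:filename     => File.basename(uploaded.original_filename),
                                  :content_type => uploaded.content_type)

      # Copy the upload in small chunks so the whole file never has to
      # sit in a single Ruby string.
      File.open(File.join(STORAGE_DIR, record.id.to_s), "wb") do |out|
        while (chunk = uploaded.read(64 * 1024))
          out.write(chunk)
        end
      end
      redirect_to :action => "show", :id => record.id
    end

    def download
      record = MediaFile.find(params[:id])
      send_file File.join(STORAGE_DIR, record.id.to_s),
                :filename => record.filename,
                :type     => record.content_type,
                :stream   => true
    end
  end

The database row stays tiny, and send_file streams the file from disk instead of pulling it through ActiveRecord.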
On 5/1/06, Berger, Daniel <Daniel.Berger@qwest.com> wrote:
> > [...]
>
> I don't think I would stick mp3 files directly into a database,
> regardless of vendor. I think I'd just let the mp3 files sit on
> the filesystem somewhere, and have an entry in the database that
> points to the appropriate file. Use controller methods for
> uploading and downloading.

I would like to figure out a way to have everything be in the database, though. It would make doing backups pretty darned easy. But maybe the filesystem is the way to go.
On Monday 01 May 2006 02:04, Joe Van Dyk wrote:

> When /files/podcast.mp3 is 32 megabytes and I run the test, memory
> spikes hugely. My laptop quickly began swapping and became unusable.
> top showed the memory usage of the Ruby process at about 400 MB
> and growing.
>
> Is there something stupid that I'm doing?

Perhaps consider using the file_column extension instead of storing the files in the database:

http://www.kanthak.net/opensource/file_column/

Cheers,

~Dave

--
Dave Silvester
Rent-A-Monkey Website Development
http://www.rentamonkey.com/
PGP Key: http://www.rentamonkey.com/pgpkey.asc
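Going by the file_column documentation of the time, wiring it up looked roughly like the sketch below; treat the helper names as approximate, and note that the files end up under public/ on disk, with only the filename kept in the column:

  # Model: the "image" attribute becomes a file on disk.
  class Entry < ActiveRecord::Base
    file_column :image
  end

  # View (ERb): upload field, and a link back to the stored file.
  <%= file_column_field "entry", "image" %>
  <%= link_to "download", url_for_file_column("entry", "image") %>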
Dave Silvester <dave@...> writes:

> On Monday 01 May 2006 02:04, Joe Van Dyk wrote:
> > [...]
>
> Perhaps consider using the file_column extension instead of storing
> the files in the database:
>
> http://www.kanthak.net/opensource/file_column/
>
> Cheers,
>
> ~Dave

I'm also having this issue with a MySQL database. Before the upload, my Ruby process memory is at 19-22 MB. When I upload a 37 MB file, memory soon increases to as high as 500 MB and stays there.

One thing I have noticed is that manually invoking the garbage collector via GC.start right after the upload brings memory down to 100 MB.
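A minimal illustration of the workaround Larry describes, with the model and parameter names as placeholders; it reclaims memory after the save but does nothing about the peak usage during it:

  def create
    blob = Blob.create :data => params[:media][:image].read
    GC.start   # force a collection pass so the intermediate copies of the
               # upload can be reclaimed instead of lingering in the heap
    redirect_to :action => "show", :id => blob.id
  end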
Larry Diehl wrote:

> Dave Silvester <dave@...> writes:
> > [...]
>
> I'm also having this issue with a MySQL database. Before the upload, my
> Ruby process memory is at 19-22 MB. When I upload a 37 MB file, memory
> soon increases to as high as 500 MB and stays there.
>
> One thing I have noticed is that manually invoking the garbage collector
> via GC.start right after the upload brings memory down to 100 MB.

I don't seem to have received the earlier messages in this thread...

There was some discussion of ActiveRecord's memory use when handling BLOBs over a year ago, triggered by this article:

http://www.voxclandestina.com/ruby/rails/2005-05-24/improving-activerecord-part-1

in a thread "Improving ActiveRecord Part 1" started at 8:09pm on 24th May 2005.

I don't know if things have changed, but the article pointed out how Rails made multiple copies of large data items on their way to the database. That wouldn't explain why memory use continues to grow, though.

regards

Justin
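To make the "multiple copies" observation concrete, here is a simplified sketch of where the transient strings come from when saving a large value into a bytea column; it is a paraphrase of the linked article's argument, not a trace of ActiveRecord's actual call chain:

  data = uploaded_file.read        # the whole upload as one Ruby string
  blob = Blob.new(:data => data)   # the model now also holds that string
  blob.save                        # quoting/escaping the value for bytea builds
                                   # at least one more, larger string, which is
                                   # then embedded in the SQL text handed to the
                                   # driver and on to postmaster

Each of those strings is only garbage once nothing references it, which is why the process can balloon to several times the size of the file during a single save, even if the garbage collector eventually gets the memory back.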