I have a script that I am using to populate a database from a CSV file:

RAILS_ENV = 'development'
require File.dirname(__FILE__) + '/../../config/environment'

require 'csv'

# Destroy existing data
Residence.destroy_all

# IMPORT DATA FROM apartments.csv
db_file = "#{RAILS_ROOT}/db/apartments.csv"

CSV.open(db_file, "r", '|') do |row|
  r = Residence.new
  r.name   = row[0]
  r.number = row[1]
  r.street = row[2]
  r.zip    = row[3]

  r.save
end

When I run this script it correctly destroys all of the existing data, but new rows are created with increasingly large ID numbers -- i.e. the "old" IDs are not reused. Has anybody seen this problem? Perhaps I am misunderstanding the destroy_all method? I would like the new rows to begin with ID = 1.

My "Residence" model has the following line, if it is relevant:

has_many :comments, :dependent => :destroy

I would appreciate any suggestions,

Chris

--
Posted via http://www.ruby-forum.com/.
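[Editor's note: the script above uses the old 1.8-era CSV.open signature with a positional field separator. As a minimal, self-contained sketch of the same parsing step with the modern csv standard library (col_sep keyword), using inline sample data in place of db/apartments.csv -- the sample values are invented, and the column layout is assumed from the script:]

```ruby
require "csv"

# Hypothetical sample data standing in for db/apartments.csv, which the
# script reads with "|" as the field separator.
data = <<~APARTMENTS
  Sunnyside|12|Main St|90210
  Hillcrest|7|Oak Ave|10001
APARTMENTS

# Parse each row into the attribute hash the script assigns by index.
rows = CSV.parse(data, col_sep: "|").map do |name, number, street, zip|
  { name: name, number: number, street: street, zip: zip }
end

# In the real script each hash would become a Residence record, e.g.
#   rows.each { |attrs| Residence.create!(attrs) }
```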
I don't know if there's a Rails-specific answer to this or not, but that's a property of the underlying database, not Rails. In general, IDENTITY columns get auto-incremented, and id numbers will not be re-used unless the table is specifically re-seeded. Check the documentation for your DB and see how to re-seed the identity column for that table.

Hope that helps

-----Original Message-----
From: rails-bounces@lists.rubyonrails.org [mailto:rails-bounces@lists.rubyonrails.org] On Behalf Of Chris Lowis
Sent: Monday, July 31, 2006 4:28 PM
To: rails@lists.rubyonrails.org
Subject: [Rails] Problems with ever-increasing ID value
On Jul 31, 2006, at 9:27 pm, Chris Lowis wrote:

> When I run this script it correctly destroys all of the existing data,
> but new rows are created with increasingly large ID number -- ie the
> "old" IDs are not reused. Has anybody seen this problem? Perhaps I am
> misunderstanding the destroy_all method? I would like the new rows to
> begin with ID = 1.

Chris,

Your database deliberately does not re-use IDs. You didn't say which database server you are using, so I don't know how to override it, but it will be in the docs. For example, SQL Server has an IDENTITY_INSERT option, and I think in Postgres you can reset the sequence number (I've never actually needed to do that). I don't know about MySQL. In all cases you will have to run hand-rolled SQL to do this, using the execute method.

Depending on how much data you are storing, it might not be a problem. A 32-bit int can store positive values up to around 2 billion, so you would need to be doing massive bulk loads on a regular basis to run into the limit any time soon.

Ashley
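[Editor's note: Ashley's point -- that the reset mechanism is database-specific -- can be sketched as a small helper that picks the right SQL per adapter. The helper name and structure are illustrative, not thread code; the statements themselves are standard syntax for each server. In a Rails script you would pass the result to ActiveRecord::Base.connection.execute:]

```ruby
# Illustrative sketch: build the database-specific SQL that resets an
# auto-increment counter back to 1. Run the result with
# ActiveRecord::Base.connection.execute(...) in a real script.
def reset_id_counter_sql(adapter, table)
  case adapter
  when :mysql     then "ALTER TABLE #{table} AUTO_INCREMENT = 1"
  when :postgres  then "ALTER SEQUENCE #{table}_id_seq RESTART WITH 1"
  when :sqlserver then "DBCC CHECKIDENT ('#{table}', RESEED, 0)"
  else raise ArgumentError, "don't know how to reseed #{adapter}"
  end
end

reset_id_counter_sql(:mysql, "residences")
# => "ALTER TABLE residences AUTO_INCREMENT = 1"
```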
Ashley and Peter,

> Depending on how much data you are storing, it might not be a
> problem. A 32-bit int can store positive values up to around 2
> billion, so you would need to be doing massive bulk loads on a
> regular basis to run into the limit any time soon.

Thank you both for your prompt help; I understand now that this is a database issue. I am using MySQL. Perhaps I can get some inspiration from the "migrations" code. As you suggest, I won't be running out of IDs soon!

It's more of an annoyance than anything, as my URLs have /show/{some large number} in them ... perhaps something I need to fix with routing.

Kind regards,

Chris
Steven Talcott Smith
2006-Jul-31 22:40 UTC
[Rails] Re: Problems with ever-increasing ID value
Chris Lowis wrote:

> When I run this script it correctly destroys all of the existing data,
> but new rows are created with increasingly large ID number -- ie the
> "old" IDs are not reused. Has anybody seen this problem? Perhaps I am
> misunderstanding the destroy_all method? I would like the new rows to
> begin with ID = 1.
> ...
> I would appreciate any suggestions,

Chris,

I have encountered this problem during development while using migrations to modify a legacy db. I could not find an easy way to reset the sequence in MySQL, so I resorted to dropping the table, recreating it, and repopulating.

Steven
A database reusing IDs is bad bad bad, so it's not an "issue." If you're afraid of running out of IDs, use bigints. If you really want to "compact" the IDs, a painstaking export/recreate database/import is possible.

Joe
On Jul 31, 2006, at 9:42 pm, Chris Lowis wrote:

> Thank you both for your prompt help, I understand now that this is a
> database issue. I am using MySQL. Perhaps I can get some inspiration
> from the "migrations" code. As you suggest I won't be running out
> of IDs soon!

I did just try to get on the MySQL site, to satisfy my curiosity about how you reset the IDs, but the whole site is running like a dog and I gave up. Must be powered by MySQL ;o) It will explain it on there somewhere, though.

> It's more of an annoyance than anything as my URLs have /show/{some
> large number} in them ... perhaps something I need to fix with
> routing.

I wouldn't do anything complicated to work around the issue; you'll probably tie yourself up in knots. What you could do is update existing records rather than deleting them and re-inserting, but I'd only do this if they actually represent the same apartment -- I can't think of anything more confusing than a database ID or URL that randomly changes its meaning!

Or, you could add your own application-generated id column, maybe "url_id", that you increment with every row in the CSV, but I personally don't like that idea much.

Ashley
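[Editor's note: Ashley's "update rather than delete and re-insert" suggestion can be sketched in plain Ruby with a Hash standing in for the residences table. The natural key (street + number) and the upsert helper are illustrative assumptions; in Rails this would be something like find_or_initialize_by on that key, followed by assigning attributes and saving:]

```ruby
# Plain-Ruby sketch of updating existing records in place so surviving
# rows keep their ids. The Hash stands in for the residences table,
# keyed on an assumed natural key of [street, number].
table = {
  ["Main St", "12"] => { id: 1, name: "Old Name", zip: "00000" }
}

def upsert(table, name:, number:, street:, zip:)
  key = [street, number]
  if (row = table[key])
    row[:name] = name            # existing record: id is preserved
    row[:zip]  = zip
  else
    next_id = (table.values.map { |r| r[:id] }.max || 0) + 1
    table[key] = { id: next_id, name: name, zip: zip }
  end
  table[key]
end

upsert(table, name: "Sunnyside", number: "12", street: "Main St", zip: "90210")
# the row for Main St 12 is updated in place; its id stays 1
```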
On 7/31/06, Chris Lowis <chris.lowis@gmail.com> wrote:

> Thank you both for your prompt help, I understand now that this is a
> database issue. I am using MySQL. Perhaps I can get some inspiration
> from the "migrations" code. As you suggest I won't be running out of IDs
> soon!
>
> It's more of an annoyance than anything as my URLs have /show/{some
> large number} in them ... perhaps something I need to fix with routing.

execute("alter table table_name auto_increment = 1")

should do the trick. See http://dev.mysql.com/doc/refman/5.0/en/example-auto-increment.html for more information.

Mark
Thank you everyone for your suggestions. I think this is something I don't need to get too worried about at the moment; when it comes to production I will make an initial import of all the data, and will have "nice" ID numbers again!

I use migrations a lot, so I'm a little reluctant to drop tables from another script, but I will give the MySQL code a try later.

Kind regards,

Chris
Chris Lowis wrote:

> When I run this script it correctly destroys all of the existing data,
> but new rows are created with increasingly large ID number -- ie the
> "old" IDs are not reused. Has anybody seen this problem? Perhaps I am
> misunderstanding the destroy_all method? I would like the new rows to
> begin with ID = 1.

Look up the MySQL "TRUNCATE" command. It removes all data from a table and restarts the autoincrement(s) from 1.

--Al Evans
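[Editor's note: one caveat alongside Al's suggestion -- TRUNCATE bypasses ActiveRecord entirely, so the model's has_many :comments, :dependent => :destroy callback will not fire, and orphaned comments would be left behind unless both tables are cleared. A sketch, with table names assumed to be the Rails defaults, of building the statements to run via ActiveRecord::Base.connection.execute:]

```ruby
# Sketch: truncate both tables, since truncating residences alone would
# skip the :dependent => :destroy callback and orphan the comments.
# Table names ("comments", "residences") are assumed Rails defaults.
def truncate_statements(*tables)
  tables.map { |t| "TRUNCATE TABLE #{t}" }
end

truncate_statements("comments", "residences")
# => ["TRUNCATE TABLE comments", "TRUNCATE TABLE residences"]
```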