I'm running a benchmark to check database performance.
I don't quite understand the results.
The benchmarking code looks like this:
require 'rubygems'
require 'active_record'
require 'benchmark'

class DBTable < ActiveRecord::Base
  ActiveRecord::Base.establish_connection(
    :adapter  => "mysql",
    :socket   => "/var/lib/mysql/mysql.sock",
    :username => "..user..",
    :password => "..pwd..",
    :database => "..db.."
  )

  # Builds one row with random values and saves it right away.
  def self.add_random_entry
    DBTable.new do |entry|
      entry.column1 = ..some_randomly_generated_value..
      entry.column2 = ..some_randomly_generated_value..
      ..other columns..
      entry.save
    end
  end
end

Benchmark.bm do |reporter|
  reporter.report do
    1000.times { DBTable.add_random_entry }
  end
end
The results I see look like this:
user system total real
8.900000 0.383333 9.283333 (290.029662)
which is the standard output of the Benchmark module
(http://rubymanual.org/module/Benchmark).
Where does the large difference between the "total" time
and the "real" time come from?
When this code runs, my Fedora Core 5 workstation is pretty busy
(KDE doesn't respond well); still, "top" shows no swapping, with
mysqld at about 10% of processor time and my ruby process
occasionally at 5%.
I want to see what kind of response time to expect from my Rails
application once the database has grown. For now, the benchmark
results are discouraging me from using mysql.
Is there something I'm not measuring right?
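In case it helps to narrow this down, here is the variation I plan to try
next: a sketch (with rand(1000) standing in for my real random values) that
times row construction and saving separately, and also wraps all the inserts
in a single transaction, to see how much of the wall-clock time is per-row
commit overhead:

require 'benchmark'

Benchmark.bm(12) do |reporter|
  # Object construction only, no database work.
  reporter.report("build only") do
    1000.times { DBTable.new { |e| e.column1 = rand(1000); e.column2 = rand(1000) } }
  end

  # One INSERT (and, with autocommit on, one commit) per row.
  reporter.report("save each") do
    1000.times { DBTable.add_random_entry }
  end

  # The same inserts inside a single transaction, committed once at the end.
  reporter.report("one txn") do
    DBTable.transaction do
      1000.times { DBTable.add_random_entry }
    end
  end
end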
Stephan