hi
I was wondering what the difference in CPU load (on the same CPU) is
between compression levels 1 and 8 while decoding, and likewise while
encoding.
Is that difference in CPU load while decoding/encoding the reason the
different compression levels were created, or was there another reason?
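(For concreteness, here is one way the difference could be measured on a given file: a quick timing loop. This is only a sketch, assuming the `flac` command-line tool is installed; `input.wav` is a placeholder name for a test file.)

```shell
#!/bin/sh
# Sketch: time encoding and decoding at compression levels 1 and 8.
# Assumes the `flac` CLI and a placeholder test file `input.wav`.
for level in 1 8; do
    if command -v flac >/dev/null 2>&1 && [ -f input.wav ]; then
        echo "== level $level =="
        # -f overwrite output, -s silent, -<n> selects the compression level
        time flac -f -s -"$level" -o "out$level.flac" input.wav
        # decode the result back to WAV (-d) and time that too
        time flac -f -s -d "out$level.flac" -o "decoded$level.wav"
    else
        echo "would run: flac -$level -o out$level.flac input.wav (then decode it)"
    fi
done
```

Comparing the user/sys times reported for each level would give the per-level CPU cost on that machine.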
thx
--
'C makes it easy to shoot yourself in the foot; C++ makes it harder, but
when you do, it blows away your whole leg.'
--Bjarne Stroustrup, The Creator of C++
'And so at last the beast fell and the unbelievers rejoiced. But all was not
lost, for from the ash rose a great bird. The bird gazed down upon the
unbelievers and cast fire
and thunder upon them. For the beast had been reborn with its strength
renewed, and the
followers of Mammon cowered in horror.' --From The Book of Mozilla, 7:15