Linteau, Alexandre
2018-Dec-28 21:51 UTC
[opus] Compromise between bitrate, quality and latency
How does the trade-off between bitrate, latency, and quality work out? Assuming the capacity to deliver a high bitrate, is it possible to achieve exactly the same level of quality at 5 ms as at 20 ms of latency? If so, what would the resulting bitrate be, and how can I work out the relation between these three parameters?

I am currently working on an application where the lowest possible latency is essential while keeping the highest possible quality: streaming HD audio (or at least HD-audio quality), with quality close to uncompressed, at the lowest latency possible. Bitrate is not a big concern, as long as it stays under 500 kbps.