Hi,

I'm currently implementing OggSpots [1] support for VLC. The "spec" says that the granule rate is to be interpreted as in Ogg Skeleton, so it should be possible to calculate `granulepos / granulerate` to get the time. But it also says:

    The default granule rate for OggSpots is: 1/30 (30 frames per
    second resolution).

To me that doesn't make sense. If we want to specify 30 granules per second, the granule rate should be 30/1 (i.e. the inverse), so the calculation becomes:

    X granules / (30 granules / 1 s)
      == X granules * 1 s / 30 granules
      == (X / 30) s

Is there an error in my thinking, or is the spec wrong?

Problem: I have encountered files produced by an implementation that more or less uses that wrong value: the value in the OggSpots header is 1/30, but the granule positions only make sense if the rate actually used is 30 granules per second.

So the question is how to handle this situation. My proposal would be to update the spec to say the default is 30/1, and to add special-case detection to my implementation so that if it encounters 1/30 it takes it to mean 30/1. The other interpretation would mean that you could only encode one image every 30 seconds, which is not an absurd value but also not very likely.

-- 
Cheers,
Michael

[1]: https://wiki.xiph.org/OggSpots
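For concreteness, here is a minimal sketch of the workaround I have in mind. This is not VLC API code; `spots_granule_to_time_us` and its parameter names are hypothetical, and the assumption is that a header rate of exactly 1/30 was written by a broken encoder that really meant 30 granules per second:

```c
#include <stdint.h>

/* Hypothetical helper: convert an OggSpots granule position to a
 * timestamp in microseconds. The granule rate is a fraction
 * rate_num / rate_den, granules per second, as in Ogg Skeleton. */
static int64_t spots_granule_to_time_us(int64_t granulepos,
                                        uint64_t rate_num,
                                        uint64_t rate_den)
{
    /* Special case: known-broken files store the spec's literal
     * "1/30" default, but their granule positions only make sense
     * at 30 granules per second, so treat 1/30 as meaning 30/1. */
    if (rate_num == 1 && rate_den == 30) {
        rate_num = 30;
        rate_den = 1;
    }

    /* time = granulepos / (rate_num / rate_den)
     *      = granulepos * rate_den / rate_num, scaled to microseconds */
    return granulepos * 1000000 * (int64_t)rate_den / (int64_t)rate_num;
}
```

With this, a file carrying the correct 30/1 rate and a broken file carrying 1/30 both map granule 60 to the 2-second mark.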