similar to: My Apologies

Displaying 20 results from an estimated 20000 matches similar to: "My Apologies"

2014 Mar 27
0
Multistream Framing
Are multistream packets limited to one frame per stream in a given packet?
2019 Aug 01
0
Opus 1.3 different default bitrate between opus encoder and opus multistream encoder
I use the Opus multistream encoder for both mono and stereo encodings, and after updating from 1.1.3 to 1.3 I noticed the size of the produced Opus files had doubled for 1-channel encodings, whereas switching to the standard Opus encoder gave me roughly the same sizes as before. In these tests, I'm encoding an 8kHz mono stream containing only speech with the following options set: frame length
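A minimal sketch of pinning the bitrate explicitly so mono output does not depend on either encoder's defaults; the 16000 b/s figure and VOIP application are illustrative assumptions, not the settings from this thread:

    #include <opus.h>
    #include <opus_multistream.h>

    /* Create a plain encoder and a 1-stream multistream encoder for 8 kHz mono
       and set the same explicit bitrate on both, so the output size no longer
       depends on their (possibly different) defaults. */
    static int create_mono_encoders(OpusEncoder **enc, OpusMSEncoder **ms_enc)
    {
        int err;
        unsigned char mapping[1] = {0};   /* 1 channel, 1 stream, 0 coupled */

        *enc = opus_encoder_create(8000, 1, OPUS_APPLICATION_VOIP, &err);
        if (err != OPUS_OK) return err;

        *ms_enc = opus_multistream_encoder_create(8000, 1, 1, 0, mapping,
                                                  OPUS_APPLICATION_VOIP, &err);
        if (err != OPUS_OK) return err;

        opus_encoder_ctl(*enc, OPUS_SET_BITRATE(16000));
        opus_multistream_encoder_ctl(*ms_enc, OPUS_SET_BITRATE(16000));
        return OPUS_OK;
    }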
2015 Oct 14
0
How to wrap Opus data in an Ogg stream?
On Wed, Oct 14, 2015 at 4:18 AM, Daniel Armyr <daniel at armyr.se> wrote: > Hi. > I am trying to understand how to package opus-encoded data in an ogg stream to make a standards-compliant ogg/opus file. Most things are clear, but there is one thing I simply do not understand. > > What I do understand is that Ogg is based on pages, that each come with a fairly sizeable header
2015 Oct 14
2
How to wrap Opus data in an Ogg stream?
Hi. I am trying to understand how to package opus-encoded data in an ogg stream to make a standards-compliant ogg/opus file. Most things are clear, but there is one thing I simply do not understand. What I do understand is that Ogg is based on pages, that each come with a fairly sizeable header (>=27 bytes). Now, I encode with a fairly low bitrate, so my opus packets are <100 bytes long.
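The usual answer is that Ogg amortizes the >=27-byte page header by putting many small packets on one page. A minimal libogg sketch, assuming `os` was set up with ogg_stream_init() and `op` already holds one encoded Opus packet:

    #include <stdio.h>
    #include <ogg/ogg.h>

    /* Queue one Opus packet; libogg only emits a page once it is full, so many
       small packets share a single page header. Force page boundaries with
       ogg_stream_flush() only where required (e.g. after OpusHead and OpusTags). */
    static void queue_opus_packet(ogg_stream_state *os, ogg_packet *op, FILE *out)
    {
        ogg_page og;
        ogg_stream_packetin(os, op);
        while (ogg_stream_pageout(os, &og) != 0) {
            fwrite(og.header, 1, og.header_len, out);
            fwrite(og.body,   1, og.body_len,   out);
        }
    }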
2013 Oct 26
0
libopus API question - 120ms encoding
On 10/26/2013 01:11 PM, Wang, Chris wrote: > A simpler question. How does opus_encode() generate packets of 20ms > (SILK-only or Hybrid)? Concatenating two 10ms frames or doing it > straight with just one 20ms frame? Just one 20 ms frame. It always returns a single frame except when it just can't (e.g. 60 ms CELT). > From your explanations below, opus_encode() will concatenate
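For reference, the packet duration follows directly from the frame_size argument passed to opus_encode(); a minimal sketch at 48 kHz (buffer handling is assumed to be done by the caller):

    #include <opus.h>

    /* 960 samples per channel at 48 kHz is one 20 ms frame; 2880 samples is a
       60 ms packet, which the encoder packs internally rather than requiring
       the caller to concatenate anything. */
    static opus_int32 encode_60ms(OpusEncoder *enc, const opus_int16 *pcm,
                                  unsigned char *packet, opus_int32 max_len)
    {
        return opus_encode(enc, pcm, 2880 /* 60 ms at 48 kHz */, packet, max_len);
    }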
2016 Apr 26
0
Antw: [opus-tools] [PATCH] Add channel-mapping argument to force channel mapping
Hi! I haven't looked into the code yet, but the patch uses different coding conventions like "if(" and "if ("; likewise "){" and ") {". My personal taste is to have spaces after keywords, but that's just me. I'd prefer a consistent coding style. Regards, Ulrich >>> Michael Graczyk <mgraczyk at google.com> schrieb am 26.04.2016
2016 May 05
0
[PATCH] Add Functions to Create Ambisonic Multistream Encoder
Hi Michael, Is there any reason you can't just use the generic multi-stream API, i.e. opus_multistream_encoder_init() and give it the mapping you need? This is how surround was originally done (in 1.0) and only got changed when surround needed a more complex mapping and more data in the encoder. If it turns out you need this kind of thing too, then yes we would probably just want to extend
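For reference, a minimal sketch of the generic API being referred to; the 4-channel, 4-uncoupled-stream identity mapping below is only an illustrative layout, not necessarily what an ambisonic mapping family would use:

    #include <opus_multistream.h>

    /* Generic multistream encoder with an explicit, caller-chosen mapping:
       here 4 channels carried as 4 uncoupled mono streams, identity mapping. */
    static OpusMSEncoder *create_4ch_encoder(void)
    {
        int err;
        unsigned char mapping[4] = {0, 1, 2, 3};
        OpusMSEncoder *enc = opus_multistream_encoder_create(
            48000, 4,   /* sample rate, channels    */
            4, 0,       /* streams, coupled streams */
            mapping, OPUS_APPLICATION_AUDIO, &err);
        return (err == OPUS_OK) ? enc : NULL;
    }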
2013 Dec 02
0
Opus Multistream DTX questions
When encoding using the Opus multistream API, is it possible for opus_multistream_encode() to return 0? For example, what happens if multiple streams are being encoded, each with DTX enabled and all streams emit DTX packets at the same time? What about the case of a single stream with DTX enabled? Thanks Kevin O'Connor
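A hedged sketch of the setup being asked about: DTX enabled via the multistream ctl, with a defensive check on the return value. Whether a zero-length return can actually occur is exactly the open question in this thread; the threshold below is only a rough heuristic.

    #include <opus_multistream.h>

    /* Enable DTX (normally done once at setup) and treat very small packets as
       "nothing worth transmitting". The <= 2 threshold is a single-stream
       heuristic; with several streams the minimal DTX packet is a bit larger. */
    static int encode_with_dtx(OpusMSEncoder *enc, const opus_int16 *pcm,
                               int frame_size, unsigned char *out,
                               opus_int32 max_len)
    {
        int len;
        opus_multistream_encoder_ctl(enc, OPUS_SET_DTX(1));
        len = opus_multistream_encode(enc, pcm, frame_size, out, max_len);
        if (len < 0)  return len;  /* error code */
        if (len <= 2) return 0;    /* DTX-sized packet: skip transmission */
        return len;                /* normal packet */
    }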
2013 Oct 30
1
libopus API question - 120ms encoding
Thanks Jean-Marc and Benjamin for the answers. One follow-up question. If I use a repacketizer as Jean-Marc suggested by combining two 60ms frames to form a 120ms frame, without extracting individual frames and using a new TOC, I would need to have a "de-packetizer" that does the exact opposite of the repacketizer. The de-packetizer would need to separate this 120ms frame into two 60ms frames
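For what it's worth, the repacketizer API can also act as that de-packetizer: opus_repacketizer_out_range() writes out any sub-range of a packet's frames with a fresh TOC. A minimal sketch (the 1500-byte output slots are an arbitrary assumption):

    #include <opus.h>

    /* Split a multi-frame packet into one single-frame packet per frame.
       Returns the number of output packets, or -1 on failure. */
    static int split_packet(const unsigned char *packet, opus_int32 len,
                            unsigned char out[][1500], opus_int32 out_len[],
                            int max_out)
    {
        OpusRepacketizer *rp = opus_repacketizer_create();
        int i, nb_frames = -1;

        if (rp != NULL && opus_repacketizer_cat(rp, packet, len) == OPUS_OK) {
            nb_frames = opus_repacketizer_get_nb_frames(rp);
            if (nb_frames > max_out) nb_frames = -1;
            for (i = 0; i < nb_frames; i++) {
                out_len[i] = opus_repacketizer_out_range(rp, i, i + 1,
                                                         out[i], 1500);
                if (out_len[i] < 0) { nb_frames = -1; break; }
            }
        }
        if (rp != NULL) opus_repacketizer_destroy(rp);
        return nb_frames;
    }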
2013 Jul 27
1
repacketizing unrelated frames
Hi Jean-Marc, I looked at that, but importantly these streams need to remain absolutely independent. Further, they may have been encoded at some previous time. So my question stands. Thanks, Marc On Jul 26, 2013, at 9:10 PM EDT, Jean-Marc Valin wrote: > Hi Marc, > > I recommend you have a look at the multistream API and how we use it for > surround in the Ogg Opus draft. Sounds
2013 Apr 11
0
No subject
ly or Hybrid frames for 40 or 60ms packets, respectively. That is based on concatenating 20ms frames, right? Is 60ms the largest packet opus_encode() can generate? In order to get packets of up to 120 ms by combining multiple frames as described in RFC6716 clause 2.1.4 one would need to use the "repacketizer". That is, if I want to have a 120 ms packet, I would need to take
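A minimal sketch of that repacketizer step, merging two already-encoded 60 ms packets into one 120 ms packet; both inputs must share the same TOC configuration (mode, bandwidth, frame size) or the cat step fails:

    #include <opus.h>

    /* Concatenate two 60 ms packets into a single 120 ms packet. Returns the
       merged packet size, or a negative Opus error code on failure. */
    static opus_int32 merge_60ms_pair(const unsigned char *p1, opus_int32 len1,
                                      const unsigned char *p2, opus_int32 len2,
                                      unsigned char *out, opus_int32 max_len)
    {
        opus_int32 out_len = OPUS_INVALID_PACKET;
        OpusRepacketizer *rp = opus_repacketizer_create();

        if (rp != NULL &&
            opus_repacketizer_cat(rp, p1, len1) == OPUS_OK &&
            opus_repacketizer_cat(rp, p2, len2) == OPUS_OK)
        {
            out_len = opus_repacketizer_out(rp, out, max_len);
        }
        if (rp != NULL) opus_repacketizer_destroy(rp);
        return out_len;
    }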
2016 May 31
2
Patches for adding 120 ms encoding
Hi all, We (WebRTC/Google) would like to extend Opus to natively support 120 ms encoding instead of relying on repacketization as a post processing step. This is to ensure that a valid 120 ms packet is always available. I've attached a couple of patches to add this to opus_encoder(), based on the internal repacketization process carried out by 60 ms CELT. We intend to extend this later for
2017 Mar 24
2
[PATCH] Fix OPUS_ARG_NONNULL indices in opus_multistream.h
Hi all, The attached patch adds/fixes a few null argument checks in the multistream API. Do these changes make sense? Thanks, Felicia
2019 Oct 31
1
Antw: Re: Q: Bandwidth vs. bitrate
Hi! Useful advice, thanks! Actually I had been using foobar2000 to recode, because it just makes it so easy to convert multiple files while keeping the metadata (I confess, I'm a "tagger"). But it's easy to miss some encoder option when being presented with some default suggestions in a dialog form... Apart from that I always had the impression that Opus could be quite smart
2017 Nov 13
0
Gapless concatenation of Opus frames
Hi Andreas, So if I understand your question correctly, what you want is really short "files" that are independent, yet create a glitchless stream when concatenated, right? For Ogg, this can be implemented with libopusenc and chaining. It works pretty well (even for really tiny files). For WebM, I'm not sure how to handle the details at the container level, but for how to handle
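A rough sketch of the libopusenc approach mentioned here, assuming the ope_encoder_create_file()/ope_encoder_continue_new_file() API; the file names, chunk size, and stereo/48 kHz parameters are illustrative only:

    #include <stdio.h>
    #include <opus.h>
    #include <opusenc.h>

    /* One encoder instance writes a sequence of small .opus files; because the
       encoder state carries across ope_encoder_continue_new_file() calls, the
       files play back gaplessly when concatenated in order. */
    static int write_chained_files(const opus_int16 *pcm, int total_frames)
    {
        int err, i;
        const int chunk = 48000;   /* 1 second per "file", as an example */
        OggOpusComments *comments = ope_comments_create();
        OggOpusEnc *enc = ope_encoder_create_file("part000.opus", comments,
                                                  48000, 2, 0, &err);
        if (enc == NULL) return err;

        for (i = 0; i * chunk < total_frames; i++) {
            char name[32];
            int n = total_frames - i * chunk;
            if (n > chunk) n = chunk;
            if (i > 0) {
                snprintf(name, sizeof(name), "part%03d.opus", i);
                ope_encoder_continue_new_file(enc, name, comments);
            }
            ope_encoder_write(enc, pcm + (size_t)i * chunk * 2, n);
        }
        ope_encoder_drain(enc);
        ope_encoder_destroy(enc);
        ope_comments_destroy(comments);
        return 0;
    }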
2019 Dec 19
1
opusenc for ambisonics?
Unfortunately, ambisonics aren't exposed in opusenc yet, thus the trouble. They're an API-only feature, but it's a good time to discuss what such a command-line interface would look like, notably how to specify multiple streams and stream order, how to select the mapping family and coupled channels, and how to specify the matrix (for family 3). Likewise, there's no multistream support at
2016 Jun 01
2
Patches for adding 120 ms encoding
Hi Felicia, I still don't quite understand why you need to make 120 ms a special case, rather than extend the code that already handles 40 ms and 60 ms. Cheers, Jean-Marc On 06/01/2016 12:58 PM, Felicia Lim wrote: > Hi all, > > I've just realized that there's a better and simpler way of doing this > which ensures that analysis and selection of the mode/bandwidth etc
2013 Jul 27
0
repacketizing unrelated frames
Hi Marc, I recommend you have a look at the multistream API and how we use it for surround in the Ogg Opus draft. Sounds like the best way to solve your problem. Cheers, Jean-Marc On 07/26/2013 06:57 PM, Marc Lindahl wrote: > I can't quite figure this out from looking at the repacketizer code. > > Let's say I have 4 separate stereo streams (say from an 8 channel >
2015 Apr 02
1
Opus multi-stream/surround: Audio corruption on decoded content
Hello Everyone, I am using the opus 1.1 multistream APIs to encode a 5.1 surround stream on the server, stream it to the client, decode it and capture the PCM data. I noticed that there was severe corruption/attenuation on one of the channels (specifically Back/Rear Right). This would appear to be the last channel in the stream. I am attaching an image of the PCM dumps from the original and the one
2016 Apr 26
3
[opus-tools] [PATCH] Add channel-mapping argument to force channel mapping
This patch adds a new option "channel-mapping" to opusenc which sets the channel mapping family used by the multistream encoder. Please let me know whether adding this option is worthwhile and whether the help string is okay. I tried to keep it short but accurate. The error message for an unimplemented channel mapping is "Error cannot create encoder: request not implemented".