Yannick "Modah" Gouez
2013-Aug-03 07:56 UTC
[Icecast-dev] How to use http-put for JavaScript source client
Following up on this topic (sorry if this starts a new thread, but I just
joined the ml), I do not understand why it is not possible to use the
audio stream from webRTC's getUserMedia and then send it over a
websocket. It seems that the webRTC implementation can natively encode
to Ogg in stereo from any interface (according to
https://github.com/muaz-khan/WebRTC-Experiment/tree/master/RecordRTC).
Why wouldn't it be suitable? On the other hand, I assume, like Jamie,
that there is a way to send it over a PUT request. Isn't there?

y.

> Hi Jamie,
>
> The webRTC API does not sound suitable for source->server streaming
> for many reasons. For instance, the peer-to-peer connection requires
> input from both ends and seems quite unfeasible to implement in a
> server. Likewise, codecs are completely abstracted, and much more.
>
> In reality, webRTC is an API to achieve full-duplex conversations a la
> Skype, not an API for streaming.
>
> For these reasons, we at liquidsoap have been working on implementing
> a simple websocket protocol for sending source streams from a browser
> to a server. The protocol is documented and implemented here:
> https://github.com/savonet/webcast
>
> We also have a pull request on liquidsoap that implements the protocol
> and should be merged fairly soon:
> https://github.com/savonet/liquidsoap/pull/90
>
> The bottlenecks right now are the availability of the Web Audio API,
> which is only partially implemented in Firefox, and the encoding speed.
>
> Because there is no native encoding API for browser-side javascript,
> we have (temporarily?) resorted to using javascript-compiled libraries
> for mp3 encoding. However, only Firefox seems to show suitable
> performance for mp3 encoding, using the libshine build and thanks to
> its asm.js support.
> https://github.com/savonet/shine/tree/master/js
>
> On the other hand, only Chrome implements the adequate Web Audio API,
> but it is too slow to encode :-o
>
> All in all, if we keep forging ahead, it is very likely that once
> Mozilla finishes implementing the Web Audio API, we will have a
> functional browser source client using Firefox, and Chrome as well
> once its asm.js performance improves.
>
> Romain
>
> 2013/7/23 Jamie McClelland <jm at mayfirst.org>:
>> I'm following up on a thread started by Stephen a couple months ago about
>> building a JavaScript source client using webrtc.
>>
>> The first step suggested was to figure out how to mux the audio and video.
>> After I posted a feature request on the webrtc experiment js library, we
>> seem to have a solution:
>> https://github.com/muaz-khan/WebRTC-Experiment/issues/28#issuecomment-20791759
>>
>> Based on the last comment on the icecast dev list, we now only need to do
>> an HTTP PUT request to the icecast server
>> (http://lists.xiph.org/pipermail/icecast-dev/2013-May/002171.html).
>>
>> Great! I've got my jquery ready, but am having trouble finding docs on how
>> to build the PUT request. I tried looking at the libshout source, but my C
>> skills aren't quite good enough to figure it out.
>>
>> Any help would be appreciated, particularly with an example.
>>
>> Thanks for all your work on icecast - we use it a lot here at May
>> First/People Link.
>>
>> Jamie
>>
>> _______________________________________________
>> Icecast-dev mailing list
>> Icecast-dev at xiph.org
>> http://lists.xiph.org/mailman/listinfo/icecast-dev
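[Editor's note: the thread asks for an example of the PUT request but none is given. The sketch below is not from the thread; it assumes Icecast's usual source-client conventions (HTTP Basic auth as the fixed user "source" with the server's source password, the mountpoint as the request path, the stream's MIME type as Content-Type, optional Ice-* metadata headers). The helper name and all parameter values are invented for illustration.]

```javascript
// Hypothetical helper: build the URL and headers for an Icecast
// source connection over HTTP PUT. Icecast authenticates source
// clients with HTTP Basic auth, user name "source" plus the
// server's source password.
function buildIcecastPut(host, port, mount, password, contentType) {
  // In a browser you would use btoa(); Buffer works in Node.
  const token = Buffer.from(`source:${password}`).toString('base64');
  return {
    url: `http://${host}:${port}${mount}`,
    headers: {
      'Authorization': `Basic ${token}`,
      'Content-Type': contentType,  // MIME type of the stream, e.g. application/ogg
      'Expect': '100-continue',     // let the server refuse us before we start streaming
      'Ice-Name': 'My test stream', // optional stream metadata
    },
  };
}

// Example: mountpoint /test.ogg with a made-up password.
const req = buildIcecastPut('localhost', 8000, '/test.ogg', 'hackme', 'application/ogg');
```

Note that the headers are the easy part: the real obstacle for a browser source client is the body, since XMLHttpRequest/jQuery buffer the entire request body before sending, so an open-ended live stream cannot be PUT from browser JavaScript. That limitation is one motivation for the websocket approach described above.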
"Thomas B. Rücker"
2013-Aug-03 08:08 UTC
[Icecast-dev] How to use http-put for JavaScript source client
Hi Yannick,

On 08/03/2013 07:56 AM, Yannick "Modah" Gouez wrote:
> Following up on this topic ( sorry if this starts a new thread but I
> just joined the ml ),

No problem, but _please_ do not post HTML to mailing lists, thanks.

> I do not understand why it is not possible to use the audio stream from
> webRTC's getUserMedia and then send it over a websocket ?

That seems to refer to Romain's statement. I can't comment on that, as I
don't follow his view.

> It seems that the webRTC implementation can natively encode in ogg
> format in stereo from any interface ( according to
> https://github.com/muaz-khan/WebRTC-Experiment/tree/master/RecordRTC ).
> Why wouldn't it be suitable ?

Please note that there is an important difference between *container*
and /codec/. Ogg is the container. Opus and Vorbis are codecs commonly
found in Ogg containers. Just to make sure this is clear.
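[Editor's note: a toy sketch, not from the thread, illustrating the container/codec distinction in code. An Ogg stream always begins with the page magic "OggS", but the codec is only revealed by the first packet inside the container: "OpusHead" for Opus, 0x01 followed by "vorbis" for Vorbis. The function name is invented, and the fixed offset 28 assumes the common case of a first page carrying a single segment.]

```javascript
// Toy codec sniffer for an Ogg byte stream (a Uint8Array).
// Returns null if the bytes are not an Ogg container at all,
// otherwise the name of the codec found in the first packet.
function sniffOggCodec(bytes) {
  const ascii = (start, text) =>
    [...text].every((ch, i) => bytes[start + i] === ch.charCodeAt(0));
  if (!ascii(0, 'OggS')) return null; // not an Ogg container
  // The 27-byte page header is followed by the segment table;
  // with one segment the first packet starts at byte 28.
  if (ascii(28, 'OpusHead')) return 'opus';
  if (bytes[28] === 1 && ascii(29, 'vorbis')) return 'vorbis';
  return 'unknown';
}
```

So two streams can both be "Ogg" while carrying entirely different codecs, which is why "can encode in ogg format" says nothing yet about which codec an Icecast mountpoint would actually receive.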