Displaying 20 results from an estimated 3000 matches similar to: "How to use http-put for JavaScript source client"
2013 Jul 24
0
How to use http-put for JavaScript source client
Hi Jamie,
The WebRTC API does not sound suitable for source->server streaming
for many reasons. For instance, the peer-to-peer connection requires
input from both ends and seems quite unfeasible to implement in a
server. Likewise, codecs are completely abstracted away, and much more.
In reality, WebRTC is an API to achieve full-duplex conversations à la
Skype, not for streaming.
For these
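A minimal sketch of the "input from both ends" point, in browser JavaScript: the offer created on the sending side does nothing until some remote RTCPeerConnection produces an answer, and the signaling path that carries those messages is exactly what Icecast does not provide. The signaling helper names below are hypothetical.

  // Sketch only; the signaling transport and the answering peer are assumptions.
  async function startPeer() {
    const pc = new RTCPeerConnection();
    const media = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
    media.getTracks().forEach(track => pc.addTrack(track, media));
    await pc.setLocalDescription(await pc.createOffer());
    // Nothing flows yet: some signaling path must carry pc.localDescription out
    // and bring back an answer created by a second RTCPeerConnection, e.g.:
    //   signalingChannel.send(JSON.stringify(pc.localDescription)); // hypothetical
    //   await pc.setRemoteDescription(answerFromRemotePeer);        // hypothetical
  }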
2013 Apr 11
0
No subject
involved. This is easy enough to verify: dump to a file as described on
that page and feed the resulting file to 'ogginfo'.
If it is indeed properly muxed into an Ogg container, then just
forwarding it via HTTP PUT instead of writing it to a file should do the
job.
I'd suggest someone just try it. If you need an Icecast server to test
against I can also provide that. Ping me on IRC
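A rough sketch of that "dump to a file" check in browser JavaScript, assuming the browser can emit Ogg/Opus from MediaRecorder at all (Firefox generally can; check MediaRecorder.isTypeSupported first). The downloaded file can then be fed to ogginfo as described above.

  // Record ~10 seconds of microphone audio and save it for 'ogginfo capture.ogg'.
  async function dumpOggForOgginfo() {
    const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
    const rec = new MediaRecorder(stream, { mimeType: 'audio/ogg;codecs=opus' });
    const chunks = [];
    rec.ondataavailable = e => chunks.push(e.data);
    rec.onstop = () => {
      const a = document.createElement('a');
      a.href = URL.createObjectURL(new Blob(chunks, { type: 'audio/ogg' }));
      a.download = 'capture.ogg';
      a.click();
    };
    rec.start(1000);                      // emit a chunk every second
    setTimeout(() => rec.stop(), 10000);  // record ~10 seconds
  }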
2013 Jul 23
5
How to use http-put for JavaScript source client
I'm following up on a thread started by Stephen a couple months ago about building a JavaScript source client using webrtc.
The first step suggested was to figure out how to mux the audio and video. After I posted a feature request on the webrtc experiment js library, we seem to have a solution: https://github.com/muaz-khan/WebRTC-Experiment/issues/28#issuecomment-20791759
Based on the last
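For reference, one way to get a single muxed audio+video bitstream out of the browser is MediaRecorder over a combined getUserMedia stream; this is only a sketch and not necessarily the approach settled on in the linked issue.

  // Record one muxed WebM (VP8 + Opus) blob from mic + camera.
  async function recordMuxedWebm(ms) {
    const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
    const rec = new MediaRecorder(stream, { mimeType: 'video/webm;codecs=vp8,opus' });
    const chunks = [];
    rec.ondataavailable = e => { if (e.data.size > 0) chunks.push(e.data); };
    const stopped = new Promise(resolve => { rec.onstop = resolve; });
    rec.start(1000);                      // hand back a chunk roughly every second
    setTimeout(() => rec.stop(), ms);
    return stopped.then(() => new Blob(chunks, { type: 'video/webm' })); // one muxed A+V file
  }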
2013 Jun 16
2
Javascript source client
Hey all,
So we have been advised in this thread
https://github.com/muaz-khan/WebRTC-Experiment/issues/28#issuecomment-18385702
not to use HTTP PUT as it is not real-time; instead they are
suggesting the use of SDP. Is that something that Icecast supports? Or
does anyone have other ideas on this?
~stephen
On Sun 12 May 2013 01:51:31 AM CDT, Thomas Ruecker wrote:
> Hi,
>
> On 11
2013 Jul 31
0
How to use http-put for JavaScript source client
Hi,
On 07/23/2013 07:44 PM, Jamie McClelland wrote:
> I'm following up on a thread started by Stephen a couple months ago
> about building a JavaScript source client using webrtc.
>
> The first step suggested was to figure out how to mux the audio and
> video. After I posted a feature request on the webrtc experiment js
> library, we seem to have a solution:
>
2013 Jul 25
1
How to use http-put for JavaScript source client
Thanks so much for the comprehensive response, Romain. It sounds like you
are working hard on getting this working with audio streams, and it's a lot
more complicated than I had originally hoped.
I'm still curious about the proper syntax of the HTTP PUT command, though
(and I'm sure others would be interested as well). Is there any
documentation on the parameters to send?
I also had a
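For reference, a hedged sketch of what the PUT request itself tends to look like against Icecast 2.4.x: HTTP basic auth with the 'source' user, the mount point as the request path, and a Content-Type matching the stream. The server name, mount and password below are placeholders, and from a browser the request is also subject to CORS, so testing with a command-line client first may be easier.

  // Push a recorded Blob to a placeholder Icecast mount via HTTP PUT.
  async function putToIcecast(blob) {
    await fetch('http://icecast.example.org:8000/webrtc.webm', {
      method: 'PUT',
      headers: {
        'Authorization': 'Basic ' + btoa('source:hackme'), // source user + password
        'Content-Type': 'video/webm',                      // must match what is sent
        'Ice-Name': 'WebRTC test stream',                  // optional stream metadata
        'Ice-Public': '0'
      },
      body: blob  // a finished Blob; true live streaming needs a streamed request body
    });
  }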
2016 Feb 18
2
Asterisk 13 and WebRTC. Is wiki page still valid ?
Hello,
I'm trying to have my first calls with WebRTC.
My server has asterisk 13.7.0.
I'm following the instructions from the wiki [1].
So I'm using the [2] live demo from a Chrome browser (v48) on a Debian Jessie
station.
Whenever I type something like ws://123.123.123.123:8088/ws in the Expert Mode
form (see [1]), I'm getting this error:
*2:SecurityError: Failed to construct
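The truncated message looks like the browser refusing to construct the WebSocket at all. A plausible cause (an assumption, since the error is cut off) is opening a plain ws:// socket from a page served over https://, which browsers block; pointing the client at Asterisk's TLS websocket port with wss:// avoids that. Sketch with a placeholder address:

  // Assumes the demo page is served over https and 8089 is the TLS port from http.conf.
  const ws = new WebSocket('wss://pbx.example.org:8089/ws', 'sip');
  ws.onopen  = () => console.log('websocket transport up');
  ws.onerror = e  => console.log('websocket error', e);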
2013 Jun 17
0
Javascript source client
Hi Stephen,
> So we have been advised in this thread
> https://github.com/muaz-khan/WebRTC-Experiment/issues/28#issuecomment-18385702
> not to use HTTP PUT as it is not real-time; instead they are
> suggesting the use of SDP. Is that something that Icecast supports? Or
> does anyone have other ideas on this?
The imminent Airtime 2.4.0 release has support for Opus, and it
2016 Feb 18
2
Asterisk 13 and WebRTC. Is wiki page still valid ?
Thank you very much for your reply.
2016-02-18 13:30 GMT+01:00 Simon Hohberg <simon.hohberg at mcs-datalabs.com>:
> Hi Olivier,
>
> On 02/18/2016 12:10 PM, Olivier wrote:
>
> Hello,
>
> I'm trying to have my first calls with WebRTC.
> My server has asterisk 13.7.0.
>
> I'm following the instructions from the wiki [1].
> So I'm using [2] live demo from
2013 May 11
2
Javascript source client
Thomas,
Thank you for your interest in this; your description is accurate as far as I
can see.
> From my perspective your challenges will be to get the containers right.
> WebM for audio+video
> Ogg for audio
>
> Also (I'm not that familiar with WebRTC) you might need to re-encode
> to Opus and VP8 in some cases?
here is the great news
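On the container question, the browser itself can report which of those combinations it can produce without re-encoding; MediaRecorder.isTypeSupported is the standard check, and the strings below are just the combinations mentioned above.

  // Which of the suggested container/codec combinations can this browser emit?
  ['video/webm;codecs=vp8,opus',  // WebM for audio+video
   'audio/webm;codecs=opus',      // WebM, audio only
   'audio/ogg;codecs=opus'        // Ogg for audio (typically Firefox)
  ].forEach(t => console.log(t, MediaRecorder.isTypeSupported(t)));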
2014 Feb 03
0
Relay/forward RTP-packets over icecast2
> What machine are you running (namely what OS)?
Debian.
> I don't understand your approach.
> Why run a 'streamer' behind a NAT?
> Not enough 'resources' to rent / rent-to-buy a dedicated server?
> I mean, you can't expect to satisfy a lot of listeners this way. :-)
I am following Muaz Khan's indications XD:
>
> Hi Muaz Khan,
>> We are adtlantida.tv and
2013 May 11
2
Javascript source client
Hey everyone,
I am new to the dev list here, but my question is specifically about any
development towards WebRTC integration.
Let me explain: a couple of colleagues and I are currently working on our
WebRTC build at the live.mayfirst.org site using nodejs. We are currently
looking into whether there is any development of a JavaScript source
client. This would be used to send the WebRTC room to a public
2006 Sep 17
0
Liquidsoap 0.3.0
Hi list,
The Savonet team is proud to announce a new release of its
programmable audio stream generator, liquidsoap 0.3.0. Liquidsoap is a
simple ruby-like script language allowing one to build audio stream
sources from various elementary sources, source combinators and audio
outputs. It is mainly intended to be used as an icecast client for
internet radios through the use of the shout output
2015 Sep 15
3
Asterisk 13 WebRTC Status report
Hi,
I've been fighting with WebRTC for 14 days, so I'm reporting my experience
to minimize the number of crazy Asterisk users.
I have working WebRTC with sipml5 + Asterisk 13 + pjproject 2.4.5 +
chan_pjsip + secure websockets + secure audio + audio in both directions.
Problems:
First, I needed to run chan_sip for old hard phones and WSS with chan_pjsip
only for WebRTC. This is possible with a patch from
2018 Dec 07
2
Question on WebRTC configuration
In the Asterisk wiki instructions for Configuring Asterisk for WebRTC Clients...
https://wiki.asterisk.org/wiki/display/AST/Configuring+Asterisk+for+WebRTC+Clients
"To communicate with websocket clients, Asterisk uses its built-in HTTP daemon. Configure /etc/asterisk/http.conf as follows:
[general]
enabled=yes
bindaddr=0.0.0.0
bindport=8088
tlsenable=yes
tlsbindaddr=0.0.0.0:8089
2018 Sep 11
2
Can someone provide some insight on WebRTC vs a generic SIP library in a browser?
I work on the Asterisk side of things and admit to not knowing about browser development.
A co-worker asked me today why they should develop web-based agent software using WebRTC. They prefer to develop using a SIP-based JavaScript library they found.
Can anyone offer some insight on why to choose either WebRTC or a SIP library for web-based agent software connecting up to an Asterisk
2008 Sep 07
1
schedule a fallback and / 2 sources streaming at 1 server ???
Hi,
On Sun, Sep 7, 2008 at 12:30 AM, Dick Trump <dtrump1 at triadav.com> wrote:
> kosnickx wrote:
>> But there is one thing: while falling back from one mount point to
>> another it takes about 10 to 15 seconds. Does this happen to you too? Is
>> there any way to reduce this to a minimum, or to 0 if possible.
I've been wanting to advise you to give liquidsoap a try
2023 Jun 24
1
Why is WebRTC treated differently from regular SIP in Asterisk
I'm learning about WebRTC clients, and am wondering why Asterisk treats them
differently from any other SIP client.
The media (RTP) should be no different, so the only difference should be on
the signaling side. I noticed that the Asterisk wiki mentions the need for
res_pjsip_transport_websocket, so does that mean Asterisk requires the
signaling to occur over a websocket?
If I used
2014 Jul 02
1
Webrtc Not acceptable here
Hi,
I am getting
*Can't provide secure audio requested in SDP offer*
with a sipml5 client hosted on my local system
[1060] ; This will be WebRTC client
type=friend
username=1060 ; The Auth user for SIP.js
host=dynamic ; Allows any host to register
secret=sameer ; The SIP Password for SIP.js
encryption=yes ; Tell Asterisk to use encryption for this peer
avpf=yes ; Tell Asterisk to use AVPF
2005 Oct 02
0
Shout.delay
Hi list,
I'm the main developer of liquidsoap, the streamer from the Savonet
project (http://savonet.sf.net). This streamer mainly uses shout2 as
an output plugin, but also alsa, raw data over RTP, etc. And many
outputs can be used in the same instance of liquidsoap. For that
reason, we don't directly use sync and delay from libshout, but have a
centralized scheduler, which should avoid