WebRTC Technical Overview and Introduction
0 WebRTC Tutorial 28 Nov 2012
November 27-29, 2012 South San Francisco Conference Center
WebRTC Technical Overview and Introduction
Alan Johnston, Distinguished Engineer, Avaya
Dan Burnett, Director of Standards, Voxeo Labs
November 28, 2012
1
WebRTC Tutorial Topics
• What is WebRTC?
• How to Use WebRTC – Peer Connection
• WebRTC Peer-to-Peer Media
• WebRTC Protocols and IETF Standards
• WebRTC W3C API Overview
• Pseudo Code Walkthrough
• What’s Next?
2 WebRTC Tutorial 28 Nov 2012
Announcement!
• New book on WebRTC! – http://webrtcbook.com
• Available on Amazon as paperback and Kindle eBook
• Also iBooks, B&N Nook, etc.
3 WebRTC Tutorial 28 Nov 2012
What is WebRTC?
4 WebRTC Tutorial 28 Nov 2012
The Browser RTC Function
• New Real-Time Communication (RTC) function built into browsers
• Contains
– Audio and video codecs
– Ability to negotiate peer-to-peer connections
– Echo cancellation, packet loss concealment
• In Chrome today, Mozilla soon, Internet Explorer and Safari eventually
[Diagram: a Web Browser (JavaScript/HTML/CSS over RTC APIs and Other APIs, with the Browser RTC Function and Native OS Services underneath) talks to the Web Server over HTTP or WebSockets (signaling), and to a peer over on-the-wire protocols (media or data).]
5 WebRTC Tutorial 28 Nov 2012
What’s New
6 WebRTC Tutorial 28 Nov 2012
What’s New, Continued
7 WebRTC Tutorial 28 Nov 2012
WebRTC Support of Multiple Media
• Multiple sources of audio and video are assumed and supported
• All RTP media (voice and video) and RTCP feedback messages are multiplexed over the same UDP port and address
[Diagram: Browser M on Mobile sends Microphone Audio, Front Camera Video, Rear Camera Video, and Application Sharing Video to Browser L on Laptop; Browser L sends back WebCam Video and Stereo Audio.]
8 WebRTC Tutorial 28 Nov 2012
WebRTC Triangle
• Both browsers run the same web application from the web server
• A Peer Connection is established between them with the help of the web server
[Diagram: the Web Server (Application) signals with Browser M and Browser L, each running the HTML5 application from the web server; a Peer Connection (audio, video, and/or data) runs directly between the browsers.]
9 WebRTC Tutorial 28 Nov 2012
WebRTC Trapezoid
• Similar to the SIP Trapezoid
• Web servers communicate using SIP or Jingle
• Unclear how this really works on the web
[Diagram: Web Server A (Application A) and Web Server B (Application B) interconnect via SIP or Jingle; Browser M runs the HTML5 application from Web Server A and Browser T runs it from Web Server B, with a Peer Connection (audio and/or video) between the browsers.]
10 WebRTC Tutorial 28 Nov 2012
WebRTC and SIP
• Peer Connection appears as a standard RTP media session, described by SDP
• The SIP endpoint must support the RTCWEB media extensions (ICE NAT traversal, Secure RTP, etc.)
• Also draft-ietf-sipcore-sip-websocket, which defines SIP transport over WebSockets
[Diagram: Browser M signals with the Web Server, which connects over SIP to a SIP Server and on to a SIP Client; a Peer Connection (audio and/or video) runs between Browser M and the SIP Client.]
11 WebRTC Tutorial 28 Nov 2012
WebRTC and PSTN
• Peer Connection terminates on a PSTN gateway
• Audio only
[Diagram: Browser M signals with the Web Server; a Peer Connection (audio) runs from Browser M to a PSTN Gateway, which connects to a phone.]
12 WebRTC Tutorial 28 Nov 2012
How to Use WebRTC
13 WebRTC Tutorial 28 Nov 2012
How to use WebRTC
[Flow diagram: Obtain Local Media → Setup Peer Connection → Attach Media or Data (loop to add more media) → once all media are added, the Peer Connection is established → more media or data can still be attached → Close Connection when either browser closes it.]
WebRTC Tutorial 28 Nov 2012 14
How to use WebRTC
• getUserMedia()
– Audio and/or video
– Constraints
– User permissions (see the sketch below)
• LocalMediaStream
• MediaStream
– Local or derived
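As a concrete illustration of the bullets above, here is a minimal hedged sketch of the callback-style navigator.getUserMedia() of this era (some browsers shipped it prefixed, e.g. webkitGetUserMedia); the "selfView" element id is an assumption for illustration:

  // Request microphone + camera; the browser prompts the user for permission.
  navigator.getUserMedia(
    {"audio": true, "video": true},        // constraints
    function (stream) {                    // success: permission granted
      var selfView = document.getElementById("selfView"); // assumed <video> element
      selfView.src = URL.createObjectURL(stream);
      selfView.play();
    },
    function (error) {                     // failure: denied, or no device
      console.log("getUserMedia error: " + error);
    });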
WebRTC Tutorial 28 Nov 2012 15
How to use WebRTC
• RTCPeerConnection
– Direct media
– Between two peers
– ICE processing
– SDP processing
– Identity verification
– Statistics reporting
WebRTC Tutorial 28 Nov 2012 16
How to use WebRTC
• addStream()
– Doesn’t change media state!
• removeStream()
– Ditto!
• createOffer(), createAnswer()
• setLocalDescription(), setRemoteDescription()
• Applying SDP answer makes the magic happen
• createDataChannel()
WebRTC Tutorial 28 Nov 2012 17
How to use WebRTC
WebRTC Tutorial 28 Nov 2012 18
WebRTC Peer-to-Peer Media
19 WebRTC Tutorial 28 Nov 2012
Media Flows in WebRTC
[Diagram: Browser M and Browser D behind a Coffee Shop WiFi router, Browser T behind a Home WiFi router, and Browser L behind another router all reach a Web Server across the Internet.]
20 WebRTC Tutorial 28 Nov 2012
Media without WebRTC
[Diagram: without WebRTC, media flows from each browser up to the Web Server and back down; nothing flows directly between browsers.]
WebRTC Tutorial 28 Nov 2012 21
Peer-to-Peer Media with WebRTC
[Diagram: with WebRTC, media flows directly between the browsers, peer-to-peer, while only signaling goes through the Web Server.]
WebRTC Tutorial 28 Nov 2012 22
NAT Complicates Peer-to-Peer Media
• Most browsers are behind NATs on the Internet, which complicates the establishment of peer-to-peer media sessions.
[Diagram: the Coffee Shop WiFi, the Home WiFi, and the standalone router now all include NAT between the browsers and the Internet.]
WebRTC Tutorial 28 Nov 2012 23
Peer-to-Peer Media Through NAT
• ICE hole punching can often establish a direct peer-to-peer session between browsers behind different NATs.
[Diagram: media flows directly between Browser M and Browser T, traversing both NATs.]
WebRTC Tutorial 28 Nov 2012 24
P2P Media Can Stay Local to NAT
• If both browsers are behind the same NAT, hole punching can often establish a connection that never leaves the NAT.
[Diagram: Browser M and Browser D, both behind the Coffee Shop WiFi NAT, connect directly inside the NAT.]
WebRTC Tutorial 28 Nov 2012 25
ICE Uses STUN and TURN Servers
• ICE hole punching uses STUN and TURN servers in the public Internet to help with NAT traversal.
[Diagram: Browser M (192.168.0.5) sits behind a NAT whose public address is 203.0.113.4; a STUN Server (198.51.100.9) and a TURN Server (198.51.100.2) sit in the public Internet.]
WebRTC Tutorial 28 Nov 2012 26
Browser Queries STUN Server
• The browser sends a STUN test packet to the STUN server to learn its public IP address (the address of the NAT).
[Diagram: Browser M (192.168.0.5) queries the STUN Server (198.51.100.9) through its NAT (public address 203.0.113.4).]
WebRTC Tutorial 28 Nov 2012 27
TURN Server Can Relay Media
• In some cases, hole punching fails, and a TURN media relay on the public Internet must be used.
[Diagram: media between Browser M and Browser T is relayed through the TURN Server instead of flowing directly.]
WebRTC Tutorial 28 Nov 2012 28
WebRTC Protocols and IETF Standards
29 WebRTC Tutorial 28 Nov 2012
WebRTC Protocols
[Protocol stack diagram:
– Application layer: HTTP, WebSocket, SDP, SRTP, ICE/STUN/TURN
– Transport layer: TCP, TLS, DTLS, UDP, SCTP
– Network layer: IP]
30 WebRTC Tutorial 28 Nov 2012
Data Channel Protocols
• The data channel provides a non-media channel between browsers
• ICE is still used for NAT traversal
• Used in gaming and other non-communication applications
[Stack diagram: Data Channel data runs over SCTP (provides congestion and flow control), over DTLS (provides security/confidentiality), over UDP (provides transport through NATs, after ICE hole punching), across the Internet.]
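The slides mention createDataChannel() but never show it in use. A minimal hedged sketch in the style of the pseudo code later in this tutorial; the channel label "game" and the "reliable" option are illustrative, and the data channel API was still in flux at the time:

  // Open a data channel on an existing RTCPeerConnection (pc).
  var channel = pc.createDataChannel("game", {"reliable": true});

  channel.onopen = function () {
    channel.send("hello");            // safe to send once SCTP/DTLS is up
  };
  channel.onmessage = function (evt) {
    console.log("received: " + evt.data);
  };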
WebRTC Tutorial 28 Nov 2012
A Joint Standards Effort
• World Wide Web Consortium (W3C)
– Standardizing APIs (Application Programming Interfaces)
– Most work in the WEBRTC Working Group
– Used by JavaScript to access the RTC function
• Internet Engineering Task Force (IETF)
– Standardizing protocols (bits on the wire)
– Peer Connection will use RTP, SDP, and extensions
– Some work in the RTCWEB Working Group
– Lots of related work in MMUSIC, AVTCORE, etc.
32 WebRTC Tutorial 28 Nov 2012
IETF RTCWEB Documents
33 WebRTC Tutorial 28 Nov 2012
WebRTC Security
• Media is secured by Secure RTP (SRTP)
• Control is secured by HTTPS (HTTP over TLS over TCP)
• The browser confirms permissions for microphone and camera on each session
[Stack diagram: HTML/CSS/JavaScript carried over HTTP, over TLS (confidentiality and authentication), over TCP (reliability and congestion control), across the Internet.]
34
WebRTC Tutorial 28 Nov 2012
Codecs
35
• Mandatory to Implement (MTI) audio codecs are settled (finally!): Opus (RFC 6716) and G.711
• Video is not!
WebRTC Tutorial 28 Nov 2012
WebRTC W3C API Overview
36 WebRTC Tutorial 28 Nov 2012
Local Media Handling
• Channels
– Encoded together
– Can’t manipulate individually
[Diagram: Browser M’s sources (Microphone Audio, Application Sharing Video, Front Camera Video, Rear Camera Video) feed four local media streams (Audio Stream “2dLe3js”, Video Streams “923fKs”, “js4KMs”, “eR3l0s”), which are combined into three derived streams: Presentation Stream “F8kdls” (tracks “Audio” + “Presentation”), Presenter Stream “8dFlf” (tracks “Audio” + “Presenter”), and Demonstration Stream “3dfdf2” (tracks “Audio” + “Demonstration”).]
WebRTC Tutorial 28 Nov 2012 37
Local Media Handling
• Tracks (MediaStreamTrack)
– Exist only as part of streams
– Ordered and optionally labeled
WebRTC Tutorial 28 Nov 2012 38
Local Media Handling
• Streams (MediaStream)
– All contained tracks are synchronized
– Can be created, transmitted, etc.
WebRTC Tutorial 28 Nov 2012 39
Local Media Handling
• LocalMediaStream
– Returned from getUserMedia()
– Directly connected to source
– Permission check required to obtain
WebRTC Tutorial 28 Nov 2012 40
Local Media Handling
• In this example
– Obtained 4 local media streams
– Created 3 media streams from them
– Sent the streams over the Peer Connection
WebRTC Tutorial 28 Nov 2012 41
Transmitting Media
• Signaling channel
– Non-standard
– Must exist to set up the Peer Connection
• Peer Connection
– Links together two peers
– Add/remove media streams
• addStream(), removeStream()
– Handlers for ICE or media change
– Data channel support
WebRTC Tutorial 28 Nov 2012 42
Peer Connection
• “Links” together two peers
– Via new RTCPeerConnection()
– Generates session description offers/answers
• createOffer(), createAnswer()
– From SDP answers, initiates media
• setLocalDescription(), setRemoteDescription()
– Offers/answers MUST be relayed by application code!
– ICE candidates can also be relayed and added by the app
• addIceCandidate()
– Think of the PC as an application helper
WebRTC Tutorial 28 Nov 2012 43
Peer Connection
• Handlers for ICE or media change
– onicecandidate, onicechange
– onaddstream, onremovestream
– onnegotiationneeded
– A few others (see the sketch below)
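Since addStream() and removeStream() do not by themselves change media state (slide 17), onnegotiationneeded is the hook for re-running the offer/answer exchange. A hedged sketch, reusing the pc, signalingChannel, s, and e names from the pseudo code later in this tutorial:

  // Re-offer whenever the browser signals that (re)negotiation is needed,
  // e.g., after addStream() or removeStream().
  pc.onnegotiationneeded = function () {
    pc.createOffer(function (desc) {
      pc.setLocalDescription(desc, s, e);
      signalingChannel.send(JSON.stringify({"sdp": desc}));
    }, e);
  };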
WebRTC Tutorial 28 Nov 2012 44
Peer Connection
• New identity functions
– setIdentityProvider(), getIdentityAssertion()
– Used to verify identity via a third party, e.g., Facebook Connect
• New statistics API
– getStats()
– Obtain statistics, local and remote, on bytes/packets transmitted, audio volume, etc.
– May be useful for congestion-based adjustments (see the sketch below)
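A hedged sketch of polling getStats(); the callback form and the result() accessor follow early implementations and drafts of the era, so treat the exact shapes as assumptions:

  // Poll statistics once a second and log each reported stats object.
  setInterval(function () {
    pc.getStats(function (report) {
      report.result().forEach(function (stats) {
        console.log(stats);   // e.g., bytes/packets transmitted, audio level
      });
    });
  }, 1000);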
WebRTC Tutorial 28 Nov 2012 45
Pseudo Code Walkthrough
46 WebRTC Tutorial 28 Nov 2012
Pseudo Code
• Looks like real code, but . . .
• API is still in flux, so . . .
• Don’t expect this to work anywhere . . .
• Yet
WebRTC Tutorial 28 Nov 2012 47
Back to first diagram
• Mobile browser "calls" laptop browser
• Each sends media to the other
WebRTC Tutorial 28 Nov 2012 48
Mobile browser code outline
• We will look next at each of these
• . . . except for creating the signaling channel (one possible sketch is given below)
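The slides deliberately skip createSignalingChannel(), since signaling is not standardized. For completeness, one possible minimal sketch over a WebSocket; the relay URL and the queue-until-open behavior are assumptions, not part of the original code:

  function createSignalingChannel() {
    var channel = { onmessage: null };
    var queued = [];   // messages sent before the socket opens
    var ws = new WebSocket("wss://signaling.example.com/room1"); // hypothetical relay
    ws.onopen = function () {
      queued.forEach(function (m) { ws.send(m); });
      queued = [];
    };
    ws.onmessage = function (msg) {        // hand incoming messages to the app
      if (channel.onmessage) { channel.onmessage(msg); }
    };
    channel.send = function (msg) {
      if (ws.readyState === WebSocket.OPEN) { ws.send(msg); }
      else { queued.push(msg); }
    };
    return channel;
  }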
var pc;
var configuration =
  {"iceServers":[{"url":"stun:198.51.100.9"},
                 {"url":"turn:198.51.100.2",
                  "credential":"myPassword"}]};
var microphone, application, front, rear;
var presentation, presenter, demonstration;
var remote_av, stereo, mono;
var display, left, right;
var constraint;

function s(sdp) {}   // stub success callback
function e(error) {} // stub error callback

var signalingChannel = createSignalingChannel();
getMedia();
createPC();
attachMedia();
call();

function getMedia() {
  // get microphone audio
  navigator.getUserMedia({"audio": true}, function (stream) {
    microphone = stream;
  }, e);

  // get local video (application sharing)
  ///// This is outside the scope of this specification.
  ///// Assume that 'application' has been set to this stream.

  // get front-facing camera video
  constraint =
    {"video": {"mandatory": {"enumDirection": "front"}}};
  navigator.getUserMedia(constraint, function (stream) {
    front = stream;
  }, e);

  // get rear-facing camera video
  constraint =
    {"video": {"mandatory": {"enumDirection": "rear"}}};
  navigator.getUserMedia(constraint, function (stream) {
    rear = stream;
  }, e);
}

function createPC() {
  pc = new RTCPeerConnection(configuration);
  pc.onicecandidate = function (evt) {
    signalingChannel.send(
      JSON.stringify({"candidate": evt.candidate}));
  };
  pc.onaddstream =
    function (evt) { handleIncomingStream(evt.stream); };
}

function attachMedia() {
  presentation =
    new MediaStream([microphone.audioTracks.item(0),
                     application.videoTracks.item(0)]);
  presentation.audioTracks.item(0).label = "Audio";
  presentation.videoTracks.item(0).label = "Presentation";
  presenter =
    new MediaStream([microphone.audioTracks.item(0),
                     front.videoTracks.item(0)]);
  presenter.audioTracks.item(0).label = "Audio";
  presenter.videoTracks.item(0).label = "Presenter";
  demonstration =
    new MediaStream([microphone.audioTracks.item(0),
                     rear.videoTracks.item(0)]);
  demonstration.audioTracks.item(0).label = "Audio";
  demonstration.videoTracks.item(0).label = "Demonstration";
  pc.addStream(presentation);
  pc.addStream(presenter);
  pc.addStream(demonstration);
}

function call() {
  pc.createOffer(gotDescription, e);
  function gotDescription(desc) {
    pc.setLocalDescription(desc, s, e);
    signalingChannel.send(JSON.stringify({"sdp": desc}));
  }
}

function handleIncomingStream(stream) {
  if (stream.videoTracks.length == 1) {
    remote_av = stream;
    show_av(remote_av);
  } else if (stream.audioTracks.length == 2) {
    stereo = stream;
  } else {
    mono = stream;
  }
}

function show_av(stream) {
  display.src = URL.createObjectURL(stream.videoTracks.item(0));
  left.src = URL.createObjectURL(stream.audioTracks.item(0));
  right.src = URL.createObjectURL(stream.audioTracks.item(1));
}

signalingChannel.onmessage = function (msg) {
  var signal = JSON.parse(msg.data);
  if (signal.sdp) {
    pc.setRemoteDescription(
      new RTCSessionDescription(signal.sdp), s, e);
  } else {
    pc.addIceCandidate(
      new RTCIceCandidate(signal.candidate));
  }
};
WebRTC Tutorial 28 Nov 2012 49
Mobile browser produces . . .
• Four calls to getUserMedia()
• Three calls to new MediaStream()
• App then labels all tracks and sends them
WebRTC Tutorial 28 Nov 2012 50
function getMedia()
• Get audio
• (Get window video – out of scope)
navigator.getUserMedia({"audio": true }, function (stream) {
microphone = stream;
}, e);
// get local video (application sharing)
///// This is outside the scope of this specification.
///// Assume that 'application' has been set to this stream.
//
. . .
WebRTC Tutorial 28 Nov 2012 51
function getMedia()
• Get front-facing camera
• Get rear-facing camera
. . .
constraint =
  {"video": {"mandatory": {"enumDirection": "front"}}};
navigator.getUserMedia(constraint, function (stream) {
  front = stream;
}, e);
constraint =
  {"video": {"mandatory": {"enumDirection": "rear"}}};
navigator.getUserMedia(constraint, function (stream) {
  rear = stream;
}, e);
WebRTC Tutorial 28 Nov 2012 52
function createPC()
• Create the RTCPeerConnection
• Set handlers
var configuration =
  {"iceServers":[{"url":"stun:198.51.100.9"},
                 {"url":"turn:198.51.100.2",
                  "credential":"myPassword"}]};
pc = new RTCPeerConnection(configuration);
pc.onicecandidate = function (evt) {
  signalingChannel.send(
    JSON.stringify({"candidate": evt.candidate}));
};
pc.onaddstream =
  function (evt) { handleIncomingStream(evt.stream); };
WebRTC Tutorial 28 Nov 2012 53
Mobile browser consumes . . .
• Receives three media streams
• Chooses one
• Sends tracks to output channels
[Diagram: Browser M receives three media streams: Audio & Video Stream “wlQ3kdds” (tracks “Video”, “Left”, “Right”), Stereo Stream “839dg” (tracks “Left”, “Right”), and Mono Stream “dk38djs” (track “Mono”). With the Audio & Video stream selected, its tracks feed the Display, Left Headphone, and Right Headphone sinks.]
WebRTC Tutorial 28 Nov 2012 54
function handleIncomingStream()
• If the incoming stream has a video track, save it as remote_av and display it
• If it has two audio tracks, it must be the stereo stream
• Otherwise, it must be the mono stream
if (stream.videoTracks.length == 1) {
  remote_av = stream;
  show_av(remote_av);
} else if (stream.audioTracks.length == 2) {
  stereo = stream;
} else {
  mono = stream;
}
WebRTC Tutorial 28 Nov 2012 55
function attachMedia()
• Create new presentation & presenter streams
• Label the tracks in the new streams
presentation =
  new MediaStream([microphone.audioTracks.item(0),
                   application.videoTracks.item(0)]);
presentation.audioTracks.item(0).label = "Audio";
presentation.videoTracks.item(0).label = "Presentation";
presenter =
  new MediaStream([microphone.audioTracks.item(0),
                   front.videoTracks.item(0)]);
presenter.audioTracks.item(0).label = "Audio";
presenter.videoTracks.item(0).label = "Presenter";
. . .
WebRTC Tutorial 28 Nov 2012 56
function attachMedia()
• Create the new demonstration stream
• Attach all 3 streams to the Peer Connection
demonstration =
  new MediaStream([microphone.audioTracks.item(0),
                   rear.videoTracks.item(0)]);
demonstration.audioTracks.item(0).label = "Audio";
demonstration.videoTracks.item(0).label = "Demonstration";
pc.addStream(presentation);
pc.addStream(presenter);
pc.addStream(demonstration);
WebRTC Tutorial 28 Nov 2012 57
function call()
• Ask the browser to create an SDP offer
• Set the offer as the local description
• Send the offer to the peer
pc.createOffer(gotDescription, e);
function gotDescription(desc) {
  pc.setLocalDescription(desc, s, e);
  signalingChannel.send(JSON.stringify({"sdp": desc}));
}
WebRTC Tutorial 28 Nov 2012 58
function show_av(stream)
• Turn the tracks into URLs
• Set them as sources for media elements
display.src = URL.createObjectURL(stream.videoTracks.item(0));
left.src = URL.createObjectURL(stream.audioTracks.item(0));
right.src = URL.createObjectURL(stream.audioTracks.item(1));
WebRTC Tutorial 28 Nov 2012 59
How do we get the SDP answer?
• Magic signaling channel provides the message
• If SDP, set as the remote description
• If an ICE candidate, tell the browser
signalingChannel.onmessage = function (msg) {
  var signal = JSON.parse(msg.data);
  if (signal.sdp) {
    pc.setRemoteDescription(
      new RTCSessionDescription(signal.sdp), s, e);
  } else {
    pc.addIceCandidate(
      new RTCIceCandidate(signal.candidate));
  }
};
WebRTC Tutorial 28 Nov 2012 60
And now the laptop browser . . .
• Watch for the following
– We set up media *after* receiving the offer
– but the signaling channel still must exist first!
WebRTC Tutorial 28 Nov 2012 61
Signaling channel message is trigger
• Set up PC and media if not already done
• If SDP, *also* answer
var pc;
var configuration =
  {"iceServers":[{"url":"stun:198.51.100.9"},
                 {"url":"turn:198.51.100.2",
                  "credential":"myPassword"}]};
var webcam, left, right;
var av, stereo, mono;
var presentation, presenter, demonstration;
var speaker, win1, win2, win3;
var constraint;

function s(sdp) {}   // stub success callback
function e(error) {} // stub error callback

var signalingChannel = createSignalingChannel();

function prepareForIncomingCall() {
  createPC();
  getMedia();
  attachMedia();
}

function createPC() {
  pc = new RTCPeerConnection(configuration);
  pc.onicecandidate = function (evt) {
    signalingChannel.send(
      JSON.stringify({"candidate": evt.candidate}));
  };
  pc.onaddstream =
    function (evt) { handleIncomingStream(evt.stream); };
}

function getMedia() {
  // get webcam video
  navigator.getUserMedia({"video": true}, function (stream) {
    webcam = stream;
  }, e);

  // get left and right microphone audio
  constraint =
    {"audio": {"mandatory": {"enumDirection": "left"}}};
  navigator.getUserMedia(constraint, function (stream) {
    left = stream;
  }, e);
  constraint =
    {"audio": {"mandatory": {"enumDirection": "right"}}};
  navigator.getUserMedia(constraint, function (stream) {
    right = stream;
  }, e);
}

function attachMedia() {
  av = new MediaStream([webcam.videoTracks.item(0),
                        left.audioTracks.item(0),
                        right.audioTracks.item(0)]);
  av.videoTracks.item(0).label = "Video";
  av.audioTracks.item(0).label = "Left";
  av.audioTracks.item(1).label = "Right";
  stereo = new MediaStream([left.audioTracks.item(0),
                            right.audioTracks.item(0)]);
  stereo.audioTracks.item(0).label = "Left";
  stereo.audioTracks.item(1).label = "Right";
  mono = left;
  mono.audioTracks.item(0).label = "Left";
  pc.addStream(av);
  pc.addStream(stereo);
  pc.addStream(mono);
}

function answer() {
  pc.createAnswer(gotDescription, e);
  function gotDescription(desc) {
    pc.setLocalDescription(desc, s, e);
    signalingChannel.send(JSON.stringify({"sdp": desc}));
  }
}

function handleIncomingStream(stream) {
  if (stream.videoTracks.item(0).label == "Presentation") {
    speaker.src = URL.createObjectURL(stream.audioTracks.item(0));
    win1.src = URL.createObjectURL(stream.videoTracks.item(0));
  } else if (stream.videoTracks.item(0).label == "Presenter") {
    win2.src = URL.createObjectURL(stream.videoTracks.item(0));
  } else {
    win3.src = URL.createObjectURL(stream.videoTracks.item(0));
  }
}

signalingChannel.onmessage = function (msg) {
  if (!pc) {
    prepareForIncomingCall();
  }
  var signal = JSON.parse(msg.data);
  if (signal.sdp) {
    pc.setRemoteDescription(
      new RTCSessionDescription(signal.sdp), s, e);
    answer();              // also send back an SDP answer
  } else {
    pc.addIceCandidate(new RTCIceCandidate(signal.candidate));
  }
};
WebRTC Tutorial 28 Nov 2012 62
function prepareForIncomingCall()
• No surprises here
• Media obtained is a little different
• But attached the same way
createPC();
getMedia();
attachMedia();
WebRTC Tutorial 28 Nov 2012 63
function answer()
• createAnswer() automatically uses the value of remoteDescription when generating the new SDP
pc.createAnswer(gotDescription, e);
function gotDescription(desc) {
  pc.setLocalDescription(desc, s, e);
  signalingChannel.send(JSON.stringify({"sdp": desc}));
}
WebRTC Tutorial 28 Nov 2012 64
In real code . . .
• The error callbacks must do something useful
• Methods with callbacks are asynchronous!
– May want to wait for callbacks before continuing (see the sketch below)
– Consider using StratifiedJS, or some other JS async toolbox
– createOffer(), createAnswer(), setLocalDescription(), setRemoteDescription() are now queued
• May want to use an identity provider
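As one illustration of waiting for callbacks before continuing, the three getUserMedia() calls in getMedia() could be chained so that attachMedia() and call() run only after all streams exist. A hedged sketch; the getMediaThen() helper is an assumption, not from the slides:

  // Serialize the async getUserMedia() calls, then continue.
  function getMediaThen(done) {
    navigator.getUserMedia({"audio": true}, function (stream) {
      microphone = stream;
      navigator.getUserMedia(
        {"video": {"mandatory": {"enumDirection": "front"}}},
        function (stream) {
          front = stream;
          navigator.getUserMedia(
            {"video": {"mandatory": {"enumDirection": "rear"}}},
            function (stream) {
              rear = stream;
              done();            // all three streams ready
            }, e);
        }, e);
    }, e);
  }

  getMediaThen(function () {
    attachMedia();
    call();
  });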
65 WebRTC Tutorial 28 Nov 2012
What’s Next?
• W3C and IETF standards still need to be finalized, including SDP use/interpretation
• Browsers need to add support
– The Chrome browser has much of this functionality now (M23, without a flag)
– Firefox will shortly (in nightly builds)
• Interworking with SIP and Jingle needs to be finalized
66 WebRTC Tutorial 28 Nov 2012
Questions?
http://webrtcbook.com
67 WebRTC Tutorial 28 Nov 2012
November 27-29, 2012 South San Francisco Conference Center
Thank You
68