WebRTC Development Basics (Getting Started with WebRTC, Part 2: RTCPeerConnection)
Published: 2019-06-19


The job of RTCPeerConnection is to establish peer-to-peer data communication between browsers.

 

WebRTC's codecs and protocols do a huge amount of work behind the scenes, which makes life much easier for developers and makes real-time communication possible even over unreliable networks.

Implementing the following in a traditional VoIP stack would take a great deal of effort; a JavaScript developer using WebRTC does not have to worry about any of it. A few examples:

  • Packet-loss concealment
  • Echo cancellation
  • Bandwidth adaptivity
  • Dynamic jitter buffering
  • Automatic gain control
  • Noise reduction and suppression
  • Image "cleaning"

 

Audio and video flow directly between the clients without going through a server. However, setting up the signaling between the two clients does require a server, which is quite similar to an XMPP Jingle session.

The server mainly relays two kinds of data:

  • Session-control metadata: commands to open/close a session, plus media metadata such as codec, media type, and bandwidth.
  • Network metadata: IP addresses, NAT (network address translation), firewalls, and so on.

WebRTC does not prescribe how the signaling with the server is carried out, so any transport can be used, for example WebSocket. Through the server, the two clients exchange their metadata using the Session Description Protocol (SDP).
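To make that concrete, here is a minimal sketch of a WebSocket-based signaling channel. The server address wss://example.com/signal and the JSON message shape are assumptions made for illustration; any relay mechanism that delivers the SDP and ICE candidates to the other side would work just as well.

// Sketch only: assumes a relay server at wss://example.com/signal that forwards
// JSON messages between the two peers. The message format is invented here.
var signaling = new WebSocket('wss://example.com/signal');
var pc = new RTCPeerConnection({ iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] });

// Send our network metadata (ICE candidates) to the other peer via the server.
pc.onicecandidate = function(event) {
  if (event.candidate) {
    signaling.send(JSON.stringify({ type: 'candidate', candidate: event.candidate }));
  }
};

// Apply SDP and candidates relayed from the other peer.
signaling.onmessage = function(msg) {
  var data = JSON.parse(msg.data);
  if (data.type === 'offer') {
    pc.setRemoteDescription(new RTCSessionDescription(data))
      .then(function() { return pc.createAnswer(); })
      .then(function(answer) {
        return pc.setLocalDescription(answer).then(function() {
          signaling.send(JSON.stringify({ type: 'answer', sdp: answer.sdp }));
        });
      });
  } else if (data.type === 'answer') {
    pc.setRemoteDescription(new RTCSessionDescription(data));
  } else if (data.type === 'candidate') {
    pc.addIceCandidate(new RTCIceCandidate(data.candidate));
  }
};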

 

The negotiation between the local and remote ends works a lot like placing a phone call. Say Zhang San is trying to call Li Si; the detailed mechanics are as follows (a code sketch follows the list):

  1. Zhang San creates an RTCPeerConnection object.
  2. Zhang San creates an offer (an SDP session description) with RTCPeerConnection's createOffer() method.
  3. Zhang San calls setLocalDescription() with that offer, saving it as his local session description.
  4. Zhang San sends the offer to Li Si over the signaling channel.
  5. Li Si "picks up": he calls setRemoteDescription() with Zhang San's offer, so his RTCPeerConnection now knows Zhang San's setup (Zhang San's local description becomes Li Si's remote session description).
  6. Li Si calls createAnswer(); its success callback delivers Li Si's local session description (the answer).
  7. Li Si calls setLocalDescription() to store that answer as his own local session description.
  8. Li Si sends the answer back to Zhang San over the signaling channel.
  9. Zhang San stores Li Si's answer as his remote session description via setRemoteDescription().
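The same nine steps, expressed as a minimal promise-based sketch. sendToPeer() and the onOffer()/onAnswer() entry points are hypothetical stand-ins for whatever signaling channel you use; they are not part of the WebRTC API.

// Caller (Zhang San)
var callerPc = new RTCPeerConnection();                    // step 1
function startCall() {
  callerPc.createOffer()                                   // step 2
    .then(function(offer) {
      return callerPc.setLocalDescription(offer)           // step 3
        .then(function() {
          sendToPeer({ type: 'offer', sdp: offer.sdp });   // step 4
        });
    });
}
function onAnswer(answer) {
  callerPc.setRemoteDescription(answer);                   // step 9
}

// Callee (Li Si)
var calleePc = new RTCPeerConnection();
function onOffer(offer) {
  calleePc.setRemoteDescription(offer)                     // step 5
    .then(function() { return calleePc.createAnswer(); })  // step 6
    .then(function(answer) {
      return calleePc.setLocalDescription(answer)          // step 7
        .then(function() {
          sendToPeer({ type: 'answer', sdp: answer.sdp }); // step 8
        });
    });
}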

SDP in detail:

A step-by-step hands-on codelab on Bitbucket:

An introduction to WebSocket (used for signaling):

Signaling services and open-source WebRTC frameworks:

If you don't want to write a single line of code, there are ready-made platforms worth a look.

 

Here is a single-page application: the local and remote video are on the same page, and the two RTCPeerConnection objects exchange data and messages directly.

HTML code:

......  

WebRTC samples Peer connection

Here is the main view:

<video id="localVideo" autoplay muted></video>

<video id="remoteVideo" autoplay></video>

<div>

<button id="startButton">Start</button>
<button id="callButton">Call</button>
<button id="hangupButton">Hang Up</button>
</div>
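Before any peer-connection work, the Start button captures the camera and microphone and previews them in the localVideo element. A minimal sketch of that step (the full main.js further down does the same thing with extra logging):

// Capture local media and show the preview; mirrors the sample's start() function.
var localStream;
navigator.mediaDevices.getUserMedia({ audio: true, video: true })
  .then(function(stream) {
    document.getElementById('localVideo').srcObject = stream;
    localStream = stream; // kept so pc1.addStream(localStream) can use it later
  })
  .catch(function(e) {
    console.error('getUserMedia() error: ' + e.name);
  });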

 

Adapter JS code (adapter.js):

/* *  Copyright (c) 2015 The WebRTC project authors. All Rights Reserved. * *  Use of this source code is governed by a BSD-style license *  that can be found in the LICENSE file in the root of the source *  tree. *//* More information about these options at jshint.com/docs/options *//* jshint browser: true, camelcase: true, curly: true, devel: true,   eqeqeq: true, forin: false, globalstrict: true, node: true,   quotmark: single, undef: true, unused: strict *//* global mozRTCIceCandidate, mozRTCPeerConnection, Promise,mozRTCSessionDescription, webkitRTCPeerConnection, MediaStreamTrack *//* exported trace,requestUserMedia */'use strict';var getUserMedia = null;var attachMediaStream = null;var reattachMediaStream = null;var webrtcDetectedBrowser = null;var webrtcDetectedVersion = null;var webrtcMinimumVersion = null;var webrtcUtils = {  log: function() {    // suppress console.log output when being included as a module.    if (typeof module !== 'undefined' ||        typeof require === 'function' && typeof define === 'function') {      return;    }    console.log.apply(console, arguments);  },  extractVersion: function(uastring, expr, pos) {    var match = uastring.match(expr);    return match && match.length >= pos && parseInt(match[pos]);  }};function trace(text) {  // This function is used for logging.  if (text[text.length - 1] === '\n') {    text = text.substring(0, text.length - 1);  }  if (window.performance) {    var now = (window.performance.now() / 1000).toFixed(3);    webrtcUtils.log(now + ': ' + text);  } else {    webrtcUtils.log(text);  }}if (typeof window === 'object') {  if (window.HTMLMediaElement &&    !('srcObject' in window.HTMLMediaElement.prototype)) {    // Shim the srcObject property, once, when HTMLMediaElement is found.    Object.defineProperty(window.HTMLMediaElement.prototype, 'srcObject', {      get: function() {        // If prefixed srcObject property exists, return it.        // Otherwise use the shimmed property, _srcObject        return 'mozSrcObject' in this ? this.mozSrcObject : this._srcObject;      },      set: function(stream) {        if ('mozSrcObject' in this) {          this.mozSrcObject = stream;        } else {          // Use _srcObject as a private property for this shim          this._srcObject = stream;          // TODO: revokeObjectUrl(this.src) when !stream to release resources?          this.src = URL.createObjectURL(stream);        }      }    });  }  // Proxy existing globals  getUserMedia = window.navigator && window.navigator.getUserMedia;}// Attach a media stream to an element.attachMediaStream = function(element, stream) {  element.srcObject = stream;};reattachMediaStream = function(to, from) {  to.srcObject = from.srcObject;};if (typeof window === 'undefined' || !window.navigator) {  webrtcUtils.log('This does not appear to be a browser');  webrtcDetectedBrowser = 'not a browser';} else if (navigator.mozGetUserMedia && window.mozRTCPeerConnection) {  webrtcUtils.log('This appears to be Firefox');  webrtcDetectedBrowser = 'firefox';  // the detected firefox version.  webrtcDetectedVersion = webrtcUtils.extractVersion(navigator.userAgent,      /Firefox\/([0-9]+)\./, 1);  // the minimum firefox version still supported by adapter.  webrtcMinimumVersion = 31;  // The RTCPeerConnection object.  window.RTCPeerConnection = function(pcConfig, pcConstraints) {    if (webrtcDetectedVersion < 38) {      // .urls is not supported in FF < 38.      // create RTCIceServers with a single url.      
if (pcConfig && pcConfig.iceServers) {        var newIceServers = [];        for (var i = 0; i < pcConfig.iceServers.length; i++) {          var server = pcConfig.iceServers[i];          if (server.hasOwnProperty('urls')) {            for (var j = 0; j < server.urls.length; j++) {              var newServer = {                url: server.urls[j]              };              if (server.urls[j].indexOf('turn') === 0) {                newServer.username = server.username;                newServer.credential = server.credential;              }              newIceServers.push(newServer);            }          } else {            newIceServers.push(pcConfig.iceServers[i]);          }        }        pcConfig.iceServers = newIceServers;      }    }    return new mozRTCPeerConnection(pcConfig, pcConstraints); // jscs:ignore requireCapitalizedConstructors  };  // The RTCSessionDescription object.  if (!window.RTCSessionDescription) {    window.RTCSessionDescription = mozRTCSessionDescription;  }  // The RTCIceCandidate object.  if (!window.RTCIceCandidate) {    window.RTCIceCandidate = mozRTCIceCandidate;  }  // getUserMedia constraints shim.  getUserMedia = function(constraints, onSuccess, onError) {    var constraintsToFF37 = function(c) {      if (typeof c !== 'object' || c.require) {        return c;      }      var require = [];      Object.keys(c).forEach(function(key) {        if (key === 'require' || key === 'advanced' || key === 'mediaSource') {          return;        }        var r = c[key] = (typeof c[key] === 'object') ?            c[key] : {ideal: c[key]};        if (r.min !== undefined ||            r.max !== undefined || r.exact !== undefined) {          require.push(key);        }        if (r.exact !== undefined) {          if (typeof r.exact === 'number') {            r.min = r.max = r.exact;          } else {            c[key] = r.exact;          }          delete r.exact;        }        if (r.ideal !== undefined) {          c.advanced = c.advanced || [];          var oc = {};          if (typeof r.ideal === 'number') {            oc[key] = {min: r.ideal, max: r.ideal};          } else {            oc[key] = r.ideal;          }          c.advanced.push(oc);          delete r.ideal;          if (!Object.keys(r).length) {            delete c[key];          }        }      });      if (require.length) {        c.require = require;      }      return c;    };    if (webrtcDetectedVersion < 38) {      webrtcUtils.log('spec: ' + JSON.stringify(constraints));      if (constraints.audio) {        constraints.audio = constraintsToFF37(constraints.audio);      }      if (constraints.video) {        constraints.video = constraintsToFF37(constraints.video);      }      webrtcUtils.log('ff37: ' + JSON.stringify(constraints));    }    return navigator.mozGetUserMedia(constraints, onSuccess, onError);  };  navigator.getUserMedia = getUserMedia;  // Shim for mediaDevices on older versions.  
if (!navigator.mediaDevices) {    navigator.mediaDevices = {getUserMedia: requestUserMedia,      addEventListener: function() { },      removeEventListener: function() { }    };  }  navigator.mediaDevices.enumerateDevices =      navigator.mediaDevices.enumerateDevices || function() {    return new Promise(function(resolve) {      var infos = [        {kind: 'audioinput', deviceId: 'default', label: '', groupId: ''},        {kind: 'videoinput', deviceId: 'default', label: '', groupId: ''}      ];      resolve(infos);    });  };  if (webrtcDetectedVersion < 41) {    // Work around http://bugzil.la/1169665    var orgEnumerateDevices =        navigator.mediaDevices.enumerateDevices.bind(navigator.mediaDevices);    navigator.mediaDevices.enumerateDevices = function() {      return orgEnumerateDevices().then(undefined, function(e) {        if (e.name === 'NotFoundError') {          return [];        }        throw e;      });    };  }} else if (navigator.webkitGetUserMedia && window.webkitRTCPeerConnection) {  webrtcUtils.log('This appears to be Chrome');  webrtcDetectedBrowser = 'chrome';  // the detected chrome version.  webrtcDetectedVersion = webrtcUtils.extractVersion(navigator.userAgent,      /Chrom(e|ium)\/([0-9]+)\./, 2);  // the minimum chrome version still supported by adapter.  webrtcMinimumVersion = 38;  // The RTCPeerConnection object.  window.RTCPeerConnection = function(pcConfig, pcConstraints) {    // Translate iceTransportPolicy to iceTransports,    // see https://code.google.com/p/webrtc/issues/detail?id=4869    if (pcConfig && pcConfig.iceTransportPolicy) {      pcConfig.iceTransports = pcConfig.iceTransportPolicy;    }    var pc = new webkitRTCPeerConnection(pcConfig, pcConstraints); // jscs:ignore requireCapitalizedConstructors    var origGetStats = pc.getStats.bind(pc);    pc.getStats = function(selector, successCallback, errorCallback) { // jshint ignore: line      var self = this;      var args = arguments;      // If selector is a function then we are in the old style stats so just      // pass back the original getStats format to avoid breaking old users.      
if (arguments.length > 0 && typeof selector === 'function') {        return origGetStats(selector, successCallback);      }      var fixChromeStats = function(response) {        var standardReport = {};        var reports = response.result();        reports.forEach(function(report) {          var standardStats = {            id: report.id,            timestamp: report.timestamp,            type: report.type          };          report.names().forEach(function(name) {            standardStats[name] = report.stat(name);          });          standardReport[standardStats.id] = standardStats;        });        return standardReport;      };      if (arguments.length >= 2) {        var successCallbackWrapper = function(response) {          args[1](fixChromeStats(response));        };        return origGetStats.apply(this, [successCallbackWrapper, arguments[0]]);      }      // promise-support      return new Promise(function(resolve, reject) {        if (args.length === 1 && selector === null) {          origGetStats.apply(self, [              function(response) {                resolve.apply(null, [fixChromeStats(response)]);              }, reject]);        } else {          origGetStats.apply(self, [resolve, reject]);        }      });    };    return pc;  };  // add promise support  ['createOffer', 'createAnswer'].forEach(function(method) {    var nativeMethod = webkitRTCPeerConnection.prototype[method];    webkitRTCPeerConnection.prototype[method] = function() {      var self = this;      if (arguments.length < 1 || (arguments.length === 1 &&          typeof(arguments[0]) === 'object')) {        var opts = arguments.length === 1 ? arguments[0] : undefined;        return new Promise(function(resolve, reject) {          nativeMethod.apply(self, [resolve, reject, opts]);        });      } else {        return nativeMethod.apply(this, arguments);      }    };  });  ['setLocalDescription', 'setRemoteDescription',      'addIceCandidate'].forEach(function(method) {    var nativeMethod = webkitRTCPeerConnection.prototype[method];    webkitRTCPeerConnection.prototype[method] = function() {      var args = arguments;      var self = this;      return new Promise(function(resolve, reject) {        nativeMethod.apply(self, [args[0],            function() {              resolve();              if (args.length >= 2) {                args[1].apply(null, []);              }            },            function(err) {              reject(err);              if (args.length >= 3) {                args[2].apply(null, [err]);              }            }]          );      });    };  });  // getUserMedia constraints shim.  var constraintsToChrome = function(c) {    if (typeof c !== 'object' || c.mandatory || c.optional) {      return c;    }    var cc = {};    Object.keys(c).forEach(function(key) {      if (key === 'require' || key === 'advanced' || key === 'mediaSource') {        return;      }      var r = (typeof c[key] === 'object') ? c[key] : {ideal: c[key]};      if (r.exact !== undefined && typeof r.exact === 'number') {        r.min = r.max = r.exact;      }      var oldname = function(prefix, name) {        if (prefix) {          return prefix + name.charAt(0).toUpperCase() + name.slice(1);        }        return (name === 'deviceId') ? 
'sourceId' : name;      };      if (r.ideal !== undefined) {        cc.optional = cc.optional || [];        var oc = {};        if (typeof r.ideal === 'number') {          oc[oldname('min', key)] = r.ideal;          cc.optional.push(oc);          oc = {};          oc[oldname('max', key)] = r.ideal;          cc.optional.push(oc);        } else {          oc[oldname('', key)] = r.ideal;          cc.optional.push(oc);        }      }      if (r.exact !== undefined && typeof r.exact !== 'number') {        cc.mandatory = cc.mandatory || {};        cc.mandatory[oldname('', key)] = r.exact;      } else {        ['min', 'max'].forEach(function(mix) {          if (r[mix] !== undefined) {            cc.mandatory = cc.mandatory || {};            cc.mandatory[oldname(mix, key)] = r[mix];          }        });      }    });    if (c.advanced) {      cc.optional = (cc.optional || []).concat(c.advanced);    }    return cc;  };  getUserMedia = function(constraints, onSuccess, onError) {    if (constraints.audio) {      constraints.audio = constraintsToChrome(constraints.audio);    }    if (constraints.video) {      constraints.video = constraintsToChrome(constraints.video);    }    webrtcUtils.log('chrome: ' + JSON.stringify(constraints));    return navigator.webkitGetUserMedia(constraints, onSuccess, onError);  };  navigator.getUserMedia = getUserMedia;  if (!navigator.mediaDevices) {    navigator.mediaDevices = {getUserMedia: requestUserMedia,                              enumerateDevices: function() {      return new Promise(function(resolve) {        var kinds = {audio: 'audioinput', video: 'videoinput'};        return MediaStreamTrack.getSources(function(devices) {          resolve(devices.map(function(device) {            return {label: device.label,                    kind: kinds[device.kind],                    deviceId: device.id,                    groupId: ''};          }));        });      });    }};  }  // A shim for getUserMedia method on the mediaDevices object.  // TODO(KaptenJansson) remove once implemented in Chrome stable.  if (!navigator.mediaDevices.getUserMedia) {    navigator.mediaDevices.getUserMedia = function(constraints) {      return requestUserMedia(constraints);    };  } else {    // Even though Chrome 45 has navigator.mediaDevices and a getUserMedia    // function which returns a Promise, it does not accept spec-style    // constraints.    var origGetUserMedia = navigator.mediaDevices.getUserMedia.        bind(navigator.mediaDevices);    navigator.mediaDevices.getUserMedia = function(c) {      webrtcUtils.log('spec:   ' + JSON.stringify(c)); // whitespace for alignment      c.audio = constraintsToChrome(c.audio);      c.video = constraintsToChrome(c.video);      webrtcUtils.log('chrome: ' + JSON.stringify(c));      return origGetUserMedia(c);    };  }  // Dummy devicechange event methods.  // TODO(KaptenJansson) remove once implemented in Chrome stable.  if (typeof navigator.mediaDevices.addEventListener === 'undefined') {    navigator.mediaDevices.addEventListener = function() {      webrtcUtils.log('Dummy mediaDevices.addEventListener called.');    };  }  if (typeof navigator.mediaDevices.removeEventListener === 'undefined') {    navigator.mediaDevices.removeEventListener = function() {      webrtcUtils.log('Dummy mediaDevices.removeEventListener called.');    };  }  // Attach a media stream to an element.  
attachMediaStream = function(element, stream) {    if (webrtcDetectedVersion >= 43) {      element.srcObject = stream;    } else if (typeof element.src !== 'undefined') {      element.src = URL.createObjectURL(stream);    } else {      webrtcUtils.log('Error attaching stream to element.');    }  };  reattachMediaStream = function(to, from) {    if (webrtcDetectedVersion >= 43) {      to.srcObject = from.srcObject;    } else {      to.src = from.src;    }  };} else if (navigator.mediaDevices && navigator.userAgent.match(    /Edge\/(\d+).(\d+)$/)) {  webrtcUtils.log('This appears to be Edge');  webrtcDetectedBrowser = 'edge';  webrtcDetectedVersion = webrtcUtils.extractVersion(navigator.userAgent,      /Edge\/(\d+).(\d+)$/, 2);  // the minimum version still supported by adapter.  webrtcMinimumVersion = 12;  if (RTCIceGatherer) {    window.RTCIceCandidate = function(args) {      return args;    };    window.RTCSessionDescription = function(args) {      return args;    };    window.RTCPeerConnection = function(config) {      var self = this;      this.onicecandidate = null;      this.onaddstream = null;      this.onremovestream = null;      this.onsignalingstatechange = null;      this.oniceconnectionstatechange = null;      this.onnegotiationneeded = null;      this.ondatachannel = null;      this.localStreams = [];      this.remoteStreams = [];      this.getLocalStreams = function() { return self.localStreams; };      this.getRemoteStreams = function() { return self.remoteStreams; };      this.localDescription = new RTCSessionDescription({        type: '',        sdp: ''      });      this.remoteDescription = new RTCSessionDescription({        type: '',        sdp: ''      });      this.signalingState = 'stable';      this.iceConnectionState = 'new';      this.iceOptions = {        gatherPolicy: 'all',        iceServers: []      };      if (config && config.iceTransportPolicy) {        switch (config.iceTransportPolicy) {        case 'all':        case 'relay':          this.iceOptions.gatherPolicy = config.iceTransportPolicy;          break;        case 'none':          // FIXME: remove once implementation and spec have added this.          throw new TypeError('iceTransportPolicy "none" not supported');        }      }      if (config && config.iceServers) {        this.iceOptions.iceServers = config.iceServers;      }      // per-track iceGathers etc      this.mLines = [];      this._iceCandidates = [];      this._peerConnectionId = 'PC_' + Math.floor(Math.random() * 65536);      // FIXME: Should be generated according to spec (guid?)      // and be the same for all PCs from the same JS      this._cname = Math.random().toString(36).substr(2, 10);    };    window.RTCPeerConnection.prototype.addStream = function(stream) {      // clone just in case we're working in a local demo      // FIXME: seems to be fixed      this.localStreams.push(stream.clone());      // FIXME: maybe trigger negotiationneeded?    };    window.RTCPeerConnection.prototype.removeStream = function(stream) {      var idx = this.localStreams.indexOf(stream);      if (idx > -1) {        this.localStreams.splice(idx, 1);      }      // FIXME: maybe trigger negotiationneeded?    };    // SDP helper from sdp-jingle-json with modifications.    
window.RTCPeerConnection.prototype._toCandidateJSON = function(line) {      var parts;      if (line.indexOf('a=candidate:') === 0) {        parts = line.substring(12).split(' ');      } else { // no a=candidate        parts = line.substring(10).split(' ');      }      var candidate = {        foundation: parts[0],        component: parts[1],        protocol: parts[2].toLowerCase(),        priority: parseInt(parts[3], 10),        ip: parts[4],        port: parseInt(parts[5], 10),        // skip parts[6] == 'typ'        type: parts[7]        //generation: '0'      };      for (var i = 8; i < parts.length; i += 2) {        if (parts[i] === 'raddr') {          candidate.relatedAddress = parts[i + 1]; // was: relAddr        } else if (parts[i] === 'rport') {          candidate.relatedPort = parseInt(parts[i + 1], 10); // was: relPort        } else if (parts[i] === 'generation') {          candidate.generation = parts[i + 1];        } else if (parts[i] === 'tcptype') {          candidate.tcpType = parts[i + 1];        }      }      return candidate;    };    // SDP helper from sdp-jingle-json with modifications.    window.RTCPeerConnection.prototype._toCandidateSDP = function(candidate) {      var sdp = [];      sdp.push(candidate.foundation);      sdp.push(candidate.component);      sdp.push(candidate.protocol.toUpperCase());      sdp.push(candidate.priority);      sdp.push(candidate.ip);      sdp.push(candidate.port);      var type = candidate.type;      sdp.push('typ');      sdp.push(type);      if (type === 'srflx' || type === 'prflx' || type === 'relay') {        if (candidate.relatedAddress && candidate.relatedPort) {          sdp.push('raddr');          sdp.push(candidate.relatedAddress); // was: relAddr          sdp.push('rport');          sdp.push(candidate.relatedPort); // was: relPort        }      }      if (candidate.tcpType && candidate.protocol.toUpperCase() === 'TCP') {        sdp.push('tcptype');        sdp.push(candidate.tcpType);      }      return 'a=candidate:' + sdp.join(' ');    };    // SDP helper from sdp-jingle-json with modifications.    window.RTCPeerConnection.prototype._parseRtpMap = function(line) {      var parts = line.substr(9).split(' ');      var parsed = {        payloadType: parseInt(parts.shift(), 10) // was: id      };      parts = parts[0].split('/');      parsed.name = parts[0];      parsed.clockRate = parseInt(parts[1], 10); // was: clockrate      parsed.numChannels = parts.length === 3 ? parseInt(parts[2], 10) : 1; // was: channels      return parsed;    };    // Parses SDP to determine capabilities.    
window.RTCPeerConnection.prototype._getRemoteCapabilities =        function(section) {      var remoteCapabilities = {        codecs: [],        headerExtensions: [],        fecMechanisms: []      };      var i;      var lines = section.split('\r\n');      var mline = lines[0].substr(2).split(' ');      var rtpmapFilter = function(line) {        return line.indexOf('a=rtpmap:' + mline[i]) === 0;      };      var fmtpFilter = function(line) {        return line.indexOf('a=fmtp:' + mline[i]) === 0;      };      var parseFmtp = function(line) {        var parsed = {};        var kv;        var parts = line.substr(('a=fmtp:' + mline[i]).length + 1).split(';');        for (var j = 0; j < parts.length; j++) {          kv = parts[j].split('=');          parsed[kv[0].trim()] = kv[1];        }        console.log('fmtp', mline[i], parsed);        return parsed;      };      var rtcpFbFilter = function(line) {        return line.indexOf('a=rtcp-fb:' + mline[i]) === 0;      };      var parseRtcpFb = function(line) {        var parts = line.substr(('a=rtcp-fb:' + mline[i]).length + 1)            .split(' ');        return {          type: parts.shift(),          parameter: parts.join(' ')        };      };      for (i = 3; i < mline.length; i++) { // find all codecs from mline[3..]        var line = lines.filter(rtpmapFilter)[0];        if (line) {          var codec = this._parseRtpMap(line);          var fmtp = lines.filter(fmtpFilter);          codec.parameters = fmtp.length ? parseFmtp(fmtp[0]) : {};          codec.rtcpFeedback = lines.filter(rtcpFbFilter).map(parseRtcpFb);          remoteCapabilities.codecs.push(codec);        }      }      return remoteCapabilities;    };    // Serializes capabilities to SDP.    window.RTCPeerConnection.prototype._capabilitiesToSDP = function(caps) {      var sdp = '';      caps.codecs.forEach(function(codec) {        var pt = codec.payloadType;        if (codec.preferredPayloadType !== undefined) {          pt = codec.preferredPayloadType;        }        sdp += 'a=rtpmap:' + pt +            ' ' + codec.name +            '/' + codec.clockRate +            (codec.numChannels !== 1 ? '/' + codec.numChannels : '') +            '\r\n';        if (codec.parameters && codec.parameters.length) {          sdp += 'a=ftmp:' + pt + ' ';          Object.keys(codec.parameters).forEach(function(param) {            sdp += param + '=' + codec.parameters[param];          });          sdp += '\r\n';        }        if (codec.rtcpFeedback) {          // FIXME: special handling for trr-int?          codec.rtcpFeedback.forEach(function(fb) {            sdp += 'a=rtcp-fb:' + pt + ' ' + fb.type + ' ' +                fb.parameter + '\r\n';          });        }      });      return sdp;    };    // Calculates the intersection of local and remote capabilities.    
window.RTCPeerConnection.prototype._getCommonCapabilities =        function(localCapabilities, remoteCapabilities) {      var commonCapabilities = {        codecs: [],        headerExtensions: [],        fecMechanisms: []      };      localCapabilities.codecs.forEach(function(lCodec) {        for (var i = 0; i < remoteCapabilities.codecs.length; i++) {          var rCodec = remoteCapabilities.codecs[i];          if (lCodec.name === rCodec.name &&              lCodec.clockRate === rCodec.clockRate &&              lCodec.numChannels === rCodec.numChannels) {            // push rCodec so we reply with offerer payload type            commonCapabilities.codecs.push(rCodec);            // FIXME: also need to calculate intersection between            // .rtcpFeedback and .parameters            break;          }        }      });      localCapabilities.headerExtensions.forEach(function(lHeaderExtension) {        for (var i = 0; i < remoteCapabilities.headerExtensions.length; i++) {          var rHeaderExtension = remoteCapabilities.headerExtensions[i];          if (lHeaderExtension.uri === rHeaderExtension.uri) {            commonCapabilities.headerExtensions.push(rHeaderExtension);            break;          }        }      });      // FIXME: fecMechanisms      return commonCapabilities;    };    // Parses DTLS parameters from SDP section or sessionpart.    window.RTCPeerConnection.prototype._getDtlsParameters =        function(section, session) {      var lines = section.split('\r\n');      lines = lines.concat(session.split('\r\n')); // Search in session part, too.      var fpLine = lines.filter(function(line) {        return line.indexOf('a=fingerprint:') === 0;      });      fpLine = fpLine[0].substr(14);      var dtlsParameters = {        role: 'auto',        fingerprints: [{          algorithm: fpLine.split(' ')[0],          value: fpLine.split(' ')[1]        }]      };      return dtlsParameters;    };    // Serializes DTLS parameters to SDP.    window.RTCPeerConnection.prototype._dtlsParametersToSDP =        function(params, setupType) {      var sdp = 'a=setup:' + setupType + '\r\n';      params.fingerprints.forEach(function(fp) {        sdp += 'a=fingerprint:' + fp.algorithm + ' ' + fp.value + '\r\n';      });      return sdp;    };    // Parses ICE information from SDP section or sessionpart.    window.RTCPeerConnection.prototype._getIceParameters =        function(section, session) {      var lines = section.split('\r\n');      lines = lines.concat(session.split('\r\n')); // Search in session part, too.      var iceParameters = {        usernameFragment: lines.filter(function(line) {          return line.indexOf('a=ice-ufrag:') === 0;        })[0].substr(12),        password: lines.filter(function(line) {          return line.indexOf('a=ice-pwd:') === 0;        })[0].substr(10),      };      return iceParameters;    };    // Serializes ICE parameters to SDP.    
window.RTCPeerConnection.prototype._iceParametersToSDP = function(params) {      return 'a=ice-ufrag:' + params.usernameFragment + '\r\n' +          'a=ice-pwd:' + params.password + '\r\n';    };    window.RTCPeerConnection.prototype._getEncodingParameters = function(ssrc) {      return {        ssrc: ssrc,        codecPayloadType: 0,        fec: 0,        rtx: 0,        priority: 1.0,        maxBitrate: 2000000.0,        minQuality: 0,        framerateBias: 0.5,        resolutionScale: 1.0,        framerateScale: 1.0,        active: true,        dependencyEncodingId: undefined,        encodingId: undefined      };    };    // Create ICE gatherer, ICE transport and DTLS transport.    window.RTCPeerConnection.prototype._createIceAndDtlsTransports =        function(mid, sdpMLineIndex) {      var self = this;      var iceGatherer = new RTCIceGatherer(self.iceOptions);      var iceTransport = new RTCIceTransport(iceGatherer);      iceGatherer.onlocalcandidate = function(evt) {        var event = {};        event.candidate = {sdpMid: mid, sdpMLineIndex: sdpMLineIndex};        var cand = evt.candidate;        var isEndOfCandidates = !(cand && Object.keys(cand).length > 0);        if (isEndOfCandidates) {          event.candidate.candidate =              'candidate:1 1 udp 1 0.0.0.0 9 typ endOfCandidates';        } else {          // RTCIceCandidate doesn't have a component, needs to be added          cand.component = iceTransport.component === 'RTCP' ? 2 : 1;          event.candidate.candidate = self._toCandidateSDP(cand);        }        if (self.onicecandidate !== null) {          if (self.localDescription && self.localDescription.type === '') {            self._iceCandidates.push(event);          } else {            self.onicecandidate(event);          }        }      };      iceTransport.onicestatechange = function() {        /*        console.log(self._peerConnectionId,            'ICE state change', iceTransport.state);        */        self._updateIceConnectionState(iceTransport.state);      };      var dtlsTransport = new RTCDtlsTransport(iceTransport);      dtlsTransport.ondtlsstatechange = function() {        /*        console.log(self._peerConnectionId, sdpMLineIndex,            'dtls state change', dtlsTransport.state);        */      };      dtlsTransport.onerror = function(error) {        console.error('dtls error', error);      };      return {        iceGatherer: iceGatherer,        iceTransport: iceTransport,        dtlsTransport: dtlsTransport      };    };    window.RTCPeerConnection.prototype.setLocalDescription =        function(description) {      var self = this;      if (description.type === 'offer') {        if (!description.ortc) {          // FIXME: throw?        
} else {          this.mLines = description.ortc;        }      } else if (description.type === 'answer') {        var sections = self.remoteDescription.sdp.split('\r\nm=');        var sessionpart = sections.shift();        sections.forEach(function(section, sdpMLineIndex) {          section = 'm=' + section;          var iceGatherer = self.mLines[sdpMLineIndex].iceGatherer;          var iceTransport = self.mLines[sdpMLineIndex].iceTransport;          var dtlsTransport = self.mLines[sdpMLineIndex].dtlsTransport;          var rtpSender = self.mLines[sdpMLineIndex].rtpSender;          var localCapabilities =              self.mLines[sdpMLineIndex].localCapabilities;          var remoteCapabilities =              self.mLines[sdpMLineIndex].remoteCapabilities;          var sendSSRC = self.mLines[sdpMLineIndex].sendSSRC;          var recvSSRC = self.mLines[sdpMLineIndex].recvSSRC;          var remoteIceParameters = self._getIceParameters(section,              sessionpart);          iceTransport.start(iceGatherer, remoteIceParameters, 'controlled');          var remoteDtlsParameters = self._getDtlsParameters(section,              sessionpart);          dtlsTransport.start(remoteDtlsParameters);          if (rtpSender) {            // calculate intersection of capabilities            var params = self._getCommonCapabilities(localCapabilities,                remoteCapabilities);            params.muxId = sendSSRC;            params.encodings = [self._getEncodingParameters(sendSSRC)];            params.rtcp = {              cname: self._cname,              reducedSize: false,              ssrc: recvSSRC,              mux: true            };            rtpSender.send(params);          }        });      }      this.localDescription = description;      switch (description.type) {      case 'offer':        this._updateSignalingState('have-local-offer');        break;      case 'answer':        this._updateSignalingState('stable');        break;      }      // FIXME: need to _reliably_ execute after args[1] or promise      window.setTimeout(function() {        // FIXME: need to apply ice candidates in a way which is async but in-order        self._iceCandidates.forEach(function(event) {          if (self.onicecandidate !== null) {            self.onicecandidate(event);          }        });        self._iceCandidates = [];      }, 50);      if (arguments.length > 1 && typeof arguments[1] === 'function') {        window.setTimeout(arguments[1], 0);      }      return new Promise(function(resolve) {        resolve();      });    };    window.RTCPeerConnection.prototype.setRemoteDescription =        function(description) {      // FIXME: for type=offer this creates state. which should not      //  happen before SLD with type=answer but... we need the stream      //  here for onaddstream.      
var self = this;      var sections = description.sdp.split('\r\nm=');      var sessionpart = sections.shift();      var stream = new MediaStream();      sections.forEach(function(section, sdpMLineIndex) {        section = 'm=' + section;        var lines = section.split('\r\n');        var mline = lines[0].substr(2).split(' ');        var kind = mline[0];        var line;        var iceGatherer;        var iceTransport;        var dtlsTransport;        var rtpSender;        var rtpReceiver;        var sendSSRC;        var recvSSRC;        var mid = lines.filter(function(line) {          return line.indexOf('a=mid:') === 0;        })[0].substr(6);        var cname;        var remoteCapabilities;        var params;        if (description.type === 'offer') {          var transports = self._createIceAndDtlsTransports(mid, sdpMLineIndex);          var localCapabilities = RTCRtpReceiver.getCapabilities(kind);          // determine remote caps from SDP          remoteCapabilities = self._getRemoteCapabilities(section);          line = lines.filter(function(line) {            return line.indexOf('a=ssrc:') === 0 &&                line.split(' ')[1].indexOf('cname:') === 0;          });          sendSSRC = (2 * sdpMLineIndex + 2) * 1001;          if (line) { // FIXME: alot of assumptions here            recvSSRC = line[0].split(' ')[0].split(':')[1];            cname = line[0].split(' ')[1].split(':')[1];          }          rtpReceiver = new RTCRtpReceiver(transports.dtlsTransport, kind);          // calculate intersection so no unknown caps get passed into the RTPReciver          params = self._getCommonCapabilities(localCapabilities,              remoteCapabilities);          params.muxId = recvSSRC;          params.encodings = [self._getEncodingParameters(recvSSRC)];          params.rtcp = {            cname: cname,            reducedSize: false,            ssrc: sendSSRC,            mux: true          };          rtpReceiver.receive(params);          // FIXME: not correct when there are multiple streams but that is          // not currently supported.          
stream.addTrack(rtpReceiver.track);          // FIXME: honor a=sendrecv          if (self.localStreams.length > 0 &&              self.localStreams[0].getTracks().length >= sdpMLineIndex) {            // FIXME: actually more complicated, needs to match types etc            var localtrack = self.localStreams[0].getTracks()[sdpMLineIndex];            rtpSender = new RTCRtpSender(localtrack, transports.dtlsTransport);          }          self.mLines[sdpMLineIndex] = {            iceGatherer: transports.iceGatherer,            iceTransport: transports.iceTransport,            dtlsTransport: transports.dtlsTransport,            localCapabilities: localCapabilities,            remoteCapabilities: remoteCapabilities,            rtpSender: rtpSender,            rtpReceiver: rtpReceiver,            kind: kind,            mid: mid,            sendSSRC: sendSSRC,            recvSSRC: recvSSRC          };        } else {          iceGatherer = self.mLines[sdpMLineIndex].iceGatherer;          iceTransport = self.mLines[sdpMLineIndex].iceTransport;          dtlsTransport = self.mLines[sdpMLineIndex].dtlsTransport;          rtpSender = self.mLines[sdpMLineIndex].rtpSender;          rtpReceiver = self.mLines[sdpMLineIndex].rtpReceiver;          sendSSRC = self.mLines[sdpMLineIndex].sendSSRC;          recvSSRC = self.mLines[sdpMLineIndex].recvSSRC;        }        var remoteIceParameters = self._getIceParameters(section, sessionpart);        var remoteDtlsParameters = self._getDtlsParameters(section,            sessionpart);        // for answers we start ice and dtls here, otherwise this is done in SLD        if (description.type === 'answer') {          iceTransport.start(iceGatherer, remoteIceParameters, 'controlling');          dtlsTransport.start(remoteDtlsParameters);          // determine remote caps from SDP          remoteCapabilities = self._getRemoteCapabilities(section);          // FIXME: store remote caps?          
if (rtpSender) {            params = remoteCapabilities;            params.muxId = sendSSRC;            params.encodings = [self._getEncodingParameters(sendSSRC)];            params.rtcp = {              cname: self._cname,              reducedSize: false,              ssrc: recvSSRC,              mux: true            };            rtpSender.send(params);          }          // FIXME: only if a=sendrecv          var bidi = lines.filter(function(line) {            return line.indexOf('a=ssrc:') === 0;          }).length > 0;          if (rtpReceiver && bidi) {            line = lines.filter(function(line) {              return line.indexOf('a=ssrc:') === 0 &&                  line.split(' ')[1].indexOf('cname:') === 0;            });            if (line) { // FIXME: alot of assumptions here              recvSSRC = line[0].split(' ')[0].split(':')[1];              cname = line[0].split(' ')[1].split(':')[1];            }            params = remoteCapabilities;            params.muxId = recvSSRC;            params.encodings = [self._getEncodingParameters(recvSSRC)];            params.rtcp = {              cname: cname,              reducedSize: false,              ssrc: sendSSRC,              mux: true            };            rtpReceiver.receive(params, kind);            stream.addTrack(rtpReceiver.track);            self.mLines[sdpMLineIndex].recvSSRC = recvSSRC;          }        }      });      this.remoteDescription = description;      switch (description.type) {      case 'offer':        this._updateSignalingState('have-remote-offer');        break;      case 'answer':        this._updateSignalingState('stable');        break;      }      window.setTimeout(function() {        if (self.onaddstream !== null && stream.getTracks().length) {          self.remoteStreams.push(stream);          window.setTimeout(function() {            self.onaddstream({stream: stream});          }, 0);        }      }, 0);      if (arguments.length > 1 && typeof arguments[1] === 'function') {        window.setTimeout(arguments[1], 0);      }      return new Promise(function(resolve) {        resolve();      });    };    window.RTCPeerConnection.prototype.close = function() {      this.mLines.forEach(function(mLine) {        /* not yet        if (mLine.iceGatherer) {          mLine.iceGatherer.close();        }        */        if (mLine.iceTransport) {          mLine.iceTransport.stop();        }        if (mLine.dtlsTransport) {          mLine.dtlsTransport.stop();        }        if (mLine.rtpSender) {          mLine.rtpSender.stop();        }        if (mLine.rtpReceiver) {          mLine.rtpReceiver.stop();        }      });      // FIXME: clean up tracks, local streams, remote streams, etc      this._updateSignalingState('closed');      this._updateIceConnectionState('closed');    };    // Update the signaling state.    window.RTCPeerConnection.prototype._updateSignalingState =        function(newState) {      this.signalingState = newState;      if (this.onsignalingstatechange !== null) {        this.onsignalingstatechange();      }    };    // Update the ICE connection state.    
// FIXME: should be called 'updateConnectionState', also be called for    //  DTLS changes and implement    //  https://lists.w3.org/Archives/Public/public-webrtc/2015Sep/0033.html    window.RTCPeerConnection.prototype._updateIceConnectionState =        function(newState) {      var self = this;      if (this.iceConnectionState !== newState) {        var agreement = self.mLines.every(function(mLine) {          return mLine.iceTransport.state === newState;        });        if (agreement) {          self.iceConnectionState = newState;          if (this.oniceconnectionstatechange !== null) {            this.oniceconnectionstatechange();          }        }      }    };    window.RTCPeerConnection.prototype.createOffer = function() {      var self = this;      var offerOptions;      if (arguments.length === 1 && typeof arguments[0] !== 'function') {        offerOptions = arguments[0];      } else if (arguments.length === 3) {        offerOptions = arguments[2];      }      var tracks = [];      var numAudioTracks = 0;      var numVideoTracks = 0;      // Default to sendrecv.      if (this.localStreams.length) {        numAudioTracks = this.localStreams[0].getAudioTracks().length;        numVideoTracks = this.localStreams[0].getAudioTracks().length;      }      // Determine number of audio and video tracks we need to send/recv.      if (offerOptions) {        // Deal with Chrome legacy constraints...        if (offerOptions.mandatory) {          if (offerOptions.mandatory.OfferToReceiveAudio) {            numAudioTracks = 1;          } else if (offerOptions.mandatory.OfferToReceiveAudio === false) {            numAudioTracks = 0;          }          if (offerOptions.mandatory.OfferToReceiveVideo) {            numVideoTracks = 1;          } else if (offerOptions.mandatory.OfferToReceiveVideo === false) {            numVideoTracks = 0;          }        } else {          if (offerOptions.offerToReceiveAudio !== undefined) {            numAudioTracks = offerOptions.offerToReceiveAudio;          }          if (offerOptions.offerToReceiveVideo !== undefined) {            numVideoTracks = offerOptions.offerToReceiveVideo;          }        }      }      if (this.localStreams.length) {        // Push local streams.        this.localStreams[0].getTracks().forEach(function(track) {          tracks.push({            kind: track.kind,            track: track,            wantReceive: track.kind === 'audio' ?                numAudioTracks > 0 : numVideoTracks > 0          });          if (track.kind === 'audio') {            numAudioTracks--;          } else if (track.kind === 'video') {            numVideoTracks--;          }        });      }      // Create M-lines for recvonly streams.      while (numAudioTracks > 0 || numVideoTracks > 0) {        if (numAudioTracks > 0) {          tracks.push({            kind: 'audio',            wantReceive: true          });          numAudioTracks--;        }        if (numVideoTracks > 0) {          tracks.push({            kind: 'video',            wantReceive: true          });          numVideoTracks--;        }      }      var sdp = 'v=0\r\n' +          'o=thisisadapterortc 8169639915646943137 2 IN IP4 127.0.0.1\r\n' +          's=-\r\n' +          't=0 0\r\n';      var mLines = [];      tracks.forEach(function(mline, sdpMLineIndex) {        // For each track, create an ice gatherer, ice transport, dtls transport,        // potentially rtpsender and rtpreceiver.        
var track = mline.track;        var kind = mline.kind;        var mid = Math.random().toString(36).substr(2, 10);        var transports = self._createIceAndDtlsTransports(mid, sdpMLineIndex);        var localCapabilities = RTCRtpSender.getCapabilities(kind);        var rtpSender;        // generate an ssrc now, to be used later in rtpSender.send        var sendSSRC = (2 * sdpMLineIndex + 1) * 1001; //Math.floor(Math.random()*4294967295);        var recvSSRC; // don't know yet        if (track) {          rtpSender = new RTCRtpSender(track, transports.dtlsTransport);        }        var rtpReceiver;        if (mline.wantReceive) {          rtpReceiver = new RTCRtpReceiver(transports.dtlsTransport, kind);        }        mLines[sdpMLineIndex] = {          iceGatherer: transports.iceGatherer,          iceTransport: transports.iceTransport,          dtlsTransport: transports.dtlsTransport,          localCapabilities: localCapabilities,          remoteCapabilities: null,          rtpSender: rtpSender,          rtpReceiver: rtpReceiver,          kind: kind,          mid: mid,          sendSSRC: sendSSRC,          recvSSRC: recvSSRC        };        // Map things to SDP.        // Build the mline.        sdp += 'm=' + kind + ' 9 UDP/TLS/RTP/SAVPF ';        sdp += localCapabilities.codecs.map(function(codec) {          return codec.preferredPayloadType;        }).join(' ') + '\r\n';        sdp += 'c=IN IP4 0.0.0.0\r\n';        sdp += 'a=rtcp:9 IN IP4 0.0.0.0\r\n';        // Map ICE parameters (ufrag, pwd) to SDP.        sdp += self._iceParametersToSDP(            transports.iceGatherer.getLocalParameters());        // Map DTLS parameters to SDP.        sdp += self._dtlsParametersToSDP(            transports.dtlsTransport.getLocalParameters(), 'actpass');        sdp += 'a=mid:' + mid + '\r\n';        if (rtpSender && rtpReceiver) {          sdp += 'a=sendrecv\r\n';        } else if (rtpSender) {          sdp += 'a=sendonly\r\n';        } else if (rtpReceiver) {          sdp += 'a=recvonly\r\n';        } else {          sdp += 'a=inactive\r\n';        }        sdp += 'a=rtcp-mux\r\n';        // Add a=rtpmap lines for each codec. Also fmtp and rtcp-fb.        
sdp += self._capabilitiesToSDP(localCapabilities);        if (track) {          sdp += 'a=msid:' + self.localStreams[0].id + ' ' + track.id + '\r\n';          sdp += 'a=ssrc:' + sendSSRC + ' ' + 'msid:' +              self.localStreams[0].id + ' ' + track.id + '\r\n';        }        sdp += 'a=ssrc:' + sendSSRC + ' cname:' + self._cname + '\r\n';      });      var desc = new RTCSessionDescription({        type: 'offer',        sdp: sdp,        ortc: mLines      });      if (arguments.length && typeof arguments[0] === 'function') {        window.setTimeout(arguments[0], 0, desc);      }      return new Promise(function(resolve) {        resolve(desc);      });    };    window.RTCPeerConnection.prototype.createAnswer = function() {      var self = this;      var answerOptions;      if (arguments.length === 1 && typeof arguments[0] !== 'function') {        answerOptions = arguments[0];      } else if (arguments.length === 3) {        answerOptions = arguments[2];      }      var sdp = 'v=0\r\n' +          'o=thisisadapterortc 8169639915646943137 2 IN IP4 127.0.0.1\r\n' +          's=-\r\n' +          't=0 0\r\n';      this.mLines.forEach(function(mLine/*, sdpMLineIndex*/) {        var iceGatherer = mLine.iceGatherer;        //var iceTransport = mLine.iceTransport;        var dtlsTransport = mLine.dtlsTransport;        var localCapabilities = mLine.localCapabilities;        var remoteCapabilities = mLine.remoteCapabilities;        var rtpSender = mLine.rtpSender;        var rtpReceiver = mLine.rtpReceiver;        var kind = mLine.kind;        var sendSSRC = mLine.sendSSRC;        //var recvSSRC = mLine.recvSSRC;        // Calculate intersection of capabilities.        var commonCapabilities = self._getCommonCapabilities(localCapabilities,            remoteCapabilities);        // Map things to SDP.        // Build the mline.        sdp += 'm=' + kind + ' 9 UDP/TLS/RTP/SAVPF ';        sdp += commonCapabilities.codecs.map(function(codec) {          return codec.payloadType;        }).join(' ') + '\r\n';        sdp += 'c=IN IP4 0.0.0.0\r\n';        sdp += 'a=rtcp:9 IN IP4 0.0.0.0\r\n';        // Map ICE parameters (ufrag, pwd) to SDP.        sdp += self._iceParametersToSDP(iceGatherer.getLocalParameters());        // Map DTLS parameters to SDP.        sdp += self._dtlsParametersToSDP(dtlsTransport.getLocalParameters(),            'active');        sdp += 'a=mid:' + mLine.mid + '\r\n';        if (rtpSender && rtpReceiver) {          sdp += 'a=sendrecv\r\n';        } else if (rtpReceiver) {          sdp += 'a=sendonly\r\n';        } else if (rtpSender) {          sdp += 'a=sendonly\r\n';        } else {          sdp += 'a=inactive\r\n';        }        sdp += 'a=rtcp-mux\r\n';        // Add a=rtpmap lines for each codec. Also fmtp and rtcp-fb.        
sdp += self._capabilitiesToSDP(commonCapabilities);        if (rtpSender) {          // add a=ssrc lines from RTPSender          sdp += 'a=msid:' + self.localStreams[0].id + ' ' +              rtpSender.track.id + '\r\n';          sdp += 'a=ssrc:' + sendSSRC + ' ' + 'msid:' +              self.localStreams[0].id + ' ' + rtpSender.track.id + '\r\n';        }        sdp += 'a=ssrc:' + sendSSRC + ' cname:' + self._cname + '\r\n';      });      var desc = new RTCSessionDescription({        type: 'answer',        sdp: sdp        // ortc: tracks -- state is created in SRD already      });      if (arguments.length && typeof arguments[0] === 'function') {        window.setTimeout(arguments[0], 0, desc);      }      return new Promise(function(resolve) {        resolve(desc);      });    };    window.RTCPeerConnection.prototype.addIceCandidate = function(candidate) {      // TODO: lookup by mid      var mLine = this.mLines[candidate.sdpMLineIndex];      if (mLine) {        var cand = Object.keys(candidate.candidate).length > 0 ?            this._toCandidateJSON(candidate.candidate) : {};        // dirty hack to make simplewebrtc work.        // FIXME: need another dirty hack to avoid adding candidates after this        if (cand.type === 'endOfCandidates') {          cand = {};        }        // dirty hack to make chrome work.        if (cand.protocol === 'tcp' && cand.port === 0) {          cand = {};        }        mLine.iceTransport.addRemoteCandidate(cand);      }      if (arguments.length > 1 && typeof arguments[1] === 'function') {        window.setTimeout(arguments[1], 0);      }      return new Promise(function(resolve) {        resolve();      });    };    window.RTCPeerConnection.prototype.getStats = function() {      var promises = [];      this.mLines.forEach(function(mLine) {        ['rtpSender', 'rtpReceiver', 'iceGatherer', 'iceTransport',            'dtlsTransport'].forEach(function(thing) {          if (mLine[thing]) {            promises.push(mLine[thing].getStats());          }        });      });      var cb = arguments.length > 1 && typeof arguments[1] === 'function' &&          arguments[1];      return new Promise(function(resolve) {        var results = {};        Promise.all(promises).then(function(res) {          res.forEach(function(result) {            Object.keys(result).forEach(function(id) {              results[id] = result[id];            });          });          if (cb) {            window.setTimeout(cb, 0, results);          }          resolve(results);        });      });    };  }} else {  webrtcUtils.log('Browser does not appear to be WebRTC-capable');}// Returns the result of getUserMedia as a Promise.function requestUserMedia(constraints) {  return new Promise(function(resolve, reject) {    getUserMedia(constraints, resolve, reject);  });}var webrtcTesting = {};try {  Object.defineProperty(webrtcTesting, 'version', {    set: function(version) {      webrtcDetectedVersion = version;    }  });} catch (e) {}if (typeof module !== 'undefined') {  var RTCPeerConnection;  if (typeof window !== 'undefined') {    RTCPeerConnection = window.RTCPeerConnection;  }  module.exports = {    RTCPeerConnection: RTCPeerConnection,    getUserMedia: getUserMedia,    attachMediaStream: attachMediaStream,    reattachMediaStream: reattachMediaStream,    webrtcDetectedBrowser: webrtcDetectedBrowser,    webrtcDetectedVersion: webrtcDetectedVersion,    webrtcMinimumVersion: webrtcMinimumVersion,    webrtcTesting: webrtcTesting,    webrtcUtils: webrtcUtils    //requestUserMedia: not 
exposed on purpose.    //trace: not exposed on purpose.  };} else if ((typeof require === 'function') && (typeof define === 'function')) {  // Expose objects and functions when RequireJS is doing the loading.  define([], function() {    return {      RTCPeerConnection: window.RTCPeerConnection,      getUserMedia: getUserMedia,      attachMediaStream: attachMediaStream,      reattachMediaStream: reattachMediaStream,      webrtcDetectedBrowser: webrtcDetectedBrowser,      webrtcDetectedVersion: webrtcDetectedVersion,      webrtcMinimumVersion: webrtcMinimumVersion,      webrtcTesting: webrtcTesting,      webrtcUtils: webrtcUtils      //requestUserMedia: not exposed on purpose.      //trace: not exposed on purpose.    };  });}

Interaction JS code (main.js):

/*
 *  Copyright (c) 2015 The WebRTC project authors. All Rights Reserved.
 *
 *  Use of this source code is governed by a BSD-style license
 *  that can be found in the LICENSE file in the root of the source
 *  tree.
 */

'use strict';

var startButton = document.getElementById('startButton');
var callButton = document.getElementById('callButton');
var hangupButton = document.getElementById('hangupButton');
callButton.disabled = true;
hangupButton.disabled = true;
startButton.onclick = start;
callButton.onclick = call;
hangupButton.onclick = hangup;

var startTime;
var localVideo = document.getElementById('localVideo');
var remoteVideo = document.getElementById('remoteVideo');

localVideo.addEventListener('loadedmetadata', function() {
  trace('Local video videoWidth: ' + this.videoWidth +
    'px,  videoHeight: ' + this.videoHeight + 'px');
});

remoteVideo.addEventListener('loadedmetadata', function() {
  trace('Remote video videoWidth: ' + this.videoWidth +
    'px,  videoHeight: ' + this.videoHeight + 'px');
});

remoteVideo.onresize = function() {
  trace('Remote video size changed to ' +
    remoteVideo.videoWidth + 'x' + remoteVideo.videoHeight);
  // We'll use the first onsize callback as an indication that video has started
  // playing out.
  if (startTime) {
    var elapsedTime = window.performance.now() - startTime;
    trace('Setup time: ' + elapsedTime.toFixed(3) + 'ms');
    startTime = null;
  }
};

var localStream;
var pc1;
var pc2;
var offerOptions = {
  offerToReceiveAudio: 1,
  offerToReceiveVideo: 1
};

function getName(pc) {
  return (pc === pc1) ? 'pc1' : 'pc2';
}

function getOtherPc(pc) {
  return (pc === pc1) ? pc2 : pc1;
}

function gotStream(stream) {
  trace('Received local stream');
  localVideo.srcObject = stream;
  localStream = stream;
  callButton.disabled = false;
}

function start() {
  trace('Requesting local stream');
  startButton.disabled = true;
  navigator.mediaDevices.getUserMedia({
    audio: true,
    video: true
  })
  .then(gotStream)
  .catch(function(e) {
    alert('getUserMedia() error: ' + e.name);
  });
}

function call() {
  callButton.disabled = true;
  hangupButton.disabled = false;
  trace('Starting call');
  startTime = window.performance.now();
  var videoTracks = localStream.getVideoTracks();
  var audioTracks = localStream.getAudioTracks();
  if (videoTracks.length > 0) {
    trace('Using video device: ' + videoTracks[0].label);
  }
  if (audioTracks.length > 0) {
    trace('Using audio device: ' + audioTracks[0].label);
  }
  var servers = null;
  pc1 = new RTCPeerConnection(servers);
  trace('Created local peer connection object pc1');
  pc1.onicecandidate = function(e) {
    onIceCandidate(pc1, e);
  };
  pc2 = new RTCPeerConnection(servers);
  trace('Created remote peer connection object pc2');
  pc2.onicecandidate = function(e) {
    onIceCandidate(pc2, e);
  };
  pc1.oniceconnectionstatechange = function(e) {
    onIceStateChange(pc1, e);
  };
  pc2.oniceconnectionstatechange = function(e) {
    onIceStateChange(pc2, e);
  };
  pc2.onaddstream = gotRemoteStream;
  pc1.addStream(localStream);
  trace('Added local stream to pc1');
  trace('pc1 createOffer start');
  pc1.createOffer(onCreateOfferSuccess, onCreateSessionDescriptionError,
      offerOptions);
}

function onCreateSessionDescriptionError(error) {
  trace('Failed to create session description: ' + error.toString());
}

function onCreateOfferSuccess(desc) {
  trace('Offer from pc1\n' + desc.sdp);
  trace('pc1 setLocalDescription start');
  pc1.setLocalDescription(desc, function() {
    onSetLocalSuccess(pc1);
  }, onSetSessionDescriptionError);
  trace('pc2 setRemoteDescription start');
  pc2.setRemoteDescription(desc, function() {
    onSetRemoteSuccess(pc2);
  }, onSetSessionDescriptionError);
  trace('pc2 createAnswer start');
  // Since the 'remote' side has no media stream we need
  // to pass in the right constraints in order for it to
  // accept the incoming offer of audio and video.
  pc2.createAnswer(onCreateAnswerSuccess, onCreateSessionDescriptionError);
}

function onSetLocalSuccess(pc) {
  trace(getName(pc) + ' setLocalDescription complete');
}

function onSetRemoteSuccess(pc) {
  trace(getName(pc) + ' setRemoteDescription complete');
}

function onSetSessionDescriptionError(error) {
  trace('Failed to set session description: ' + error.toString());
}

function gotRemoteStream(e) {
  remoteVideo.srcObject = e.stream;
  trace('pc2 received remote stream');
}

function onCreateAnswerSuccess(desc) {
  trace('Answer from pc2:\n' + desc.sdp);
  trace('pc2 setLocalDescription start');
  pc2.setLocalDescription(desc, function() {
    onSetLocalSuccess(pc2);
  }, onSetSessionDescriptionError);
  trace('pc1 setRemoteDescription start');
  pc1.setRemoteDescription(desc, function() {
    onSetRemoteSuccess(pc1);
  }, onSetSessionDescriptionError);
}

function onIceCandidate(pc, event) {
  if (event.candidate) {
    getOtherPc(pc).addIceCandidate(new RTCIceCandidate(event.candidate),
        function() {
          onAddIceCandidateSuccess(pc);
        },
        function(err) {
          onAddIceCandidateError(pc, err);
        }
    );
    trace(getName(pc) + ' ICE candidate: \n' + event.candidate.candidate);
  }
}

function onAddIceCandidateSuccess(pc) {
  trace(getName(pc) + ' addIceCandidate success');
}

function onAddIceCandidateError(pc, error) {
  trace(getName(pc) + ' failed to add ICE Candidate: ' + error.toString());
}

function onIceStateChange(pc, event) {
  if (pc) {
    trace(getName(pc) + ' ICE state: ' + pc.iceConnectionState);
    console.log('ICE state change event: ', event);
  }
}

function hangup() {
  trace('Ending call');
  pc1.close();
  pc2.close();
  pc1 = null;
  pc2 = null;
  hangupButton.disabled = true;
  callButton.disabled = false;
}

 

On the local side (the caller):

// servers is the ICE configuration (STUN and TURN settings)
pc1 = new webkitRTCPeerConnection(servers);
// ...
pc1.addStream(localStream);

 

On the remote side (the callee):

Create an offer and set it as pc1's local description and as pc2's remote description.

This can be done without any signaling, because both the caller and the callee live on the same page.

pc1.createOffer(gotDescription1);
// ...
function gotDescription1(desc) {
  pc1.setLocalDescription(desc);
  trace("Offer from pc1 \n" + desc.sdp);
  pc2.setRemoteDescription(desc);
  pc2.createAnswer(gotDescription2);
}

Create pc2 and, when the stream from pc1 arrives, display it in the remoteVideo element:

pc2 = new webkitRTCPeerConnection(servers);
pc2.onaddstream = gotRemoteStream;
// ...
function gotRemoteStream(e) {
  // Legacy pattern; the main.js above assigns e.stream to remoteVideo.srcObject instead.
  remoteVideo.src = URL.createObjectURL(e.stream);
  trace('pc2 received remote stream');
}

 

Console output from a sample run:

Navigated to https://webrtc.github.io/samples/src/content/peerconnection/pc1/adapter.js:32 This appears to be Chromecommon.js:8 12.639: Requesting local streamadapter.js:32 chrome: {"audio":true,"video":true}common.js:8 12.653: Received local streamcommon.js:8 14.038: Local video videoWidth: 640px,  videoHeight: 480pxcommon.js:8 15.183: Starting callcommon.js:8 15.183: Using video device: Integrated Camera (04f2:b39a)common.js:8 15.183: Using audio device: 默认common.js:8 15.185: Created local peer connection object pc1common.js:8 15.186: Created remote peer connection object pc2common.js:8 15.186: Added local stream to pc1common.js:8 15.187: pc1 createOffer startcommon.js:8 15.190: Offer from pc1v=0o=- 5740173043645401541 2 IN IP4 127.0.0.1s=-t=0 0a=group:BUNDLE audio videoa=msid-semantic: WMS ZWvBmXl2Dax58ugXR3BYDITTKIIV1TYPqViTm=audio 9 UDP/TLS/RTP/SAVPF 111 103 104 9 0 8 106 105 13 126c=IN IP4 0.0.0.0a=rtcp:9 IN IP4 0.0.0.0a=ice-ufrag:MOApAbo/PL8Jl3m9a=ice-pwd:dxcuXVAFcyVqbgwi0QdQNh0Sa=fingerprint:sha-256 5F:CB:FF:EF:73:09:BC:0A:6F:18:0C:DB:11:A5:AE:AF:37:49:37:71:D0:FE:BA:39:EC:53:6B:10:8C:8A:95:9Ea=setup:actpassa=mid:audioa=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-levela=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-timea=sendrecva=rtcp-muxa=rtpmap:111 opus/48000/2a=fmtp:111 minptime=10; useinbandfec=1a=rtpmap:103 ISAC/16000a=rtpmap:104 ISAC/32000a=rtpmap:9 G722/8000a=rtpmap:0 PCMU/8000a=rtpmap:8 PCMA/8000a=rtpmap:106 CN/32000a=rtpmap:105 CN/16000a=rtpmap:13 CN/8000a=rtpmap:126 telephone-event/8000a=maxptime:60a=ssrc:32244674 cname:anS0gTF+aWAKlwYja=ssrc:32244674 msid:ZWvBmXl2Dax58ugXR3BYDITTKIIV1TYPqViT 8ae8dd85-bd5c-49ff-a9bd-f4b88f2663c7a=ssrc:32244674 mslabel:ZWvBmXl2Dax58ugXR3BYDITTKIIV1TYPqViTa=ssrc:32244674 label:8ae8dd85-bd5c-49ff-a9bd-f4b88f2663c7m=video 9 UDP/TLS/RTP/SAVPF 100 116 117 96c=IN IP4 0.0.0.0a=rtcp:9 IN IP4 0.0.0.0a=ice-ufrag:MOApAbo/PL8Jl3m9a=ice-pwd:dxcuXVAFcyVqbgwi0QdQNh0Sa=fingerprint:sha-256 5F:CB:FF:EF:73:09:BC:0A:6F:18:0C:DB:11:A5:AE:AF:37:49:37:71:D0:FE:BA:39:EC:53:6B:10:8C:8A:95:9Ea=setup:actpassa=mid:videoa=extmap:2 urn:ietf:params:rtp-hdrext:toffseta=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-timea=extmap:4 urn:3gpp:video-orientationa=sendrecva=rtcp-muxa=rtpmap:100 VP8/90000a=rtcp-fb:100 ccm fira=rtcp-fb:100 nacka=rtcp-fb:100 nack plia=rtcp-fb:100 goog-remba=rtpmap:116 red/90000a=rtpmap:117 ulpfec/90000a=rtpmap:96 rtx/90000a=fmtp:96 apt=100a=ssrc-group:FID 1099776253 671187929a=ssrc:1099776253 cname:anS0gTF+aWAKlwYja=ssrc:1099776253 msid:ZWvBmXl2Dax58ugXR3BYDITTKIIV1TYPqViT eda73070-3562-4daf-ae0d-143694f294d5a=ssrc:1099776253 mslabel:ZWvBmXl2Dax58ugXR3BYDITTKIIV1TYPqViTa=ssrc:1099776253 label:eda73070-3562-4daf-ae0d-143694f294d5a=ssrc:671187929 cname:anS0gTF+aWAKlwYja=ssrc:671187929 msid:ZWvBmXl2Dax58ugXR3BYDITTKIIV1TYPqViT eda73070-3562-4daf-ae0d-143694f294d5a=ssrc:671187929 mslabel:ZWvBmXl2Dax58ugXR3BYDITTKIIV1TYPqViTa=ssrc:671187929 label:eda73070-3562-4daf-ae0d-143694f294d5common.js:8 15.190: pc1 setLocalDescription startcommon.js:8 15.191: pc2 setRemoteDescription startcommon.js:8 15.192: pc2 createAnswer startcommon.js:8 15.202: pc1 setLocalDescription completecommon.js:8 15.203: pc1 ICE candidate: candidate:2999745851 1 udp 2122260223 192.168.56.1 64106 typ host generation 0common.js:8 15.204: pc1 ICE candidate: candidate:1425577752 1 udp 2122194687 172.17.22.106 64107 typ host generation 0common.js:8 15.204: pc1 ICE candidate: candidate:2733511545 1 udp 2122129151 192.168.127.1 64108 typ host generation 
0common.js:8 15.204: pc1 ICE candidate: candidate:1030387485 1 udp 2122063615 192.168.204.1 64109 typ host generation 0common.js:8 15.205: pc1 ICE candidate: candidate:3003979406 1 udp 2121998079 172.17.26.47 64110 typ host generation 0common.js:8 15.206: pc2 setRemoteDescription completecommon.js:8 15.206: Answer from pc2:v=0o=- 3554329696104028001 2 IN IP4 127.0.0.1s=-t=0 0a=group:BUNDLE audio videoa=msid-semantic: WMSm=audio 9 UDP/TLS/RTP/SAVPF 111 103 104 9 0 8 106 105 13 126c=IN IP4 0.0.0.0a=rtcp:9 IN IP4 0.0.0.0a=ice-ufrag:raDEDz+tkFTWXMn8a=ice-pwd:RQ8bf7mtOXHIDQ0/vF25IRMfa=fingerprint:sha-256 5F:CB:FF:EF:73:09:BC:0A:6F:18:0C:DB:11:A5:AE:AF:37:49:37:71:D0:FE:BA:39:EC:53:6B:10:8C:8A:95:9Ea=setup:activea=mid:audioa=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-levela=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-timea=recvonlya=rtcp-muxa=rtpmap:111 opus/48000/2a=fmtp:111 minptime=10; useinbandfec=1a=rtpmap:103 ISAC/16000a=rtpmap:104 ISAC/32000a=rtpmap:9 G722/8000a=rtpmap:0 PCMU/8000a=rtpmap:8 PCMA/8000a=rtpmap:106 CN/32000a=rtpmap:105 CN/16000a=rtpmap:13 CN/8000a=rtpmap:126 telephone-event/8000a=maxptime:60m=video 9 UDP/TLS/RTP/SAVPF 100 116 117 96c=IN IP4 0.0.0.0a=rtcp:9 IN IP4 0.0.0.0a=ice-ufrag:raDEDz+tkFTWXMn8a=ice-pwd:RQ8bf7mtOXHIDQ0/vF25IRMfa=fingerprint:sha-256 5F:CB:FF:EF:73:09:BC:0A:6F:18:0C:DB:11:A5:AE:AF:37:49:37:71:D0:FE:BA:39:EC:53:6B:10:8C:8A:95:9Ea=setup:activea=mid:videoa=extmap:2 urn:ietf:params:rtp-hdrext:toffseta=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-timea=extmap:4 urn:3gpp:video-orientationa=recvonlya=rtcp-muxa=rtpmap:100 VP8/90000a=rtcp-fb:100 ccm fira=rtcp-fb:100 nacka=rtcp-fb:100 nack plia=rtcp-fb:100 goog-remba=rtpmap:116 red/90000a=rtpmap:117 ulpfec/90000a=rtpmap:96 rtx/90000a=fmtp:96 apt=100common.js:8 15.207: pc2 setLocalDescription startcommon.js:8 15.207: pc1 setRemoteDescription startcommon.js:8 15.208: pc2 received remote stream2common.js:8 15.209: pc1 addIceCandidate success3common.js:8 15.210: pc1 addIceCandidate successcommon.js:8 15.224: pc1 ICE candidate: candidate:2999745851 2 udp 2122260222 192.168.56.1 64111 typ host generation 0common.js:8 15.224: pc1 ICE candidate: candidate:1425577752 2 udp 2122194686 172.17.22.106 64112 typ host generation 0common.js:8 15.225: pc1 ICE candidate: candidate:2733511545 2 udp 2122129150 192.168.127.1 64113 typ host generation 0common.js:8 15.225: pc1 ICE candidate: candidate:1030387485 2 udp 2122063614 192.168.204.1 64114 typ host generation 0common.js:8 15.226: pc1 ICE candidate: candidate:3003979406 2 udp 2121998078 172.17.26.47 64115 typ host generation 0common.js:8 15.226: pc1 ICE candidate: candidate:2999745851 1 udp 2122260223 192.168.56.1 64116 typ host generation 0common.js:8 15.227: pc1 ICE candidate: candidate:1425577752 1 udp 2122194687 172.17.22.106 64117 typ host generation 0common.js:8 15.227: pc1 ICE candidate: candidate:2733511545 1 udp 2122129151 192.168.127.1 64118 typ host generation 0common.js:8 15.227: pc1 ICE candidate: candidate:1030387485 1 udp 2122063615 192.168.204.1 64119 typ host generation 0common.js:8 15.228: pc1 ICE candidate: candidate:3003979406 1 udp 2121998079 172.17.26.47 64120 typ host generation 0common.js:8 15.228: pc1 ICE candidate: candidate:2999745851 2 udp 2122260222 192.168.56.1 64121 typ host generation 0common.js:8 15.228: pc1 ICE candidate: candidate:1425577752 2 udp 2122194686 172.17.22.106 64122 typ host generation 0common.js:8 15.229: pc1 ICE candidate: candidate:2733511545 2 udp 2122129150 192.168.127.1 64123 typ host 
generation 0common.js:8 15.229: pc1 ICE candidate: candidate:1030387485 2 udp 2122063614 192.168.204.1 64124 typ host generation 0common.js:8 15.230: pc1 ICE candidate: candidate:3003979406 2 udp 2121998078 172.17.26.47 64125 typ host generation 0common.js:8 15.231: pc1 addIceCandidate successcommon.js:8 15.231: pc2 setLocalDescription completecommon.js:8 15.231: pc1 setRemoteDescription completecommon.js:8 15.231: pc1 addIceCandidate success8common.js:8 15.232: pc1 addIceCandidate success5common.js:8 15.233: pc1 addIceCandidate successcommon.js:8 15.233: pc2 ICE state: checkingmain.js:197 ICE state change event:  Event {isTrusted: true}common.js:8 15.243: pc2 ICE candidate: candidate:2999745851 1 udp 2122260223 192.168.56.1 64126 typ host generation 0common.js:8 15.246: pc2 ICE candidate: candidate:1425577752 1 udp 2122194687 172.17.22.106 64127 typ host generation 0common.js:8 15.247: pc2 ICE candidate: candidate:2733511545 1 udp 2122129151 192.168.127.1 64128 typ host generation 0common.js:8 15.248: pc2 ICE candidate: candidate:1030387485 1 udp 2122063615 192.168.204.1 64129 typ host generation 0common.js:8 15.249: pc2 ICE candidate: candidate:3003979406 1 udp 2121998079 172.17.26.47 64130 typ host generation 0common.js:8 15.250: pc1 ICE state: checkingmain.js:197 ICE state change event:  Event {isTrusted: true}5common.js:8 15.251: pc2 addIceCandidate successcommon.js:8 16.271: pc1 ICE state: connectedmain.js:197 ICE state change event:  Event {isTrusted: true}common.js:8 16.272: pc2 ICE state: connectedmain.js:197 ICE state change event:  Event {isTrusted: true}common.js:8 16.326: Remote video size changed to 640x480common.js:8 16.326: Setup time: 1142.795mscommon.js:8 16.326: Remote video videoWidth: 640px,  videoHeight: 480pxcommon.js:8 16.326: Remote video size changed to 640x480common.js:8 18.447: Ending call

 

 

In the real world, however, signaling cannot be skipped: the two WebRTC endpoints must exchange signaling messages through a server (a minimal browser-side sketch follows the list below), so that:

  • users can discover each other and exchange "real-world" information such as names;
  • the WebRTC client applications can exchange network information;
  • the peers can exchange media information such as video format and resolution;
  • the client applications can traverse NAT gateways and firewalls.
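To make that exchange concrete, here is a minimal browser-side sketch of sending an offer, an answer and ICE candidates over a WebSocket signaling channel. It is a sketch under assumptions, not the official sample: the server URL wss://example.com/signal, the JSON message shapes and the startCall() helper are all illustrative, and it uses the promise-based RTCPeerConnection API.

// Hedged sketch: exchanging SDP and ICE candidates over WebSocket.
// The signaling URL and message format are assumptions for illustration.
var signaling = new WebSocket('wss://example.com/signal');
var pc = new RTCPeerConnection({
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }]
});

// Send our ICE candidates to the other peer as they are gathered.
pc.onicecandidate = function(e) {
  if (e.candidate) {
    signaling.send(JSON.stringify({ type: 'candidate', candidate: e.candidate }));
  }
};

// Caller: create an offer, store it locally, then ship it to the other peer.
// (Add your local tracks/streams to pc before calling this, as in the sample above.)
function startCall() {
  pc.createOffer().then(function(offer) {
    return pc.setLocalDescription(offer);
  }).then(function() {
    signaling.send(JSON.stringify(pc.localDescription));
  });
}

// Both sides: react to whatever the signaling channel delivers.
signaling.onmessage = function(msg) {
  var data = JSON.parse(msg.data);
  if (data.type === 'offer') {
    pc.setRemoteDescription(new RTCSessionDescription(data))
      .then(function() { return pc.createAnswer(); })
      .then(function(answer) { return pc.setLocalDescription(answer); })
      .then(function() { signaling.send(JSON.stringify(pc.localDescription)); });
  } else if (data.type === 'answer') {
    pc.setRemoteDescription(new RTCSessionDescription(data));
  } else if (data.type === 'candidate') {
    pc.addIceCandidate(new RTCIceCandidate(data.candidate));
  }
};

Note that only the SDP blobs and ICE candidates travel over the signaling channel; once ICE succeeds, the media itself flows peer to peer.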

So, the server side needs to provide the following (a toy relay sketch follows the list):

  • user discovery and communication;
  • signaling;
  • NAT and firewall traversal;
  • relay servers as a fallback when peer-to-peer communication fails.
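Of these, only the first two belong to the signaling server itself, and a minimal relay can be very small. The following Node.js sketch is an assumption-laden illustration, not a production design: it uses the third-party "ws" package and simply forwards every message to every other connected client, with no rooms, authentication or user discovery.

// Toy signaling relay in Node.js (npm install ws).
var WebSocket = require('ws');
var wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', function(ws) {
  ws.on('message', function(message) {
    // Forward the SDP offer/answer or ICE candidate to the other connected peer(s).
    wss.clients.forEach(function(client) {
      if (client !== ws && client.readyState === WebSocket.OPEN) {
        client.send(message.toString());
      }
    });
  });
});

NAT traversal and the relay fallback, by contrast, are normally handled by separate STUN/TURN servers (for example the open-source coturn project), not by the signaling server.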

The STUN protocol and its extension TURN are used by the ICE framework.

ICE first tries to connect the peers directly, for the lowest possible latency, over UDP. In this process, a STUN server has one job: to let a client behind a NAT discover its public IP address and port (a configuration sketch follows).
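In code, the STUN/TURN servers are passed to RTCPeerConnection through its configuration object, in place of the servers = null used in the same-page demo above. This is only a sketch: the STUN URL below is Google's public test server, and the TURN entry is a placeholder you would replace with your own server and credentials.

// Sketch of an ICE server configuration (TURN entry is a placeholder).
var configuration = {
  iceServers: [
    { urls: 'stun:stun.l.google.com:19302' },
    {
      urls: 'turn:turn.example.com:3478',  // hypothetical TURN server
      username: 'user',                    // placeholder credential
      credential: 'pass'                   // placeholder credential
    }
  ]
};
var pc = new RTCPeerConnection(configuration);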

 

(Figure: Finding connection candidates)

 

ICE tries candidates in the following order:

  1. UDP first;
  2. if UDP fails, TCP;
  3. if TCP fails, HTTP;
  4. and finally HTTPS (HTTP/HTTPS here meaning TCP on ports 80 and 443).

 

Concretely:

  1. ICE first uses STUN to attempt a direct connection over UDP;
  2. if the direct paths (UDP, TCP, HTTP, etc.) all fail, it falls back to a TURN relay server (see the candidate-inspection sketch below).
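A quick way to see which path ICE actually found is to look at the type of each gathered candidate: "host" means a direct candidate, "srflx" one learned via STUN, and "relay" one relayed through TURN. This is a hedged debugging sketch against any peer connection (pc here stands for e.g. pc1 from the sample above); it uses addEventListener so it does not replace the handler the sample already installs, and it reuses the sample's trace() helper.

// Log the type of every ICE candidate a peer connection gathers.
pc.addEventListener('icecandidate', function(e) {
  if (!e.candidate) {
    return; // a null candidate means gathering has finished
  }
  // The candidate string contains "typ host|srflx|relay".
  var match = /typ (\w+)/.exec(e.candidate.candidate);
  trace('candidate type: ' + (match ? match[1] : 'unknown') +
        ' -> ' + e.candidate.candidate);
});

To force the relay path for testing, the RTCPeerConnection configuration can also set iceTransportPolicy: 'relay', which discards every candidate that is not relayed through TURN.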

(Figure: WebRTC data pathways)

 

An introduction to STUN, TURN and signaling:

 

Many existing WebRTC apps only demonstrate communication between web browsers, but gateway servers can enable a WebRTC app running in a browser to interact with devices such as telephones, i.e. with PSTN and VoIP systems.

In May 2012, Doubango open-sourced the sipml5 SIP client; built on WebRTC and WebSocket, sipml5 enables video and voice calls between browsers and apps (iOS or Android).

 

Links:

Reprinted from: http://pmvax.baihongyu.com/
