WebRTC java server trouble


Question


I think I am very close to getting my Java server app to talk to a browser page via WebRTC, but I can't quite get it to work. I feel like I am missing something small, so I hope that someone here might have a suggestion.

I took a close look at the WebRTC examples: the Java unit test (org.webrtc.PeerConnectionTest) and the example Android app (trunk/talk/examples/android). Based on what I learned, I put together a Java app that uses WebSockets for signalling and attempts to send a video stream to Chrome.

The problem is that there is no video in the browser, even though all of my code (both JavaScript and Java) executes in the order I expect, hitting all the right logging statements. There is some suspicious output in the console log from the native libjingle code, but I am not sure what to make of it; I have marked the suspicious lines with '>>' below. For example, the video port allocators appear to be destroyed shortly after being created, so something is obviously wrong. Also, "Changing video state, recv=1 send=0" seems incorrect, since the Java side should be sending video rather than receiving it. Perhaps I am misusing the OfferToReceiveVideo option?

If you look at the logs below, you'll see that the WebSocket communication with the browser is working perfectly: I can send the SDP offer to the browser and receive an SDP answer back. Setting the local and remote descriptions on the PeerConnections also seems to work, and the HTML5 video element gets its source set to a blob URL, as it should. So what could I be missing? Do I need to do anything with ICE candidates, even though my client and server are on the same machine right now?

Any advice would be greatly appreciated!

SDP Messages (from Chrome's JavaScript Console)

1.134: Java Offer: 
v=0
o=- 5893945934600346864 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE video
a=msid-semantic: WMS JavaMediaStream
m=video 1 RTP/SAVPF 100 116 117
c=IN IP4 0.0.0.0
a=rtcp:1 IN IP4 0.0.0.0
a=ice-ufrag:dJxTlMlXy7uASrDU
a=ice-pwd:r8BRkXVnc4dqCABUDhuRjpp7
a=ice-options:google-ice
a=mid:video
a=extmap:2 urn:ietf:params:rtp-hdrext:toffset
a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=sendrecv
a=rtcp-mux
a=crypto:0 AES_CM_128_HMAC_SHA1_80 inline:yq6wOHhk/QfsWuh+1oOEqfB4GjKZzz8XfQnGCDP3
a=rtpmap:100 VP8/90000
a=rtcp-fb:100 ccm fir
a=rtcp-fb:100 nack
a=rtcp-fb:100 goog-remb
a=rtpmap:116 red/90000
a=rtpmap:117 ulpfec/90000
a=ssrc:3720473526 cname:nul6R21KmwAms3Ge
a=ssrc:3720473526 msid:JavaMediaStream JavaMediaStream_v0
a=ssrc:3720473526 mslabel:JavaMediaStream
a=ssrc:3720473526 label:JavaMediaStream_v0


1.149: Received remote stream


1.150: Browsers Answer: 
v=0
o=- 4261396844048664099 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE video
a=msid-semantic: WMS
m=video 1 RTP/SAVPF 100 116 117
c=IN IP4 0.0.0.0
a=rtcp:1 IN IP4 0.0.0.0
a=ice-ufrag:quzQNsX+ZlUWUQqV
a=ice-pwd:y5A0+7sM8P88AatBLd1fdd5G
a=mid:video
a=extmap:2 urn:ietf:params:rtp-hdrext:toffset
a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=recvonly
a=rtcp-mux
a=crypto:0 AES_CM_128_HMAC_SHA1_80 inline:WClNA69OfpjdJy3Bv4ujejk/IYnn4DW8kjrB18xP
a=rtpmap:100 VP8/90000
a=rtcp-fb:100 ccm fir
a=rtcp-fb:100 nack
a=rtcp-fb:100 goog-remb
a=rtpmap:116 red/90000
a=rtpmap:117 ulpfec/90000

This seems ok to me. Java's offer includes my video stream.

Native code logging (libjingle)

(Suspicious lines marked with '>>')

Camera '/dev/video0' started with format YUY2 640x480x30, elapsed time 59 ms
Ignored line: c=IN IP4 0.0.0.0
NACK enabled for channel 0
NACK enabled for channel 0
Created channel for video
Jingle:Channel[video|1|__]: NULL DTLS identity supplied. Not doing DTLS
Jingle:Channel[video|2|__]: NULL DTLS identity supplied. Not doing DTLS
Session:5893945934600346864 Old state:STATE_INIT New state:STATE_SENTINITIATE Type:urn:xmpp:jingle:apps:rtp:1 Transport:http://www.google.com/transport/p2p
Setting local video description
AddSendStream {id:JavaMediaStream_v0;ssrcs:[3720473526];ssrc_groups:;cname:nul6R21KmwAms3Ge;sync_label:JavaMediaStream}
Add send ssrc: 3720473526
>> Warning(webrtcvideoengine.cc:2704): SetReceiverBufferingMode(0, 0) failed, err=12606
Changing video state, recv=0 send=0
Transport: video, allocating candidates
Transport: video, allocating candidates
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=Udp
Jingle:Port[:1:0::Net[eth0:192.168.0.0/24]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0::Net[eth0:192.168.0.0/24]]: Added port to allocator
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=Udp
Jingle:Port[:1:0::Net[tun0:192.168.128.6/32]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0::Net[tun0:192.168.128.6/32]]: Added port to allocator
Ignored line: c=IN IP4 0.0.0.0
Warning(webrtcvideoengine.cc:2309): GetStats: sender information not ready.
Jingle:Channel[video|1|__]: Other side didn't support DTLS.
Jingle:Channel[video|2|__]: Other side didn't support DTLS.
Enabling BUNDLE, bundling onto transport: video
Channel enabled
>> Changing video state, recv=1 send=0
Session:5893945934600346864 Old state:STATE_SENTINITIATE New state:STATE_RECEIVEDACCEPT Type:urn:xmpp:jingle:apps:rtp:1 Transport:http://www.google.com/transport/p2p
Setting remote video description
Hybrid NACK/FEC enabled for channel 0
Hybrid NACK/FEC enabled for channel 0
SetSendCodecs() : selected video codec VP8/1280x720x30fps@2000kbps (min=50kbps, start=300kbps)
Video max quantization: 56
VP8 number of temporal layers: 1
VP8 options : picture loss indication = 0, feedback mode = 0, complexity = normal, resilience = off, denoising = 0, error concealment = 0, automatic resize = 0, frame dropping = 1, key frame interval = 3000
WARNING: no real random source present!
SRTP activated with negotiated parameters: send cipher_suite AES_CM_128_HMAC_SHA1_80 recv cipher_suite AES_CM_128_HMAC_SHA1_80
Changing video state, recv=1 send=0
Session:5893945934600346864 Old state:STATE_RECEIVEDACCEPT New state:STATE_INPROGRESS Type:urn:xmpp:jingle:apps:rtp:1 Transport:http://www.google.com/transport/p2p
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=Relay
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=Relay
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=Tcp
Jingle:Port[:1:0:local:Net[eth0:192.168.0.0/24]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0:local:Net[eth0:192.168.0.0/24]]: Added port to allocator
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=Tcp
Jingle:Port[:1:0:local:Net[tun0:192.168.128.6/32]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0:local:Net[tun0:192.168.128.6/32]]: Added port to allocator
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=SslTcp
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=SslTcp
All candidates gathered for video:1:0
Transport: video, component 1 allocation complete
Transport: video allocation complete
Candidate gathering is complete.
Capture delay changed to 120 ms
Captured frame size 640x480. Expected format YUY2 640x480x30
Capture size changed : selected video codec VP8/640x480x30fps@2000kbps (min=50kbps, start=300kbps)
Video max quantization: 56
VP8 number of temporal layers: 1
VP8 options : picture loss indication = 0, feedback mode = 0, complexity = normal, resilience = off, denoising = 0, error concealment = 0, automatic resize = 1, frame dropping = 1, key frame interval = 3000
VAdapt Frame: 0 / 300 Changes: 0 Input: 640x480 Scale: 1 Output: 640x480 Changed: false
>> Jingle:Port[video:1:0::Net[eth0:192.168.0.0/24]]: Port deleted
>> Jingle:Port[video:1:0::Net[eth0:192.168.0.0/24]]: Removed port from allocator (3 remaining)
Removed port from p2p socket: 3 remaining
Jingle:Port[video:1:0::Net[tun0:192.168.128.6/32]]: Port deleted
Jingle:Port[video:1:0::Net[tun0:192.168.128.6/32]]: Removed port from allocator (2 remaining)
Removed port from p2p socket: 2 remaining
>> Jingle:Port[video:1:0:local:Net[eth0:192.168.0.0/24]]: Port deleted
>> Jingle:Port[video:1:0:local:Net[eth0:192.168.0.0/24]]: Removed port from allocator (1 remaining)
Removed port from p2p socket: 1 remaining
Jingle:Port[video:1:0:local:Net[tun0:192.168.128.6/32]]: Port deleted
Jingle:Port[video:1:0:local:Net[tun0:192.168.128.6/32]]: Removed port from allocator (0 remaining)
Removed port from p2p socket: 0 remaining

HTML

<html lang="en">
    <head>
        <title>Web Socket Signalling</title>
        <link rel="stylesheet" href="css/socket.css">
        <script src="js/socket.js"></script>
    </head>
    <body>
        <h2>Response from Server</h2>
        <textarea id="responseText"></textarea>

        <h2>Video</h2>
        <video id="remoteVideo" autoplay></video>
    </body>
</html>

JavaScript

(function() {
  var remotePeerConnection;
  var sdpConstraints = {
    'mandatory' : {
      'OfferToReceiveAudio' : false,
      'OfferToReceiveVideo' : true
    }
  };

  var Sock = function() {
    var socket;
    if (!window.WebSocket) {
      window.WebSocket = window.MozWebSocket;
    }

    if (window.WebSocket) {
      socket = new WebSocket("ws://localhost:8080/websocket");
      socket.onopen = onopen;
      socket.onmessage = onmessage;
      socket.onclose = onclose;
    } else {
      alert("Your browser does not support Web Socket.");
    }

    function onopen(event) {
      getTextAreaElement().value = "Web Socket opened!";
    }

    function onmessage(event) {
      appendTextArea(event.data);

      // Parse the SDP offer sent by the Java server over the WebSocket
      var sdpOffer = new RTCSessionDescription(JSON.parse(event.data));

      remotePeerConnection = new webkitRTCPeerConnection(null);
      remotePeerConnection.onaddstream = gotRemoteStream;

      trace("Java Offer: \n" + sdpOffer.sdp);
      remotePeerConnection.setRemoteDescription(sdpOffer);
      remotePeerConnection.createAnswer(gotRemoteDescription, onCreateSessionDescriptionError, sdpConstraints);
    }

    function onCreateSessionDescriptionError(error) {
      console.log('Failed to create session description: ' + error.toString());
    }

    function gotRemoteDescription(answer) {
      remotePeerConnection.setLocalDescription(answer);
      trace("Browser's Answer: \n" + answer.sdp);

      // Send the SDP answer back to the Java server
      socket.send(JSON.stringify(answer));
    }

    function gotRemoteStream(event) {
      var remoteVideo = document.getElementById("remoteVideo");
      remoteVideo.src = URL.createObjectURL(event.stream);
      trace("Received remote stream");
    }

    function onclose(event) {
      appendTextArea("Web Socket closed");
    }

    function appendTextArea(newData) {
      var el = getTextAreaElement();
      el.value = el.value + '\n' + newData;
    }

    function getTextAreaElement() {
      return document.getElementById('responseText');
    }

    function trace(text) {
      console.log((performance.now() / 1000).toFixed(3) + ": " + text);
    }
  };

  window.addEventListener('load', function() {
    new Sock();
  }, false);
})();

Java Server

public class PeerConnectionManager {

   // Fields referenced below; the declarations are implied by the original post
   // (factory/peerConnection/videoSource from the WebRTC Java API, plus a Netty
   // ChannelHandlerContext for the signalling WebSocket).
   private PeerConnectionFactory factory;
   private PeerConnection peerConnection;
   private VideoSource videoSource;
   private ChannelHandlerContext webSocketContext;

   /**
    * Called when the WebSocket handshake is completed.
    */
   public void createOffer() {

      peerConnection = factory.createPeerConnection(
            new ArrayList<PeerConnection.IceServer>(),
            new MediaConstraints(),
            new PeerConnectionObserverImpl());

      // Get the video source
      videoSource = factory.createVideoSource(VideoCapturer.create(""), new MediaConstraints());

      // Create a MediaStream with one video track
      MediaStream lMS = factory.createLocalMediaStream("JavaMediaStream");
      VideoTrack videoTrack = factory.createVideoTrack("JavaMediaStream_v0", videoSource);
      videoTrack.addRenderer(new VideoRenderer(new VideoRendererObserverImpl()));
      lMS.addTrack(videoTrack);
      peerConnection.addStream(lMS, new MediaConstraints());

      // We don't want to receive anything
      MediaConstraints sdpConstraints = new MediaConstraints();
      sdpConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
            "OfferToReceiveAudio", "false"));
      sdpConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
            "OfferToReceiveVideo", "false"));

      // Get the Offer SDP
      SdpObserverImpl sdpOfferObserver = new SdpObserverImpl();
      peerConnection.createOffer(sdpOfferObserver, sdpConstraints);
      SessionDescription offerSdp = sdpOfferObserver.getSdp();

      // Set the local SDP; we don't care about the callbacks
      peerConnection.setLocalDescription(new SdpObserverImpl(), offerSdp);

      // Serialize the Offer and send it to the browser via the WebSocket
      JSONObject offerSdpJson = new JSONObject();
      offerSdpJson.put("sdp", offerSdp.description);
      offerSdpJson.put("type", offerSdp.type.canonicalForm());
      webSocketContext.channel().writeAndFlush(
            new TextWebSocketFrame(offerSdpJson.toString()));
   }

   /**
    * Called when an SDP Answer arrives via the WebSocket.
    */
   public void setRemoteDescription(SessionDescription answer) {
      peerConnection.setRemoteDescription(new SdpObserverImpl(), answer);
   }
}

Answer 1:


Ugh. Never mind. Sorry for the stupid question.

The missing part was the exchange of ICE candidates between the browser and Java server. Now that I added the code to do ICE negotiation via my WebSocket, everything works fine!
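For anyone hitting the same wall: the fix amounts to forwarding ICE candidates in both directions over the same signalling WebSocket. Below is a minimal Java-side sketch, assuming the Netty context and org.json handling shown in the question. The IceSignalling class name, the "type"/"sdpMid"/"sdpMLineIndex"/"candidate" message fields, and the wiring into PeerConnection.Observer are illustrative assumptions, not the original poster's code.

import io.netty.channel.ChannelHandlerContext;
import io.netty.handler.codec.http.websocketx.TextWebSocketFrame;
import org.json.JSONObject;
import org.webrtc.IceCandidate;
import org.webrtc.PeerConnection;

/** Hypothetical helper showing both directions of ICE signalling over the WebSocket. */
public class IceSignalling {

   /** Java -> browser: call this from PeerConnection.Observer.onIceCandidate(). */
   public static void sendLocalCandidate(ChannelHandlerContext webSocketContext,
                                         IceCandidate candidate) {
      JSONObject json = new JSONObject();
      json.put("type", "candidate");                 // message tag is an assumption
      json.put("sdpMid", candidate.sdpMid);
      json.put("sdpMLineIndex", candidate.sdpMLineIndex);
      json.put("candidate", candidate.sdp);
      webSocketContext.channel().writeAndFlush(new TextWebSocketFrame(json.toString()));
   }

   /** Browser -> Java: apply a candidate received from the page's onicecandidate handler. */
   public static void addRemoteCandidate(PeerConnection peerConnection, JSONObject json) {
      IceCandidate candidate = new IceCandidate(
            json.getString("sdpMid"),
            json.getInt("sdpMLineIndex"),
            json.getString("candidate"));
      peerConnection.addIceCandidate(candidate);
   }
}

The browser side needs the mirror image: an onicecandidate handler on the RTCPeerConnection that sends each gathered candidate over the socket, and an addIceCandidate() call for every candidate message received from the Java server.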



Source: https://stackoverflow.com/questions/20229599/webrtc-java-server-trouble
