Custom video capture native webrtc

◇◆丶佛笑我妖孽 submitted on 2020-06-29 04:38:18

Question


According to a topic in the discuss-webrtc Google group, cricket::VideoCapturer will be deprecated soon. To customize a video source, we should implement VideoTrackSourceInterface. I tried implementing the interface, but it didn't work. When I have a frame, I call OnFrame(const webrtc::VideoFrame& frame) as follows:

void StreamSource::OnFrame(const webrtc::VideoFrame& frame)
{
 rtc::scoped_refptr<webrtc::VideoFrameBuffer> buffer(frame.video_frame_buffer());
 broadcaster_.OnFrame(frame);

}

In conductor.cc, in AddStreams(), I create a video source with the following code:

rtc::scoped_refptr<webrtc::VideoTrackInterface> video_track(
    peer_connection_factory_->CreateVideoTrack(kVideoLabel, new mystream::StreamSource()));

My video does not play in the browser. What am I doing wrong?


Answer 1:


I used the base class rtc::AdaptedVideoTrackSource and created a method OnFrameCaptured that is called from my capture thread; in that method I call OnFrame. It works fine!

 class StreamSource : public rtc::AdaptedVideoTrackSource
 {
  public:
   void OnFrameCaptured(const webrtc::VideoFrame& frame);
 };

 void StreamSource::OnFrameCaptured(const webrtc::VideoFrame& frame)
 {
   OnFrame(frame);
 }
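
For context, here is a minimal sketch (not part of the original answer) of how a capture thread could feed such a class. It assumes raw, tightly packed I420 planes coming from some capturer; PushRawI420 and its parameters are made up for illustration, while I420Buffer::Copy and VideoFrame::Builder are standard WebRTC calls (api/video/i420_buffer.h, api/video/video_frame.h):

// Hypothetical helper called from the capture thread with raw I420 plane
// data. The strides below assume tightly packed planes.
void PushRawI420(StreamSource* source,
                 const uint8_t* data_y, const uint8_t* data_u,
                 const uint8_t* data_v, int width, int height,
                 int64_t timestamp_us)
{
  // Copy the planes into a reference-counted I420 buffer owned by WebRTC.
  rtc::scoped_refptr<webrtc::I420Buffer> buffer = webrtc::I420Buffer::Copy(
      width, height,
      data_y, width,
      data_u, (width + 1) / 2,
      data_v, (width + 1) / 2);

  // Wrap the buffer in a VideoFrame and push it into the track source.
  source->OnFrameCaptured(webrtc::VideoFrame::Builder()
                              .set_video_frame_buffer(buffer)
                              .set_timestamp_us(timestamp_us)
                              .set_rotation(webrtc::kVideoRotation_0)
                              .build());
}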



Answer 2:


I got the answer in the Google group.

VideoFrameBuffer has an enum Type:

class VideoFrameBuffer : public rtc::RefCountInterface {
 public:
  // New frame buffer types will be added conservatively when there is an
  // opportunity to optimize the path between some pair of video source and
  // video sink.
  enum class Type {
    kNative,
    kI420,
    kI420A,
    kI444,
    kI010,
  };
  ...
 };

Then, when you create a VideoFrame, you set the buffer type to kNative. If you find a better way, please share it.
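
To illustrate what a kNative buffer can look like, here is a minimal sketch (not from the original answer) of a VideoFrameBuffer subclass wrapping an application-specific handle. The handle and the conversion inside ToI420() are assumptions; only the overridden interface methods are the real WebRTC API:

// Sketch of a native buffer. It reports Type::kNative and converts to I420
// only when an encoder or sink that cannot handle the native type asks.
class MyNativeBuffer : public webrtc::VideoFrameBuffer {
 public:
  MyNativeBuffer(void* native_handle, int width, int height)
      : native_handle_(native_handle), width_(width), height_(height) {}

  Type type() const override { return Type::kNative; }
  int width() const override { return width_; }
  int height() const override { return height_; }

  rtc::scoped_refptr<webrtc::I420BufferInterface> ToI420() override {
    rtc::scoped_refptr<webrtc::I420Buffer> i420 =
        webrtc::I420Buffer::Create(width_, height_);
    // Hypothetical: convert the pixels behind native_handle_ into i420 here.
    return i420;
  }

 private:
  void* const native_handle_;
  const int width_;
  const int height_;
};

Because VideoFrameBuffer is reference counted, you would instantiate it as new rtc::RefCountedObject<MyNativeBuffer>(handle, width, height) and pass it to VideoFrame::Builder::set_video_frame_buffer().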




Answer 3:


To elaborate on user1658843's answer: create a custom video source class and define all the abstract methods. Here is an example:

class CustomVideoSource : public rtc::AdaptedVideoTrackSource  {

public:
    void OnFrameCaptured(const webrtc::VideoFrame& frame);
    void AddRef() const override;
    rtc::RefCountReleaseStatus Release() const override;
    SourceState state() const override;
    bool remote() const override;
    bool is_screencast() const override;
    absl::optional<bool> needs_denoising() const override;
private:
    mutable volatile int ref_count_;
};

And the implementations:

void CustomVideoSource::OnFrameCaptured(const webrtc::VideoFrame& frame) {
  OnFrame(frame);
}

void CustomVideoSource::AddRef() const {
  rtc::AtomicOps::Increment(&ref_count_);
}

rtc::RefCountReleaseStatus CustomVideoSource::Release() const {
  const int count = rtc::AtomicOps::Decrement(&ref_count_);
  if (count == 0) {
    return rtc::RefCountReleaseStatus::kDroppedLastRef;
  }
  return rtc::RefCountReleaseStatus::kOtherRefsRemained;
}

webrtc::MediaSourceInterface::SourceState CustomVideoSource::state() const {
  return kLive;
}

bool CustomVideoSource::remote() const {
  return false;
}

bool CustomVideoSource::is_screencast() const {
  return false;
}

absl::optional<bool> CustomVideoSource::needs_denoising() const {
  return false;
}

Keep in mind this is just enough to get it working, not a full implementation. You should implement the abstract methods properly instead of returning hard-coded values. To send a frame, simply call OnFrameCaptured with the frame.

To add the stream:

custom_source = new rtc::RefCountedObject<CustomVideoSource>();
// create a video track from our custom source
rtc::scoped_refptr<webrtc::VideoTrackInterface> custom_video_track(
    g_peer_connection_factory->CreateVideoTrack(kVideoLabel, custom_source));
// add it to the stream
stream->AddTrack(custom_video_track);
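
As a usage illustration (not from the original answer), a simple driver for this source could look like the sketch below; the 640x480 black frames and the ~30 fps timing are placeholders, used only to show where OnFrameCaptured is called:

#include <atomic>
#include <chrono>
#include <thread>

// Hypothetical capture loop: pushes a black 640x480 frame roughly 30 times a
// second into the custom source until running is cleared.
void CaptureLoop(rtc::scoped_refptr<CustomVideoSource> source,
                 std::atomic<bool>& running) {
  int64_t timestamp_us = 0;
  while (running) {
    rtc::scoped_refptr<webrtc::I420Buffer> buffer =
        webrtc::I420Buffer::Create(640, 480);
    webrtc::I420Buffer::SetBlack(buffer.get());  // placeholder pixel data
    source->OnFrameCaptured(webrtc::VideoFrame::Builder()
                                .set_video_frame_buffer(buffer)
                                .set_timestamp_us(timestamp_us)
                                .build());
    timestamp_us += 33333;
    std::this_thread::sleep_for(std::chrono::milliseconds(33));
  }
}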

I'm not an expert, just doing a project on my own and implementing things along the way. Feel free to correct me or add to this code.



Source: https://stackoverflow.com/questions/49131317/custom-video-capture-native-webrtc
