rtsp:// live stream with AVPlayer

Submitted by 两盒软妹~` on 2021-01-27 10:33:05

Question


I want to play a live stream on an iPhone device with AVPlayer. I also want to get a CVPixelBufferRef from this stream for later use.

I used Apple's guide for creating the player. With locally stored video files the player works just fine, and when I try to play Apple's sample stream URL - http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8 - it works fine too.

Problems appear when I want to play an rtsp:// stream like this one: rtsp://192.192.168.1:8227/TTLS/Streaming/channels/2?videoCodecType=H.264

The code - almost all of it follows the guide provided by Apple, but anyway:

Prepare asset for playing

- (void)initialSetupWithURL:(NSURL *)url
{
    NSDictionary *assetOptions = @{ AVURLAssetPreferPreciseDurationAndTimingKey : @YES,
                                    AVURLAssetReferenceRestrictionsKey : @(AVAssetReferenceRestrictionForbidNone)};
    self.urlAsset = [AVURLAsset URLAssetWithURL:url options:assetOptions];
}

Prepare player

- (void)prepareToPlay
{
    NSArray *keys = @[@"tracks"];
    __weak SPHVideoPlayer *weakSelf = self;
    [weakSelf.urlAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [weakSelf startLoading];
        });
    }];
}

- (void)startLoading
{
    NSError *error;
    AVKeyValueStatus status = [self.urlAsset statusOfValueForKey:@"tracks" error:&error];
    if (status == AVKeyValueStatusLoaded) {
        self.assetDuration = CMTimeGetSeconds(self.urlAsset.duration);
        NSDictionary* videoOutputOptions = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};
        self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:videoOutputOptions];
        self.playerItem = [AVPlayerItem playerItemWithAsset: self.urlAsset];

        [self.playerItem addObserver:self
                          forKeyPath:@"status"
                             options:NSKeyValueObservingOptionInitial
                             context:&ItemStatusContext];
        [self.playerItem addObserver:self
                          forKeyPath:@"loadedTimeRanges"
                             options:NSKeyValueObservingOptionNew|NSKeyValueObservingOptionOld
                             context:&ItemStatusContext];
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(playerItemDidReachEnd:)
                                                     name:AVPlayerItemDidPlayToEndTimeNotification
                                                   object:self.playerItem];
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(didFailedToPlayToEnd)
                                                     name:AVPlayerItemFailedToPlayToEndTimeNotification
                                                   object:nil];

        [self.playerItem addOutput:self.videoOutput];

        self.assetPlayer = [AVPlayer playerWithPlayerItem:self.playerItem];

        [self addPeriodicalObserver];
        NSLog(@"Player created");
    } else {
        NSLog(@"The asset's tracks were not loaded:\n%@", error.localizedDescription);
    }
}

The problem appears here - AVKeyValueStatus status = [self.urlAsset statusOfValueForKey:@"tracks" error:&error]; - with an rtsp:// URL this line returns AVKeyValueStatusFailed

with the error:

Printing description of error:
Error Domain=AVFoundationErrorDomain Code=-11800 
"The operation could not be completed" 
UserInfo=0x7fd1ea5a8a10 {NSLocalizedFailureReason=An unknown error occurred (-12936), 
NSLocalizedDescription=The operation could not be completed,
NSURL=rtsp://192.168.1.168:8556/PSIA/Streaming/channels/2?videoCodecType=H.264,
NSUnderlyingError=0x7fd1ea53f830 "The operation couldn’t be completed.
(OSStatus error -12936.)"}

I also looked at these questions:

  1. FirstOne - I tried Apple's sample app StitchedStreamPlayer for this stream, but got a few different errors for the streams I want to play there
  2. SecondOne - I tried both suggestions - fileURLWithPath returns an incorrect URL like: rtsp://192.168.1.168:8556/PSIA/Streaming/channels/2?videoCodecType=H.264 --file:/// - so I guess it's incorrect
  3. Following AppleDev, I tried to create the AVPlayer with different approaches: [AVPlayer playerWithPlayerItem:playerItem] and [AVPlayer playerWithURL:url] - nothing changed. I also tried different settings for the AVAsset - in initialSetupWithURL (see the method implementation above).

So, the question is: does AVPlayer support playing an rtsp:// stream? If yes, can someone provide a sample of correct usage? And what am I doing wrong in my code? If AVPlayer does not support rtsp://, does some alternative solution exist?


Answer 1:


I didn't find a way to do RTSP streaming via AVURLAsset, but a good starting point can be found here. Maybe it will be useful for someone.




Answer 2:


Have you tried MobileVLCKit? It's really easy and works well! I wrote a small example here.

If you want to try it, just type pod try ONVIFCamera in your terminal.

Here is how to do it:

var mediaPlayer = VLCMediaPlayer()

// Associate the movieView to the VLC media player
mediaPlayer.drawable = self.movieView

let url = URL(string: "rtsp://IP_ADDRESS:PORT/params")
let media = VLCMedia(url: url)
mediaPlayer.media = media

mediaPlayer.play()
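
For a fuller picture, here is a minimal, self-contained sketch built around the snippet above, assuming MobileVLCKit has been added via CocoaPods; the StreamViewController class, the movieView property, and the placeholder URL are illustrative assumptions, not part of the original answer:

import UIKit
import MobileVLCKit

// Minimal sketch: StreamViewController, movieView and the placeholder URL are
// assumptions; only the VLCMediaPlayer calls mirror the answer above.
class StreamViewController: UIViewController {
    private let movieView = UIView()           // VLC renders the video into this view
    private let mediaPlayer = VLCMediaPlayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        movieView.frame = view.bounds
        view.addSubview(movieView)

        // Associate the movieView to the VLC media player and start the stream
        mediaPlayer.drawable = movieView
        if let url = URL(string: "rtsp://IP_ADDRESS:PORT/params") {
            mediaPlayer.media = VLCMedia(url: url)
            mediaPlayer.play()
        }
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        mediaPlayer.stop()                     // stop streaming when leaving the screen
    }
}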



Answer 3:


Basically, it is possible to segment the RTSP stream into small MP4 containers and push the containers into AVPlayer using a customized URL asset. Here is an experiment; it still needs some work for smooth transitions between chunks, but you can see the idea here: https://github.com/MaximKomlev/RTSPAVPlayer
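
To make the idea concrete, here is a minimal sketch of the custom-resource-loader approach in Swift. It is not the actual code from that repository: the FragmentLoader class, the rtsp-chunks:// scheme, and the chunks buffer are hypothetical, and the fragmented MP4 data is assumed to come from some external RTSP segmenter.

import AVFoundation

// Sketch only: FragmentLoader, the rtsp-chunks:// scheme and the chunks buffer
// are hypothetical. A real loader would also fill in
// loadingRequest.contentInformationRequest and keep pending requests around
// until the segmenter delivers more data.
final class FragmentLoader: NSObject, AVAssetResourceLoaderDelegate {
    var chunks: [Data] = []   // fragmented MP4 chunks produced by an external RTSP segmenter

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let dataRequest = loadingRequest.dataRequest else { return false }
        let available = chunks.reduce(Data(), +)
        let offset = Int(dataRequest.requestedOffset)
        guard offset < available.count else { return true }   // nothing yet: leave the request pending
        let length = min(dataRequest.requestedLength, available.count - offset)
        dataRequest.respond(with: available.subdata(in: offset..<offset + length))
        loadingRequest.finishLoading()
        return true
    }
}

// A custom URL scheme forces AVFoundation to route all loading through the delegate,
// so AVPlayer never has to understand rtsp:// itself.
let loader = FragmentLoader()
let asset = AVURLAsset(url: URL(string: "rtsp-chunks://camera/stream.mp4")!)
asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "chunk-loader"))
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))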



Source: https://stackoverflow.com/questions/29371400/rtsp-livestream-with-avplayer
