Why does AVSampleBufferDisplayLayer stop showing CMSampleBuffers taken from AVCaptureVideoDataOutput's delegate?


Question


I want to display some CMSampleBuffers with an AVSampleBufferDisplayLayer, but it freezes after showing the first sample.

I get the sample buffers from the AVCaptureVideoDataOutputSampleBufferDelegate callback:

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CFRetain(sampleBuffer);
    [self imageToBuffer:sampleBuffer];
    CFRelease(sampleBuffer);
}

and put them into a vector:

-(void)imageToBuffer:(CMSampleBufferRef)source {
    // buffers is defined as: std::vector<CMSampleBufferRef> buffers;
    CMSampleBufferRef newRef;
    CMSampleBufferCreateCopy(kCFAllocatorDefault, source, &newRef);
    buffers.push_back(newRef);
}
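
Note that CMSampleBufferCreateCopy follows the Core Foundation "Create" rule, so each stored copy is +1 retained and eventually needs a matching CFRelease. A minimal cleanup sketch (not part of the original post) could be:

// Hypothetical cleanup helper, not in the question's code: release each
// copied buffer once playback is finished, then empty the vector.
-(void)releaseBuffers {
    for (CMSampleBufferRef buf : buffers) {
        CFRelease(buf);
    }
    buffers.clear();
}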

I then try to show them via an AVSampleBufferDisplayLayer (in another view controller):

AVSampleBufferDisplayLayer *displayLayer = [[AVSampleBufferDisplayLayer alloc] init];
displayLayer.bounds = self.view.bounds;
displayLayer.position = CGPointMake(CGRectGetMidX(self.displayOnMe.bounds), CGRectGetMidY(self.displayOnMe.bounds));
displayLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
displayLayer.backgroundColor = [[UIColor greenColor] CGColor];

[self.view.layer addSublayer:displayLayer];
self.view.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;

dispatch_queue_t queue = dispatch_queue_create("My queue", DISPATCH_QUEUE_SERIAL);
[displayLayer setNeedsDisplay];
[displayLayer requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
    while ([displayLayer isReadyForMoreMediaData]) {
        if (samplesKey < buffers.size()) {
            CMSampleBufferRef buf = buffers[samplesKey];
            [displayLayer enqueueSampleBuffer:buf];
            samplesKey++;
        } else {
            [displayLayer stopRequestingMediaData];
            break;
        }
    }
}];

But it shows the first sample, then freezes and does nothing more.

And my video data output settings are as follows:

//set up our output
self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
dispatch_queue_t queue = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL);
[_videoDataOutput setSampleBufferDelegate:self queue:queue];
[_videoDataOutput setVideoSettings:[NSDictionary dictionaryWithObjectsAndKeys:
                                                [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],(id)kCVPixelBufferPixelFormatTypeKey,
                                                nil]]; 

Answer 1:


I came across this problem in the same context, trying to take the output from AVCaptureVideoDataOutput and display it in an AVSampleBufferDisplayLayer.

If your frames come out in display order, then the fix is very easy: just set the display-immediately flag on the CMSampleBufferRef.

Get the sample buffer returned by the delegate and then...

CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, YES);
CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);

CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);
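
Applied to the question's pipeline, this might look roughly like the following sketch. Where to set the flag is an assumption here; this version sets it on the stored copy right after CMSampleBufferCreateCopy:

-(void)imageToBuffer:(CMSampleBufferRef)source {
    CMSampleBufferRef copy = NULL;
    if (CMSampleBufferCreateCopy(kCFAllocatorDefault, source, &copy) != noErr || copy == NULL) {
        return;
    }

    // Get the per-sample attachments array (creating it if necessary) and mark
    // the buffer so the layer displays it as soon as it is enqueued.
    CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(copy, YES);
    if (attachments != NULL && CFArrayGetCount(attachments) > 0) {
        CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
        CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);
    }

    buffers.push_back(copy);
}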

If your frames come out in encoder order (not display order), then the timestamps on the CMSampleBuffer need to be zero-biased and restamped such that the first frame's timestamp is equal to time 0.

double pts = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer));

// ptsStart is equal to the first frame's presentationTimeStamp so playback starts from time 0.
CMTime presentationTimeStamp = CMTimeMake((pts - ptsStart) * 1000000, 1000000);

CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, presentationTimeStamp);
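
The answer does not show where ptsStart comes from; one way it might be captured (a sketch, with ptsStart and havePtsStart as hypothetical instance variables) is to record the PTS of the first buffer that arrives:

// Remember the very first frame's PTS, then rebase every frame against it.
double pts = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer));
if (!havePtsStart) {
    ptsStart = pts;
    havePtsStart = YES;
}

CMTime rebased = CMTimeMake((pts - ptsStart) * 1000000, 1000000);
CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, rebased);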

Update:

I ran into a situation where some video still wasn't playing smoothly when I used the zero-bias method, so I investigated further. The correct answer seems to be to use the PTS of the first frame you intend to play.

My answer is posted under the question linked below, but I will repeat it here, too.

Set rate at which AVSampleBufferDisplayLayer renders sample buffers

The timebase needs to be set to the presentation timestamp (PTS) of the first frame you intend to decode. I was indexing the PTS of the first frame to 0 by subtracting the initial PTS from all subsequent PTS values and setting the timebase to 0. For whatever reason, that didn't work with certain video.

You want something like this (called before a call to decode):

CMTimebaseRef controlTimebase;
CMTimebaseCreateWithMasterClock( CFAllocatorGetDefault(), CMClockGetHostTimeClock(), &controlTimebase );

displayLayer.controlTimebase = controlTimebase;

// Set the timebase to the initial pts here
CMTimebaseSetTime(displayLayer.controlTimebase, CMTimeMake(ptsInitial, 1));
CMTimebaseSetRate(displayLayer.controlTimebase, 1.0);

Set the output PTS on the CMSampleBuffer...

CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, presentationTimeStamp);

And maybe make sure the display-immediately flag isn't set...

CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanFalse);

This is covered very briefly in WWDC 2014 Session 513.
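
Putting the timebase approach together with the enqueue loop from the question, the overall flow might look roughly like this. It is a sketch, not the answer's verbatim code: buffers, samplesKey and displayLayer follow the question's names, buffers is assumed to be non-empty, and each stored buffer is assumed to already carry the output PTS set as above.

// Anchor a host-time timebase at the first stored buffer's output PTS,
// then feed buffers to the layer as it asks for more media data.
CMTimebaseRef controlTimebase = NULL;
CMTimebaseCreateWithMasterClock(CFAllocatorGetDefault(), CMClockGetHostTimeClock(), &controlTimebase);
displayLayer.controlTimebase = controlTimebase;

CMTime firstPTS = CMSampleBufferGetOutputPresentationTimeStamp(buffers.front());
CMTimebaseSetTime(displayLayer.controlTimebase, firstPTS);
CMTimebaseSetRate(displayLayer.controlTimebase, 1.0);

dispatch_queue_t queue = dispatch_queue_create("My queue", DISPATCH_QUEUE_SERIAL);
[displayLayer requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
    while ([displayLayer isReadyForMoreMediaData]) {
        if (samplesKey < buffers.size()) {
            [displayLayer enqueueSampleBuffer:buffers[samplesKey]];
            samplesKey++;
        } else {
            [displayLayer stopRequestingMediaData];
            break;
        }
    }
}];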



Source: https://stackoverflow.com/questions/28700206/why-avsamplebufferdisplaylayer-stops-showing-cmsamplebuffers-taken-from-avcaptur
