UIImage created from CMSampleBufferRef not displayed in UIImageView?

难免孤独 2020-12-13 05:21

I'm trying to display a UIImage in real time coming from the camera, and it seems that my UIImageView is not displaying the image properly. This is the method which a…

5 Answers
  • 2020-12-13 05:29

    Ben Loulier has a good write-up on how to do this.

    I am using his example app as a starting point, and it is working for me. Besides replacing the imageFromSampleBuffer function with something that creates a CGImageRef via CGBitmapContextCreate, note that he uses the main dispatch queue (via dispatch_get_main_queue()) when setting the output sample buffer delegate. This isn't the best approach: the delegate needs a serial queue, and I'm not convinced the main queue is the right place for that work, so you may not be guaranteed to get the frames in the correct order. It seems to work for me so far, though :) A sketch of using a dedicated serial queue instead follows below.
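
    As a minimal sketch (the output variable name videoDataOutput and self as the delegate are assumptions for illustration), the delegate could be given its own serial queue like this:

    // Minimal sketch: hand the sample buffer delegate a dedicated serial queue
    // instead of the main queue.
    dispatch_queue_t captureQueue = dispatch_queue_create("com.example.capturequeue", DISPATCH_QUEUE_SERIAL);
    [videoDataOutput setSampleBufferDelegate:self queue:captureQueue];

    Any UI work, such as setting the UIImageView's image, then still has to be dispatched back to the main thread, as one of the answers below shows.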

  • 2020-12-13 05:37

    Live capture of video frames is now well explained by Apple's Technical Q&A QA1702:

    https://developer.apple.com/library/ios/#qa/qa1702/_index.html
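
    For quick reference, the pipeline that QA1702 walks through boils down to roughly the following. This is only a minimal sketch with error handling omitted; the pixel format and delegate configuration are covered in the other answers here:

    #import <AVFoundation/AVFoundation.h>

    // Minimal sketch of the capture pipeline described in QA1702 (error handling omitted).
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    [session addInput:input];

    AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    // Configure the pixel format and the sample buffer delegate as shown in the other answers.
    [session addOutput:videoDataOutput];

    [session startRunning];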

  • 2020-12-13 05:38

    Another thing to look out for is whether you're actually updating your UIImageView on the main thread: if you aren't, chances are it won't reflect any changes.

    The captureOutput:didOutputSampleBuffer:fromConnection: delegate method is usually called on a background queue, so you want to do something like this:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {   
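        // Retain the sample buffer: it is only guaranteed to stay valid for the duration
        // of this delegate call, and the block below runs asynchronously on the main queue.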
        CFRetain(sampleBuffer);
    
        [[NSOperationQueue mainQueue] addOperationWithBlock:^{
    
            //Now we're definitely on the main thread, so update the imageView:
            UIImage *capturedImage = [self imageFromSampleBuffer:sampleBuffer];
    
            //Display the image currently being captured:
            imageView.image = capturedImage;
    
            CFRelease(sampleBuffer);
        }];
    }
    
  • 2020-12-13 05:39

    I had the same problem ... but I found this old post, and its method of creating the CGImageRef works!

    http://forum.unity3d.com/viewtopic.php?p=300819

    Here's a working sample:

    app has a member UIImageView *theImage;
    
    // Delegate routine that is called when a sample buffer was written
    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer  fromConnection:(AVCaptureConnection *)connection
    {
            //... just an example of how to get an image out of this ...
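        // (Assumes the sample buffer delegate queue is the main queue; otherwise dispatch
        // the UI update below back to the main thread, as shown in the answer above.)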
    
        CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];
        theImage.image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
    }
    
    - (CGImageRef) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer // Create a CGImageRef from sample buffer data
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
        CVPixelBufferLockBaseAddress(imageBuffer,0);        // Lock the image buffer 
    
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);   // Get the pixel data (BGRA buffers are non-planar, so use the plain base address)
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
        size_t width = CVPixelBufferGetWidth(imageBuffer); 
        size_t height = CVPixelBufferGetHeight(imageBuffer); 
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
    
        CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
        CGImageRef newImage = CGBitmapContextCreateImage(newContext); 
        CGContextRelease(newContext); 
    
        CGColorSpaceRelease(colorSpace); 
        CVPixelBufferUnlockBaseAddress(imageBuffer,0); 
        /* CVBufferRelease(imageBuffer); */  // do not call this: the image buffer is owned by the sample buffer
    
        return newImage;
    }
    
  • 2020-12-13 05:44

    It is also important to set the right output pixel format. I had problems capturing images when I used the default format settings. It should be:

    [videoDataOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey]];
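
    In context, a minimal sketch of configuring the output could look like this (session, self as the delegate, and the queue name are assumptions for illustration):

    // Minimal sketch: configure the video data output for BGRA before adding it to the session.
    AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [videoDataOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA] forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey]];
    [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];   // drop late frames instead of queueing them
    dispatch_queue_t captureQueue = dispatch_queue_create("com.example.capturequeue", DISPATCH_QUEUE_SERIAL);
    [videoDataOutput setSampleBufferDelegate:self queue:captureQueue];
    [session addOutput:videoDataOutput];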
    