AVCapture appendSampleBuffer

一向 2021-02-01 10:46

I am going insane with this one - I have looked everywhere and tried everything I can think of.

I am making an iPhone app that uses AVFoundation - specifically

2 Answers
  •  情书的邮戳
    2021-02-01 11:26

    You need an AVAssetWriterInputPixelBufferAdaptor; here is the code to create it:

    // Create dictionary for pixel buffer adaptor
    NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey, nil];

    // Create pixel buffer adaptor
    m_pixelsBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput:assetWriterInput sourcePixelBufferAttributes:bufferAttributes];
    

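    For reference, the assetWriterInput used above comes from an AVAssetWriter. A minimal sketch of that setup, assuming you are writing 640x480 H.264 video to a hypothetical outputURL (adjust the settings to your needs):

    // Create the asset writer (outputURL is a placeholder for your destination file URL)
    NSError *error = nil;
    AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                                           fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];

    // Create and attach a video input; codec and dimensions are up to you
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:640], AVVideoWidthKey,
                                   [NSNumber numberWithInt:480], AVVideoHeightKey,
                                   nil];
    AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                              outputSettings:videoSettings];
    [assetWriter addInput:assetWriterInput];

    // Start writing before appending any pixel buffers
    [assetWriter startWriting];
    [assetWriter startSessionAtSourceTime:kCMTimeZero];
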
    And the code to use it :

    // If ready to have more media data
    if (m_pixelsBufferAdaptor.assetWriterInput.readyForMoreMediaData) {
        // Create a pixel buffer
        CVPixelBufferRef pixelsBuffer = NULL;
        CVPixelBufferPoolCreatePixelBuffer(NULL, m_pixelsBufferAdaptor.pixelBufferPool, &pixelsBuffer);
    
        // Lock pixel buffer address
        CVPixelBufferLockBaseAddress(pixelsBuffer, 0);
    
        // Create your function to write your pixel data into the buffer (in your case, fill it with your finalImage data; see the Core Graphics sketch below)
        [self yourFunctionToPutDataInPixelBuffer:CVPixelBufferGetBaseAddress(pixelsBuffer)];
    
        // Unlock pixel buffer address
        CVPixelBufferUnlockBaseAddress(pixelsBuffer, 0);
    
        // Append the pixel buffer (compute currentFrameTime as needed; the simplest approach is to start at zero and advance by one frame duration, i.e. the inverse of your frame rate, each time you write a frame; see the CMTime sketch below)
        [m_pixelsBufferAdaptor appendPixelBuffer:pixelsBuffer withPresentationTime:currentFrameTime];
    
        // Release pixel buffer
        CVPixelBufferRelease(pixelsBuffer);
    }
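
    The buffer-filling helper is left up to you above; one possible sketch uses Core Graphics to draw a CGImage (e.g. your finalImage) into the buffer. This variant takes the whole CVPixelBufferRef rather than just its base address so it can query the width, height and bytes-per-row itself; the method name and image parameter are placeholders:

    - (void)fillPixelBuffer:(CVPixelBufferRef)pixelsBuffer withImage:(CGImageRef)image {
        // The caller must have locked the buffer already (CVPixelBufferLockBaseAddress)
        void *baseAddress = CVPixelBufferGetBaseAddress(pixelsBuffer);

        // Wrap the buffer memory in a BGRA bitmap context and draw the image into it
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(baseAddress,
                                                     CVPixelBufferGetWidth(pixelsBuffer),
                                                     CVPixelBufferGetHeight(pixelsBuffer),
                                                     8,
                                                     CVPixelBufferGetBytesPerRow(pixelsBuffer),
                                                     colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGContextDrawImage(context,
                           CGRectMake(0, 0, CVPixelBufferGetWidth(pixelsBuffer), CVPixelBufferGetHeight(pixelsBuffer)),
                           image);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
    }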
    
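    For currentFrameTime, assuming a fixed frame rate (30 fps here) and a hypothetical frameCount counter that you increment after each appended frame, a simple CMTime calculation looks like this:

    // Frame N of a 30 fps movie is presented at N/30 seconds
    int32_t framesPerSecond = 30;
    CMTime currentFrameTime = CMTimeMake(frameCount, framesPerSecond);
    frameCount++;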

    And don't forget to release your pixelsBufferAdaptor.
