CMSampleBufferRef

iOS - Automatically resize CVPixelBufferRef

Submitted by 旧时模样 on 2019-12-06 05:51:18
Question: I am trying to crop and scale a CMSampleBufferRef based on the user's chosen ratio. The code below takes a CMSampleBufferRef, converts it into a CVImageBufferRef, and uses the CVPixelBuffer APIs to crop the internal image based on its bytes. The goal of this process is to produce a cropped and scaled CVPixelBufferRef to write to the video: - (CVPixelBufferRef)modifyImage:(CMSampleBufferRef)sampleBuffer { @synchronized (self) { CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); // Lock the image buffer CVPixelBufferLockBaseAddress(imageBuffer, 0); // Get information about the image
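One common way to answer this without hand-rolling the byte math is to let Core Image do the crop and scale. The following is a minimal sketch, not the asker's code: `cropRect` and `outputSize` are assumed inputs, the output format is assumed to be 32BGRA, and the exact render bounds behavior of `-[CIContext render:toCVPixelBuffer:]` should be checked against the target pixel format.

```objc
// Hedged sketch: crop and scale a sample buffer's image via Core Image.
// Assumes `cropRect` lies inside the source image and a 32BGRA output.
- (CVPixelBufferRef)scaledPixelBufferFrom:(CMSampleBufferRef)sampleBuffer
                                 cropRect:(CGRect)cropRect
                               outputSize:(CGSize)outputSize {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *image = [CIImage imageWithCVPixelBuffer:imageBuffer];
    image = [image imageByCroppingToRect:cropRect];

    // Scale the cropped region to the requested output size.
    CGFloat sx = outputSize.width  / cropRect.size.width;
    CGFloat sy = outputSize.height / cropRect.size.height;
    image = [image imageByApplyingTransform:CGAffineTransformMakeScale(sx, sy)];
    // Move the extent back to the origin so it lands at (0,0) in the buffer.
    image = [image imageByApplyingTransform:
             CGAffineTransformMakeTranslation(-image.extent.origin.x,
                                              -image.extent.origin.y)];

    CVPixelBufferRef output = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault,
                        (size_t)outputSize.width, (size_t)outputSize.height,
                        kCVPixelFormatType_32BGRA, NULL, &output);
    static CIContext *context = nil;
    if (context == nil) context = [CIContext contextWithOptions:nil];
    [context render:image toCVPixelBuffer:output];
    return output; // caller releases with CVPixelBufferRelease
}
```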

Using AVAssetWriter with raw NAL Units

Submitted by 随声附和 on 2019-12-05 21:21:17
Question: I noticed in the iOS documentation for AVAssetWriterInput that you can pass nil for the outputSettings dictionary to specify that the input data should not be re-encoded: "The settings used for encoding the media appended to the output. Pass nil to specify that appended samples should not be re-encoded." I want to take advantage of this feature to pass in a stream of raw H.264 NALs, but I am having trouble adapting my raw byte streams into a CMSampleBuffer that I can pass into AVAssetWriterInput's appendSampleBuffer: method. My stream of NALs contains only SPS/PPS/IDR/P NALs (1, 5, 7, 8). I haven't
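The usual shape of the answer is: build a format description from the SPS/PPS NALs, convert each frame's NALs from Annex B start codes to AVCC length prefixes, and wrap the bytes in a block buffer plus sample buffer. A hedged sketch of the last step, where `nalData`/`nalLength` (AVCC-prefixed frame bytes) and `formatDesc` (created earlier with `CMVideoFormatDescriptionCreateFromH264ParameterSets`) are assumed to exist:

```objc
// Hedged sketch: wrap one AVCC-formatted access unit in a CMSampleBuffer.
// Caution: the block buffer references `nalData` without copying it
// (kCFAllocatorNull), so the bytes must outlive the sample buffer.
CMSampleBufferRef sampleBuffer = NULL;
CMBlockBufferRef blockBuffer = NULL;
OSStatus status = CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault,
                                                     nalData, nalLength,
                                                     kCFAllocatorNull, NULL,
                                                     0, nalLength, 0,
                                                     &blockBuffer);
if (status == kCMBlockBufferNoErr) {
    const size_t sampleSizes[] = { nalLength };
    status = CMSampleBufferCreate(kCFAllocatorDefault, blockBuffer,
                                  true, NULL, NULL,   // data ready, no callback
                                  formatDesc,
                                  1,                  // one sample
                                  0, NULL,            // timing set separately
                                  1, sampleSizes,
                                  &sampleBuffer);
}
// sampleBuffer can then go to -[AVAssetWriterInput appendSampleBuffer:]
```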

Core Image - rendering a transparent image on CMSampleBufferRef result in black box around it

Submitted by 。_饼干妹妹 on 2019-12-05 01:23:09
Question: I'm trying to add a watermark/logo to a video that I'm recording using AVFoundation's AVCaptureVideoDataOutput. My class is set as the sampleBufferDelegate and receives the CMSampleBufferRefs. I already apply some effects to each CMSampleBufferRef's CVPixelBuffer and pass it back to the AVAssetWriter. The logo in the top-left corner is delivered as a transparent PNG. The problem I'm having is that the transparent parts of the UIImage are black once written to the video. Anyone have an idea
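Black boxes in place of transparency usually mean the logo was drawn into the buffer rather than alpha-composited over it. A minimal sketch of source-over compositing with Core Image, where `pixelBuffer`, `logoUIImage`, and `ciContext` are assumed to exist in the delegate, and the PNG is assumed to carry premultiplied alpha:

```objc
// Hedged sketch: composite a transparent PNG over the camera frame with
// Core Image's source-over operator so the alpha channel is honored.
CIImage *background = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIImage *logo = [CIImage imageWithCGImage:logoUIImage.CGImage];

// Position the logo near the top-left (Core Image is bottom-left origin).
CGFloat y = background.extent.size.height - logo.extent.size.height - 16.0;
logo = [logo imageByApplyingTransform:CGAffineTransformMakeTranslation(16.0, y)];

// Source-over: logo on top, frame underneath; render back into the buffer.
CIImage *composited = [logo imageByCompositingOverImage:background];
[ciContext render:composited toCVPixelBuffer:pixelBuffer];
```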

Split CMSampleBufferRef containing Audio

Submitted by 大兔子大兔子 on 2019-12-03 22:25:03
Question: I'm splitting the recording into different files while recording... The problem is that the captureOutput video and audio sample buffers don't correspond 1:1 (which is logical): - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection AUDIO START: 36796.833236847 | DURATION: 0.02321995464852608 | END: 36796.856456802 VIDEO START: 36796.842089239 | DURATION: nan | END: nan AUDIO START: 36796
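Because an audio CMSampleBuffer packs many PCM frames while a video buffer is one frame, a clean cut usually means splitting the audio buffer that straddles the boundary. A hedged sketch using `CMSampleBufferCopySampleBufferForRange`, where `framesForFirstFile` is an assumed split point computed from the desired cut time and the buffer's presentation timestamp:

```objc
// Hedged sketch: split one audio CMSampleBuffer at a frame boundary so the
// first part closes out the old file and the rest opens the new one.
CMItemCount total = CMSampleBufferGetNumSamples(audioBuffer);
CMSampleBufferRef head = NULL, tail = NULL;
CMSampleBufferCopySampleBufferForRange(kCFAllocatorDefault, audioBuffer,
                                       CFRangeMake(0, framesForFirstFile),
                                       &head);
CMSampleBufferCopySampleBufferForRange(kCFAllocatorDefault, audioBuffer,
                                       CFRangeMake(framesForFirstFile,
                                                   total - framesForFirstFile),
                                       &tail);
// Append `head` to the finishing writer input and `tail` to the new one,
// then CFRelease both copies.
```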

Capture still UIImage without compression (from CMSampleBufferRef)?

Submitted by ▼魔方 西西 on 2019-12-03 17:28:28
I need to obtain a UIImage from the uncompressed image data in a CMSampleBufferRef. I'm using this code: captureStillImageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) { // that famous function from Apple docs found on a lot of websites // does NOT work for still images UIImage *capturedImage = [self imageFromSampleBuffer:imageSampleBuffer]; } http://developer.apple.com/library/ios/#qa/qa1702/_index.html is a link to the imageFromSampleBuffer function. But it does not work properly. :( There is a
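A likely reason the QA1702 conversion fails for stills is pixel format: that routine assumes a 32BGRA CVPixelBuffer, while AVCaptureStillImageOutput delivers JPEG-compressed buffers by default. A hedged sketch of requesting uncompressed BGRA output so the conversion applies:

```objc
// Hedged sketch: ask the still image output for uncompressed 32BGRA frames
// instead of its default JPEG, so imageFromSampleBuffer: can read the bytes.
AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
stillOutput.outputSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
};
// Add `stillOutput` to the capture session before calling
// captureStillImageAsynchronouslyFromConnection:completionHandler:.
```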

How to set timestamp of CMSampleBuffer for AVWriter writing

Submitted by 一笑奈何 on 2019-12-03 13:04:33
Question: I'm working with AVFoundation to capture and record audio. There are some issues I don't quite understand. Basically, I want to capture audio from an AVCaptureSession and write it using AVAssetWriter, but I need to shift the timestamp of the CMSampleBuffers I get from the AVCaptureSession. Reading the CMSampleBuffer documentation, I see two different timestamp terms: 'presentation timestamp' and 'output presentation timestamp'. What is the difference between the two? Say I get a CMSampleBuffer
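Shifting timestamps is typically done by copying the buffer with new timing before appending it. A hedged sketch using `CMSampleBufferCreateCopyWithNewTiming`, where `offset` is an assumed CMTime holding the desired shift:

```objc
// Hedged sketch: retime a sample buffer by subtracting `offset` from each
// presentation timestamp, then append the copy to the writer input.
CMItemCount count = 0;
CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, 0, NULL, &count);
CMSampleTimingInfo *timing = malloc(sizeof(CMSampleTimingInfo) * count);
CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, count, timing, &count);

for (CMItemCount i = 0; i < count; i++) {
    timing[i].presentationTimeStamp =
        CMTimeSubtract(timing[i].presentationTimeStamp, offset);
    timing[i].decodeTimeStamp = kCMTimeInvalid; // let Core Media infer it
}

CMSampleBufferRef retimed = NULL;
CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sampleBuffer,
                                      count, timing, &retimed);
free(timing);
// Append `retimed`, then CFRelease it.
```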

CMSampleBufferRef kCMSampleBufferAttachmentKey_TrimDurationAtStart crash

Submitted by 谁都会走 on 2019-12-01 22:41:24
This has been bothering me for a while. I have a video converter that converts video into the ".mp4" format, but there is a crash that happens on some videos and not others. Here is the crash log: *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetWriterInput appendSampleBuffer:] Cannot append sample buffer: First input buffer must have an appropriate kCMSampleBufferAttachmentKey_TrimDurationAtStart since the codec has encoder delay' Here is my code: NSURL *uploadURL = [NSURL fileURLWithPath:[[NSTemporaryDirectory() stringByAppendingPathComponent:[self getVideoName]]
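The exception asks for a trim-duration attachment on the first audio buffer, which tells the writer how many encoder-priming frames to discard. A hedged sketch, assuming an AAC stream where the commonly cited priming delay is 2112 frames and the sample rate is 44100 (both should be read from the actual stream):

```objc
// Hedged sketch: attach the encoder-delay trim duration to the first audio
// sample buffer before appending it to the AVAssetWriterInput.
CMTime trim = CMTimeMake(2112, 44100); // assumed AAC priming at 44.1 kHz
CFDictionaryRef trimDict = CMTimeCopyAsDictionary(trim, kCFAllocatorDefault);
CMSetAttachment(firstAudioBuffer,
                kCMSampleBufferAttachmentKey_TrimDurationAtStart,
                trimDict,
                kCMAttachmentMode_ShouldPropagate);
CFRelease(trimDict);
```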

Memory Leak in CMSampleBufferGetImageBuffer

Submitted by 孤街醉人 on 2019-12-01 10:37:51
I'm getting a UIImage from a CMSampleBufferRef video buffer every N video frames, like this: - (void)imageFromVideoBuffer:(void(^)(UIImage* image))completion { CMSampleBufferRef sampleBuffer = _myLastSampleBuffer; if (sampleBuffer != nil) { CFRetain(sampleBuffer); CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)]; _lastAppendedVideoBuffer.sampleBuffer = nil; if (_context == nil) { _context = [CIContext contextWithOptions:nil]; } CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer); CGImageRef cgImage = [_context createCGImage:ciImage
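The usual culprits in this pattern are the Create-rule CGImageRef from `createCGImage:` never being released, and autoreleased Core Image intermediates never draining on a background queue. A hedged sketch of the balanced version (variable names follow the excerpt; the surrounding method is assumed):

```objc
// Hedged sketch: balance every Create/Retain and drain autoreleased
// intermediates so repeated conversions do not accumulate memory.
@autoreleasepool {
    CGImageRef cgImage = [_context createCGImage:ciImage
                                        fromRect:ciImage.extent];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);   // balances createCGImage (Create rule)
    CFRelease(sampleBuffer);   // balances the earlier CFRetain
    completion(image);
}
```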