AVFoundation

Possible for AVAssetWriter to write files with transparency?

删除回忆录丶 submitted on 2019-12-01 10:52:52
Every file I write with AVAssetWriter has a black background if the images I include do not fill the entire render area. Is there any way to write with transparency? Here's the method I use to get the pixel buffer:

```objc
- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image {
    CGSize size = self.renderSize;
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer
```
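Two things typically conspire to produce the black background: the bitmap context the image is drawn into is never cleared, and the default output codec (H.264) carries no alpha channel, so even a transparent pixel buffer is flattened on encode. A minimal sketch of a buffer-creation variant that at least preserves alpha up to the encoder, assuming a BGRA pixel format (`size` and `options` as in the question; the rest is illustrative):

```objc
// Sketch: create a BGRA pixel buffer and clear it to transparent before
// drawing. Note: H.264 output discards alpha regardless; an alpha-capable
// codec (e.g. AVVideoCodecTypeHEVCWithAlpha on iOS 13+) is needed for the
// written file to actually keep transparency.
CVPixelBufferRef pxbuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height,
                    kCVPixelFormatType_32BGRA,
                    (__bridge CFDictionaryRef)options, &pxbuffer);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(
    CVPixelBufferGetBaseAddress(pxbuffer),
    size.width, size.height, 8,
    CVPixelBufferGetBytesPerRow(pxbuffer), colorSpace,
    kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
// Clear to transparent (not black) before drawing the image.
CGContextClearRect(context, CGRectMake(0, 0, size.width, size.height));
CGContextDrawImage(context,
                   CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)),
                   image);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
```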

How to record video with overlay view

让人想犯罪 __ submitted on 2019-12-01 10:45:46
Hi, I am trying to record video with an overlay. I have written -(void)addOvelayViewToVideo:(NSURL *)videoURL to add an overlay view on the recorded video, but it is not working. I wrote the code to record video in viewDidLoad using AVCaptureSession.

```objc
// In viewDidLoad
// CONFIGURE DISPLAY OUTPUT
self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
[self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
self.previewLayer.frame = self.view.frame;
[self.view.layer addSublayer:self.previewLayer];

-(void)captureOutput:(AVCaptureFileOutput *)captureOutput
```
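A common approach to burning an overlay into a recorded clip is to skip compositing at capture time and instead render the layer into the video during export with AVVideoCompositionCoreAnimationTool. A hedged sketch under the assumption that videoURL is the recorded file, overlayLayer is the overlay's CALayer, and outputURL is a caller-chosen destination (all three names are illustrative, not from the original post):

```objc
AVAsset *asset = [AVAsset assetWithURL:videoURL];

// Build a pass-through composition matching the asset's properties.
AVMutableVideoComposition *composition =
    [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
CGSize renderSize = composition.renderSize;

// Parent/video layer pair required by the Core Animation tool.
CALayer *videoLayer = [CALayer layer];
videoLayer.frame = CGRectMake(0, 0, renderSize.width, renderSize.height);
CALayer *parentLayer = [CALayer layer];
parentLayer.frame = videoLayer.frame;
overlayLayer.frame = videoLayer.frame;
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:overlayLayer]; // overlay drawn on top of the video

composition.animationTool = [AVVideoCompositionCoreAnimationTool
    videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                            inLayer:parentLayer];

AVAssetExportSession *export =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetHighestQuality];
export.videoComposition = composition;
export.outputURL = outputURL;
export.outputFileType = AVFileTypeQuickTimeMovie;
[export exportAsynchronouslyWithCompletionHandler:^{
    // Check export.status / export.error here.
}];
```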

Memory problems with [AVAssetWriterInput requestMediaDataWhenReadyOnQueue:usingBlock:]

家住魔仙堡 submitted on 2019-12-01 10:39:13
I’m writing a library to export assets to a file using AVFoundation. I create a reader, a writer, connect the inputs and outputs to these, and then call the requestMediaDataWhenReadyOnQueue method on the inputs to start pulling the data. The block callback supplied to this method looks a bit like this:

```objc
[input requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
    while ([input isReadyForMoreMediaData]) {
        CMSampleBufferRef buffer;
        // The track has some more data for us
        if ([reader status] == AVAssetReaderStatusReading &&
            (buffer = [output copyNextSampleBuffer])) {
            BOOL result = [input
```
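The usual culprit in this pattern is that copyNextSampleBuffer follows the Core Foundation "Copy" rule and returns a +1-retained buffer, so each buffer must be explicitly released after it is appended; otherwise every sample of the asset accumulates in memory for the lifetime of the export. A sketch of the loop with that release added (finish-up handling elided; this is a reconstruction, not the poster's full code):

```objc
[input requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
    while ([input isReadyForMoreMediaData]) {
        CMSampleBufferRef buffer = NULL;
        if ([reader status] == AVAssetReaderStatusReading &&
            (buffer = [output copyNextSampleBuffer])) {
            BOOL result = [input appendSampleBuffer:buffer];
            // Balance the +1 retain from copyNextSampleBuffer; without this
            // every decoded sample stays alive until the export finishes.
            CFRelease(buffer);
            if (!result) {
                [reader cancelReading];
                break;
            }
        } else {
            // No more samples (or the reader failed/completed).
            [input markAsFinished];
            // ... call -finishWriting once all inputs are marked finished.
            break;
        }
    }
}];
```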

Memory Leak in CMSampleBufferGetImageBuffer

孤街醉人 submitted on 2019-12-01 10:37:51
I'm getting a UIImage from a CMSampleBufferRef video buffer every N video frames, like:

```objc
- (void)imageFromVideoBuffer:(void(^)(UIImage *image))completion {
    CMSampleBufferRef sampleBuffer = _myLastSampleBuffer;
    if (sampleBuffer != nil) {
        CFRetain(sampleBuffer);
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
        _lastAppendedVideoBuffer.sampleBuffer = nil;
        if (_context == nil) {
            _context = [CIContext contextWithOptions:nil];
        }
        CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CGImageRef cgImage = [_context createCGImage:ciImage
```
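Two releases are easy to miss in this conversion: the CGImage returned by -[CIContext createCGImage:fromRect:] is owned by the caller and must be released with CGImageRelease, and the explicit CFRetain on the sample buffer needs a matching CFRelease. Wrapping the whole conversion in @autoreleasepool also lets intermediate CIImage/UIImage objects drain promptly under steady frame pressure. A sketch of the corrected body (a reconstruction, not the poster's exact code):

```objc
@autoreleasepool {
    CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
    CGImageRef cgImage = [_context createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);  // without this, one CGImage leaks per conversion
    CFRelease(sampleBuffer);  // balance the CFRetain taken above
    completion(image);
}
```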

AVPlayer not playing m3u8 from local file

谁说我不能喝 submitted on 2019-12-01 09:37:20
I am trying to get AVPlayer to play an m3u8 playlist that is a local file. I have narrowed this down to a simple test case using one of Apple's sample playlists: https://tungsten.aaplimg.com/VOD/bipbop_adv_fmp4_example/master.m3u8 If I play this playlist from the remote URL, AVPlayer plays it fine. However, if I download the playlist to a local file and then hand AVPlayer the local file URL, AVPlayer will not play it; it just shows the crossed-out play symbol. Interestingly enough, this can be duplicated with Safari as well: Safari will play the remote playlist but not the local file. Also
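One workaround people use for this (not from the original post) is to hand AVFoundation a custom URL scheme and supply the playlist bytes from disk through an AVAssetResourceLoaderDelegate, since the delegate is only consulted for schemes AVFoundation does not handle itself. A rough sketch under those assumptions; the scheme name and the localPlaylistPath property are hypothetical:

```objc
// Asset setup: a made-up scheme forces the resource loader to be asked.
NSURL *customURL = [NSURL URLWithString:@"custom-m3u8://localhost/master.m3u8"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:customURL options:nil];
[asset.resourceLoader setDelegate:self queue:dispatch_get_main_queue()];
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];

// AVAssetResourceLoaderDelegate method, implemented elsewhere in the class:
- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader
shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest {
    // Serve the local playlist bytes; segment URLs inside the playlist
    // must also resolve (e.g. be rewritten to the same custom scheme).
    NSData *data = [NSData dataWithContentsOfFile:self.localPlaylistPath];
    loadingRequest.contentInformationRequest.contentType = @"application/vnd.apple.mpegurl";
    [loadingRequest.dataRequest respondWithData:data];
    [loadingRequest finishLoading];
    return YES;
}
```

Note the segment files referenced by the playlist need the same treatment; serving everything from a small embedded HTTP server is an alternative design with less bookkeeping.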

SCNMaterialProperty not rendering layer

被刻印的时光 ゝ submitted on 2019-12-01 09:28:01
Question: SCNMaterialProperty's contents property on SCNMaterial is unable to render when assigned an AVPlayerLayer. Note this is only an issue on a physical device; it works fine in the simulator (Xcode 6.0.1). I am creating my SCNNode as such:

```objc
SCNNode *videoBall = [SCNNode node];
videoBall.position = SCNVector3Make(-5, 5, -18);
videoBall.geometry = [SCNSphere sphereWithRadius:5];
videoBall.geometry.firstMaterial.locksAmbientWithDiffuse = YES;
videoBall.geometry.firstMaterial.diffuse.contents = [self
```
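One workaround that renders video on SceneKit geometry on-device is to route the AVPlayer through a SpriteKit scene instead of an AVPlayerLayer, since an SKScene is a supported material contents type. A sketch under the assumption that player is an existing AVPlayer (the scene size here is arbitrary):

```objc
// Host the video in a SpriteKit scene and use that scene as the
// sphere's diffuse contents.
SKScene *videoScene = [SKScene sceneWithSize:CGSizeMake(1024, 512)];
SKVideoNode *videoNode = [SKVideoNode videoNodeWithAVPlayer:player];
videoNode.position = CGPointMake(videoScene.size.width / 2,
                                 videoScene.size.height / 2);
videoNode.size = videoScene.size;
[videoScene addChild:videoNode];
[videoNode play];

videoBall.geometry.firstMaterial.diffuse.contents = videoScene;
```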

Misaligned Sound Playback using AVPlayer and AVMutableComposition

大兔子大兔子 submitted on 2019-12-01 07:30:23
Question: I'm trying to create a song from multiple instrument samples using AVComposition. When I play two sound assets at kCMTimeZero in an AVMutableComposition, I would expect them to play at the same time, but there is a very slight offset. This only happens on the first playthrough, so it would seem to be some kind of loading delay, but no matter what I try it doesn't go away. I've made sure to preload the sound asset tracks, preload the composition tracks, and wait for the playerItem to be ready,
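One way to rule out first-play loading skew is to preroll the AVPlayer and then start playback at an explicit host time, so the composition begins against a fixed clock instead of "as soon as buffers are ready". A hedged sketch (iOS 10+ API; player is assumed to wrap the AVMutableComposition's player item and already be ready to play):

```objc
// Required for setRate:time:atHostTime: to take effect on iOS 10+.
player.automaticallyWaitsToMinimizeStalling = NO;
[player prerollAtRate:1.0 completionHandler:^(BOOL finished) {
    if (!finished) { return; }
    // Start 0.1 s in the future so playback begins on a known host
    // clock tick rather than whenever buffering happens to complete.
    CMTime hostStart = CMTimeAdd(CMClockGetTime(CMClockGetHostTimeClock()),
                                 CMTimeMake(1, 10));
    [player setRate:1.0 time:kCMTimeZero atHostTime:hostStart];
}];
```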

Is there any way to get frame by frame using AVCaptureSession object in swift?

|▌冷眼眸甩不掉的悲伤 submitted on 2019-12-01 07:24:53
Question: I have to process frames captured by the iPhone camera using my C++ functions. So I use the startRunning() function to start the flow of data, but in what way can I process each frame?

Answer 1: Yes, it is pretty straightforward. You need to:

1. Create an AVCaptureVideoDataOutput object to produce video frames
2. Implement a delegate for the AVCaptureVideoDataOutput object to process video frames
3. In the delegate class, implement the method (captureOutput:didOutputSampleBuffer:fromConnection:) that is
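The three steps above can be sketched as follows (in Objective-C for consistency with the rest of the page; the Swift version is structurally identical). captureSession is assumed to be an already-configured AVCaptureSession, and self adopts AVCaptureVideoDataOutputSampleBufferDelegate:

```objc
// Step 1: create the data output and request BGRA frames, which are
// convenient to hand to C++ image-processing code.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings =
    @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

// Step 2: deliver frames to self on a serial background queue.
dispatch_queue_t frameQueue =
    dispatch_queue_create("frame.processing", DISPATCH_QUEUE_SERIAL);
[videoOutput setSampleBufferDelegate:self queue:frameQueue];
if ([captureSession canAddOutput:videoOutput]) {
    [captureSession addOutput:videoOutput];
}

// Step 3: the delegate callback, invoked once per captured frame.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    // Hand the raw BGRA bytes (baseAddress, bytesPerRow) to the C++
    // processing function here, before unlocking.
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}
```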
