AVAssetWriter

Create video from images of camera roll - iOS SDK

谁都会走 submitted on 2019-12-02 01:03:06
I have used the following code to create a video from images. It works fine when I select an image from the camera roll that was downloaded from the web or is a screenshot, but images taken with the camera appear zoomed in in the resulting movie. I don't know what is wrong with the camera images. Can anyone please help me resolve this issue? -(IBAction)createV:(id)sender { NSString *documentsDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"]; NSString *videoOutputPath = [documentsDirectory stringByAppendingPathComponent:@"test_output.mp4"]; CGSize imageSize =
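The snippet above is cut off, but a frequent cause of camera photos looking zoomed in is drawing the full-resolution photo into a fixed-size pixel buffer without scaling it to fit. Below is a minimal, hypothetical sketch of a helper that aspect-fits a UIImage into the writer's render size; the method name, the renderSize parameter, and the orientation note are assumptions layered on the original code, not part of it.

    // Hypothetical helper: aspect-fit a UIImage into a CVPixelBuffer of renderSize.
    // Assumes the caller hands the returned buffer to an AVAssetWriterInputPixelBufferAdaptor.
    - (CVPixelBufferRef)pixelBufferFromImage:(UIImage *)image renderSize:(CGSize)renderSize {
        NSDictionary *options = @{ (id)kCVPixelBufferCGImageCompatibilityKey : @YES,
                                   (id)kCVPixelBufferCGBitmapContextCompatibilityKey : @YES };
        CVPixelBufferRef pxbuffer = NULL;
        CVPixelBufferCreate(kCFAllocatorDefault, (size_t)renderSize.width, (size_t)renderSize.height,
                            kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)options, &pxbuffer);

        CVPixelBufferLockBaseAddress(pxbuffer, 0);
        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pxbuffer),
                                                     (size_t)renderSize.width, (size_t)renderSize.height, 8,
                                                     CVPixelBufferGetBytesPerRow(pxbuffer),
                                                     rgbColorSpace,
                                                     (CGBitmapInfo)kCGImageAlphaNoneSkipFirst);

        // Scale the photo so it fits inside the render size instead of drawing it at its
        // native pixel dimensions (which makes large camera photos look zoomed in).
        // Camera photos may additionally need a rotation for image.imageOrientation.
        CGFloat srcW = CGImageGetWidth(image.CGImage);
        CGFloat srcH = CGImageGetHeight(image.CGImage);
        CGFloat scale = MIN(renderSize.width / srcW, renderSize.height / srcH);
        CGRect drawRect = CGRectMake((renderSize.width - srcW * scale) / 2.0,
                                     (renderSize.height - srcH * scale) / 2.0,
                                     srcW * scale, srcH * scale);
        CGContextDrawImage(context, drawRect, image.CGImage);

        CGColorSpaceRelease(rgbColorSpace);
        CGContextRelease(context);
        CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
        return pxbuffer;
    }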

AVAssetWriter with green screen or chroma key

不想你离开。 submitted on 2019-12-01 12:08:35
Is it possible to composite green screen images (an animated actor against a green background) with a backdrop photo and make a video of that using AVAssetWriter on the iPhone? I have an application that creates a sequence of screenshots of an animated character against a green background. I'd like to composite those with a photograph from the user's library. Is there some way to composite the two into a video on the iPhone? Thanks. Answer: Yes, there is. I just added a chroma key filter to my GPUImage framework, which should let you do realtime green screen effects from camera, image, or movie sources.
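Since the answer points at GPUImage, here is a minimal, hypothetical sketch of keying a single green-screen frame over a backdrop photo with GPUImageChromaKeyBlendFilter; the variable names are assumptions, each animation frame would be fed through in turn, and a GPUImageMovieWriter target would be added after the filter to get a video file out.

    // Hedged sketch: composite one green-screen frame over a backdrop photo with GPUImage.
    GPUImagePicture *foreground = [[GPUImagePicture alloc] initWithImage:greenScreenFrame]; // actor on green
    GPUImagePicture *background = [[GPUImagePicture alloc] initWithImage:backdropPhoto];    // library photo

    GPUImageChromaKeyBlendFilter *chromaKey = [[GPUImageChromaKeyBlendFilter alloc] init];
    [chromaKey setColorToReplaceRed:0.0 green:1.0 blue:0.0]; // key out pure green
    chromaKey.thresholdSensitivity = 0.4;                    // tune for the footage

    // The order of addTarget: calls decides which input is keyed and which shows
    // through; swap them if the backdrop ends up keyed instead of the actor.
    [foreground addTarget:chromaKey];
    [background addTarget:chromaKey];

    [foreground processImage];
    [background processImage];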

Simulate AVLayerVideoGravityResizeAspectFill: crop and center video to mimic preview without losing sharpness

感情迁移 submitted on 2019-12-01 12:04:17
Based on this SO post, the code below rotates, centers, and crops a video captured live by the user. The capture session uses AVCaptureSessionPresetHigh for the preset value, and the preview layer uses AVLayerVideoGravityResizeAspectFill for video gravity. This preview is extremely sharp. The exported video, however, is not as sharp, ostensibly because scaling from the 1920x1080 resolution of the 5S back camera down to 320x568 (the target size for the exported video) introduces fuzziness from throwing away pixels? Assuming there is no way to scale from 1920x1080 to 320x568 without some
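For reference, the aspect-fill behaviour of the preview layer can be reproduced at export time with a scale-and-center transform; this is a hedged sketch of that math for an AVMutableVideoCompositionLayerInstruction, where srcSize, targetSize, layerInstruction and videoComposition are assumed placeholders and the 90-degree rotation needed for portrait capture is left out for brevity.

    // Hedged sketch: reproduce AVLayerVideoGravityResizeAspectFill as a composition transform.
    CGSize srcSize = CGSizeMake(1920.0, 1080.0);   // assumed source dimensions
    CGSize targetSize = CGSizeMake(320.0, 568.0);  // assumed export render size

    CGFloat scale = MAX(targetSize.width / srcSize.width, targetSize.height / srcSize.height);
    CGFloat tx = (targetSize.width  - srcSize.width  * scale) / 2.0;  // negative: crops the overflow evenly
    CGFloat ty = (targetSize.height - srcSize.height * scale) / 2.0;

    CGAffineTransform fill = CGAffineTransformConcat(CGAffineTransformMakeScale(scale, scale),
                                                     CGAffineTransformMakeTranslation(tx, ty));
    [layerInstruction setTransform:fill atTime:kCMTimeZero];
    videoComposition.renderSize = targetSize;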

Possible for AVAssetWriter to write files with transparency?

删除回忆录丶 submitted on 2019-12-01 10:52:52
Every file I write with AVAssetWriter has a black background if the images I include do not fill the entire render area. Is there any way to write with transparency? Here's the method I use to get the pixel buffer: - (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image { CGSize size = self.renderSize; NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey, [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil]; CVPixelBufferRef pxbuffer = NULL; CVPixelBufferPoolCreatePixelBuffer
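The pixel buffer method is cut off above, but two details matter for the transparency question: the bitmap context needs an alpha channel (e.g. kCGImageAlphaPremultipliedFirst instead of kCGImageAlphaNoneSkipFirst) and should be cleared to transparent before drawing. Even then, common export codecs such as H.264 do not store alpha, so the file will still play back over black unless a codec that keeps an alpha channel is used. A hedged sketch of the relevant lines, reusing pxbuffer, size and image from the original method:

    // Hedged sketch: keep alpha in the bitmap context that fills the pixel buffer.
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pxbuffer),
                                                 (size_t)size.width, (size_t)size.height, 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace,
                                                 (CGBitmapInfo)kCGImageAlphaPremultipliedFirst);
    CGContextClearRect(context, CGRectMake(0, 0, size.width, size.height)); // transparent, not black
    CGContextDrawImage(context,
                       CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)),
                       image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);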

Help me understand CMTime in AVAssetWriter

大憨熊 submitted on 2019-12-01 01:35:20
I'm having a hard time understanding how to convert a stream of motion JPEG at 30fps to a video file using AVAssetWriter. The part I'm not getting is the [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime] method. How do I calculate the withPresentationTime value if I want to output 30fps MPEG-4 video? The video source is a camera that streams 30fps motion JPEG in real time. Appreciate any ideas. Thanks. Answer (Steve McFarlin): You will need to generate a CMTime structure using CMTimeMake. You will need to increment the time by 1/30 of a second for each frame. Here is a sketch: CMTime
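The answer above is truncated; the CMTimeMake idea for a constant 30 fps stream looks roughly like this (a sketch, where adaptor and buffer come from the original question and frameNumber is an assumed counter the capture loop maintains):

    // Hedged sketch: presentation time for frame N of a constant 30 fps stream.
    static int64_t frameNumber = 0;                    // incremented once per appended frame
    CMTime presentTime = CMTimeMake(frameNumber, 30);  // frameNumber * (1/30) second
    [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
    frameNumber++;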

AVCaptureSession only gets one video sample buffer

为君一笑 submitted on 2019-11-30 23:23:20
I am trying to capture video and audio from the iPhone camera and output a video file with AVAssetWriter, but the output video file only contains the first frame, plus the audio. I have inspected the AVCaptureSession delegate method - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection { and it seems the delegate only gets one video sample buffer at the start, then receives audio sample buffers all the time, like the following log: - Video SampleBuffer captured! - Audio SampleBuffer captured! - Audio
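Whatever the underlying cause turns out to be, the delegate has to tell the two streams apart and only append when the corresponding writer input is ready; a hedged sketch of that pattern, where videoOutput, audioOutput, videoWriterInput and audioWriterInput are assumed properties set up elsewhere:

    // Hedged sketch: route video and audio sample buffers to separate writer inputs.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        if (captureOutput == self.videoOutput && self.videoWriterInput.readyForMoreMediaData) {
            [self.videoWriterInput appendSampleBuffer:sampleBuffer];
        } else if (captureOutput == self.audioOutput && self.audioWriterInput.readyForMoreMediaData) {
            [self.audioWriterInput appendSampleBuffer:sampleBuffer];
        }
        // If only the first video frame ever arrives, check that the video delegate queue
        // is not blocked and that sample buffers are not retained longer than necessary;
        // a starved buffer pool stops AVCaptureVideoDataOutput from delivering new frames.
    }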

AVAssetWriterInput and readyForMoreMediaData

谁说胖子不能爱 submitted on 2019-11-30 20:10:33
Is AVAssetWriterInput's readyForMoreMediaData updated on a background thread? If readyForMoreMediaData is NO, can I block the main thread and wait until the value changes to YES? I'm using an AVAssetWriterInput by pushing data to it (i.e. without using requestMediaDataWhenReadyOnQueue) and I've set expectsMediaDataInRealTime, and 99.9% of the time I can just call appendSampleBuffer (or appendPixelBuffer) on it as fast as my app can generate frames. This works fine unless you put the device (an iPhone 3GS) to sleep for 15 minutes or so in the middle of an AVAssetWriter session. After waking up
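One pattern that avoids blocking the main thread forever is a bounded poll of readyForMoreMediaData before each append; this is a hedged sketch (writerInput is an assumed AVAssetWriterInput, and the documented pull model is requestMediaDataWhenReadyOnQueue:usingBlock:, which this question deliberately avoids):

    // Hedged sketch: bounded wait for readyForMoreMediaData off the main thread.
    static BOOL WaitUntilInputReady(AVAssetWriterInput *writerInput, NSTimeInterval timeout) {
        NSDate *deadline = [NSDate dateWithTimeIntervalSinceNow:timeout];
        while (!writerInput.readyForMoreMediaData) {
            if ([deadline timeIntervalSinceNow] < 0) {
                return NO;                        // give up instead of blocking forever
            }
            [NSThread sleepForTimeInterval:0.01]; // let the writer's internal queue drain
        }
        return YES;
    }

    // Usage: only append when the input became ready within half a second.
    // if (WaitUntilInputReady(self.writerInput, 0.5)) {
    //     [self.writerInput appendSampleBuffer:sampleBuffer];
    // }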

Change AVMetadataItem

江枫思渺然 submitted on 2019-11-30 14:34:56
I have some files: .m4a, .mp4, .mp3, etc. I want to change these AVMetadataItem details: AVMetadataCommonKeyArtwork and AVMetadataCommonKeyArtist. I can do it with AVAssetExportSession, but then I need to change the directory. Is there a way I can write directly to the file, please? I found this program, but it does not work: NSError *error; AVAssetWriter *assetWrtr = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:self.path] fileType:AVFileTypeAppleM4A error:&error]; NSLog(@"%@",error); NSArray *existingMetadataArray = assetWrtr.metadata; NSMutableArray *newMetadataArray = nil; if
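For context, AVAssetWriter only creates new files and cannot rewrite an existing one in place, so the usual route is still AVAssetExportSession with a metadata array, exporting to a temporary URL and then replacing the original file. A hedged sketch, where self.path comes from the question and outputURL and the artist string are assumptions:

    // Hedged sketch: re-export the file with an updated artist metadata item.
    AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:self.path]];
    AVAssetExportSession *export =
        [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetPassthrough];
    export.outputFileType = AVFileTypeAppleM4A;
    export.outputURL = outputURL;   // temporary URL; move it over self.path when the export succeeds

    AVMutableMetadataItem *artist = [AVMutableMetadataItem metadataItem];
    artist.keySpace = AVMetadataKeySpaceCommon;
    artist.key = AVMetadataCommonKeyArtist;
    artist.value = @"New Artist";
    export.metadata = @[ artist ];

    [export exportAsynchronouslyWithCompletionHandler:^{
        if (export.status == AVAssetExportSessionStatusCompleted) {
            // Replace the file at self.path with the one at outputURL.
        }
    }];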