avfoundation

Need hints for using iPhone SDK's AVMutableVideoComposition

北战南征 submitted on 2019-12-04 12:20:25
The AVFoundation framework provides the AVMutableVideoComposition class (the mutable variant of AVVideoComposition). It looks like you can render Core Animation layers directly into an instance of this class to create a video, but I don't know how to save the composition to a file, or how to work with it at all, really. The following code, called from a UIViewController, appears to create the composition and the animation, but then I'm stumped as to how to work with the composition. Any help or guidance is greatly appreciated.

static AVMutableVideoComposition *videoComposition = nil;

- (void …
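A minimal sketch of the usual save path, assuming an already-loaded source AVAsset and a configured layer tree (the parentLayer/videoLayer names here are placeholders, not from the question): Core Animation content is baked into a file by attaching an AVVideoCompositionCoreAnimationTool to the composition and handing both to an AVAssetExportSession.

import AVFoundation

// Sketch: export `asset` to `outputURL`, compositing Core Animation layers
// on top of the video. `videoLayer` must be a sublayer of `parentLayer`.
func exportWithAnimation(asset: AVAsset, videoLayer: CALayer, parentLayer: CALayer,
                         outputURL: URL, completion: @escaping (Error?) -> Void) {
    let videoComposition = AVMutableVideoComposition(propertiesOf: asset)
    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetHighestQuality) else {
        return // preset not available for this asset
    }
    session.outputURL = outputURL
    session.outputFileType = .mov
    session.videoComposition = videoComposition
    session.exportAsynchronously {
        completion(session.error) // nil on success
    }
}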

Save AVCaptureVideoDataOutput to movie file using AVAssetWriter in Swift

断了今生、忘了曾经 submitted on 2019-12-04 12:09:07
Question: I've been looking all over the web and can't seem to find a tutorial or help for what I need. Using AVFoundation and the Dlib library, I've created an app that can detect a face in real-time video using the front camera on the phone. I'm doing this using the Shape Predictor 68 Face Landmarks model. For this to work, I'm pretty sure I have to use AVCaptureVideoDataOutput as opposed to AVCaptureMovieFileOutput so that each frame can be analysed. I now want to be able to save the video to a file, and from what I …
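A minimal sketch of the AVAssetWriter route, assuming 640x480 H.264 output and a writable outputURL (both assumptions, not from the question): the same CMSampleBuffers the data-output delegate already receives for landmark detection can be appended to a writer input.

import AVFoundation

// Sketch: record the frames that AVCaptureVideoDataOutput delivers.
final class BufferRecorder {
    private var writer: AVAssetWriter!
    private var input: AVAssetWriterInput!

    func start(outputURL: URL) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 640,   // assumed frame size
            AVVideoHeightKey: 480
        ])
        input.expectsMediaDataInRealTime = true
        writer.add(input)
    }

    // Call from captureOutput(_:didOutput:from:) after the Dlib pass.
    func append(_ sampleBuffer: CMSampleBuffer) {
        let time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        if writer.status == .unknown {
            writer.startWriting()
            writer.startSession(atSourceTime: time) // first frame defines t = 0
        }
        if writer.status == .writing, input.isReadyForMoreMediaData {
            input.append(sampleBuffer)
        }
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}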

Movement by a single frame in CMTime and AVFoundation

﹥>﹥吖頭↗ submitted on 2019-12-04 12:00:22
Question: I'm attempting to play a video with AVFoundation. I am using the following code for a button that advances the playback by one frame. It works intermittently: on some executions it will do the right thing and advance one frame, but most times I will have to press the button 3 or 4 times before it will advance a frame. This makes me think it is some kind of precision issue, but I can't figure out what it is. Each time it is run, the new CMTime appears to be advancing by the same amount. My …
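A likely culprit, though the question's code is cut off, is seek tolerance: AVPlayer's plain seek(to:) may land anywhere within a tolerance window, so several small advances can resolve to the same frame. A minimal sketch of two precise alternatives:

import AVFoundation

// Sketch: step exactly one frame (pauses playback first).
func advanceOneFrame(player: AVPlayer) {
    player.pause()
    player.currentItem?.step(byCount: 1)
}

// Sketch: or seek with zero tolerance so the landing time is exact.
// `frameDuration` is assumed to come from the video track's minFrameDuration.
func seekOneFrame(player: AVPlayer, frameDuration: CMTime) {
    let target = CMTimeAdd(player.currentTime(), frameDuration)
    player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero)
}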

Cropping AVAsset video with AVFoundation not working iOS 8

狂风中的少年 submitted on 2019-12-04 11:57:27
This has been bugging me for the last day. I used to use this method in Objective-C to crop videos into a square; it seems to be the only method I've found in a few years that worked, but after recently trying to crop with it in Swift and iOS 8 it doesn't seem to crop the video at all. Hopefully somebody can help?

func captureOutput(captureOutput: AVCaptureFileOutput!,
                   didFinishRecordingToOutputFileAtURL outputFileURL: NSURL!,
                   fromConnections connections: [AnyObject]!,
                   error: NSError!) {
    if error != nil {
        println("Error outputting recording")
    } else {
        self.writeVideoToAssetsLibrary(self.outputUrl!.copy() …
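For comparison, a minimal modern-Swift sketch of the square crop itself (a generic layer-instruction approach, not the question's original method; preferredTransform handling is omitted for brevity): set the composition's render size to the square and translate the track so its centre lands inside it.

import AVFoundation

// Sketch: build a square video composition centred on the source frame.
func squareComposition(for asset: AVAsset) -> AVMutableVideoComposition? {
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }
    let side = min(track.naturalSize.width, track.naturalSize.height)

    let composition = AVMutableVideoComposition()
    composition.renderSize = CGSize(width: side, height: side)
    composition.frameDuration = CMTime(value: 1, timescale: 30) // assumed 30 fps

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)

    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    let tx = (track.naturalSize.width - side) / 2   // shift so the centre
    let ty = (track.naturalSize.height - side) / 2  // square stays visible
    layerInstruction.setTransform(CGAffineTransform(translationX: -tx, y: -ty), at: .zero)

    instruction.layerInstructions = [layerInstruction]
    composition.instructions = [instruction]
    return composition
}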

iOS AVFoundation audio/video out of sync

[亡魂溺海] submitted on 2019-12-04 11:52:50
Question: The problem: during every playback, the audio is between 1 and 2 seconds behind the video. The setup: the assets are loaded with AVURLAssets from a media stream. To write the composition, I'm using AVMutableComposition and AVMutableCompositionTrack objects with asymmetric timescales. The audio and video are both streamed to the device; the timescale for audio is 44100, and the timescale for video is 600. Playback is done with AVPlayer. Attempted solutions: using videoAssetTrack.timeRange for …
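The asymmetric timescales themselves should be harmless, since CMTime arithmetic rescales internally; drift of this kind usually comes from the two tracks being inserted over mismatched time ranges. A minimal sketch, assuming both assets are already loaded, of inserting audio and video over the same range starting at zero:

import AVFoundation

// Sketch: one composition, both tracks anchored at .zero over the same range.
func makeComposition(videoAsset: AVAsset, audioAsset: AVAsset) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    let range = CMTimeRange(start: .zero, duration: videoAsset.duration)

    if let videoTrack = videoAsset.tracks(withMediaType: .video).first,
       let compVideo = composition.addMutableTrack(withMediaType: .video,
                                                   preferredTrackID: kCMPersistentTrackID_Invalid) {
        try compVideo.insertTimeRange(range, of: videoTrack, at: .zero)
    }
    if let audioTrack = audioAsset.tracks(withMediaType: .audio).first,
       let compAudio = composition.addMutableTrack(withMediaType: .audio,
                                                   preferredTrackID: kCMPersistentTrackID_Invalid) {
        try compAudio.insertTimeRange(range, of: audioTrack, at: .zero)
    }
    return composition
}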

iOS - Automatically resize CVPixelBufferRef

泪湿孤枕 submitted on 2019-12-04 11:25:28
I am trying to crop and scale a CMSampleBufferRef based on the user's input ratio. The code below takes a CMSampleBufferRef, converts it into a CVImageBufferRef, and uses CVPixelBuffer calls to crop the internal image based on its bytes. The goal of this process is to have a cropped and scaled CVPixelBufferRef to write to the video.

- (CVPixelBufferRef)modifyImage:(CMSampleBufferRef)sampleBuffer {
    @synchronized (self) {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock the image buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        // Get information about the image …
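As an alternative to walking the buffer's bytes by hand, a minimal Core Image sketch (a different technique from the question's code; cropRect, outputSize, and the shared CIContext are assumed inputs) that crops and scales into a fresh CVPixelBuffer:

import AVFoundation
import CoreImage

// Sketch: crop `sampleBuffer` to `cropRect`, scale to `outputSize`.
func croppedBuffer(from sampleBuffer: CMSampleBuffer, cropRect: CGRect,
                   outputSize: CGSize, ciContext: CIContext) -> CVPixelBuffer? {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    var image = CIImage(cvPixelBuffer: imageBuffer).cropped(to: cropRect)
    let scale = outputSize.width / cropRect.width
    image = image.transformed(by: CGAffineTransform(scaleX: scale, y: scale))
    // Move the extent's origin back to (0, 0) so the render fills the buffer.
    image = image.transformed(by: CGAffineTransform(translationX: -image.extent.origin.x,
                                                    y: -image.extent.origin.y))

    var output: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, Int(outputSize.width), Int(outputSize.height),
                        kCVPixelFormatType_32BGRA, nil, &output)
    guard let outBuffer = output else { return nil }
    ciContext.render(image, to: outBuffer)
    return outBuffer
}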

merging videos together (AVFoundation)

≯℡__Kan透↙ submitted on 2019-12-04 10:03:27
Question: In my app, I'm recording small videos and adding them to an NSMutableArray as AVAssets so that I keep a record of what has been captured. When the user presses a button to merge them, the final result is only the first video taken (for example, if three short videos were taken, the final result after merging is only the first video; the others do not appear). My code for iterating over the NSMutableArray and stitching the videos together is here:

if (self.capturedVideos.count != 0) {
    //Create …
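The iteration itself is cut off above, but the symptom described (only the first clip survives) is the classic result of inserting every clip at time zero, so later clips overwrite earlier ones. A minimal sketch, assuming the array holds AVAssets, that advances the insertion point after each clip:

import AVFoundation

// Sketch: append each clip where the previous one ended.
func merge(_ capturedVideos: [AVAsset]) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    guard let track = composition.addMutableTrack(withMediaType: .video,
                                                  preferredTrackID: kCMPersistentTrackID_Invalid)
    else { return composition }

    var cursor = CMTime.zero
    for asset in capturedVideos {
        guard let source = asset.tracks(withMediaType: .video).first else { continue }
        try track.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                  of: source, at: cursor)
        cursor = CMTimeAdd(cursor, asset.duration) // move past the clip just added
    }
    return composition
}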

AVAudioPlayer stop a sound and play it from the beginning

夙愿已清 submitted on 2019-12-04 09:57:30
Question: I used AVAudioPlayer to play a 10-second WAV file and it works fine. Now I want to stop the WAV at the 4th second and then play it again from the very first second. Here is the code I tried:

NSString *ahhhPath = [[NSBundle mainBundle] pathForResource:@"Ahhh" ofType:@"wav"];
AVAudioPlayer *ahhhhhSound = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:ahhhPath] error:NULL];
[ahhhhhSound stop];
[ahhhhhSound play];

What I get is, the WAV stops at the 4th second, but when I run the [XXX …
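Two things are worth noting: stop() only halts playback without rewinding, and allocating a fresh player on every tap means stop() is sent to a player that never started. A minimal sketch, assuming the player lives in a property:

import AVFoundation

// Sketch: keep one player around, rewind explicitly before replaying.
var ahhhhhSound: AVAudioPlayer? // loaded once from the "Ahhh.wav" resource

func restartSound() {
    guard let player = ahhhhhSound else { return }
    player.stop()
    player.currentTime = 0 // stop() does not rewind; reset to the start
    player.play()
}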

Exporting AVCaptureSession video in a size that matches the preview layer

余生颓废 submitted on 2019-12-04 09:34:34
Question: I'm recording video using AVCaptureSession with the session preset AVCaptureSessionPreset640x480. I'm using an AVCaptureVideoPreviewLayer at a non-standard size (300 x 300) with the gravity set to aspect fill while recording. It's set up like this:

self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
_previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
_previewLayer.frame = _previewView.bounds; // 300 x 300
[_previewView.layer addSublayer: …
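One way to find the crop that matches the preview, sketched here in Swift under the assumption that the preview layer is still available at export time: the layer can report the normalized region of the capture frame it actually displays, which converts directly into a pixel rect for a video composition.

import AVFoundation

// Sketch: pixel rect of the 640x480 frame that the aspect-fill layer shows.
func visibleCropRect(previewLayer: AVCaptureVideoPreviewLayer,
                     videoSize: CGSize) -> CGRect {
    // Normalized (0...1) rect, in the capture output's coordinate space.
    let normalized = previewLayer.metadataOutputRectConverted(fromLayerRect: previewLayer.bounds)
    return CGRect(x: normalized.origin.x * videoSize.width,
                  y: normalized.origin.y * videoSize.height,
                  width: normalized.size.width * videoSize.width,
                  height: normalized.size.height * videoSize.height)
}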

Switching Camera with a button in Swift

帅比萌擦擦* submitted on 2019-12-04 09:06:46
This seems to work to switch the camera from the back to the front, but I'm trying to come up with an if statement so that I can switch it back too. Any ideas or advice?

@IBAction func didTouchSwitchButton(sender: UIButton) {
    let camera = getDevice(.Front)
    let cameraBack = getDevice(.Back)
    do {
        input = try AVCaptureDeviceInput(device: camera)
    } catch let error as NSError {
        print(error)
        input = nil
    }
    if captureSession?.canAddInput(input) == true {
        captureSession?.addInput(input)
        stillImageOutput?.outputSettings = [AVVideoCodecKey : AVVideoCodecJPEG]
        if captureSession?.canAddOutput …
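A minimal toggle sketch in current Swift (it swaps the question's getDevice helper for the modern AVCaptureDevice.default API, and assumes captureSession and input are properties): the if can key off the position of the device behind the current input.

import AVFoundation

final class CameraController {
    var captureSession: AVCaptureSession?
    var input: AVCaptureDeviceInput?

    // Sketch: one button toggles between front and back.
    func toggleCamera() {
        guard let session = captureSession, let current = input else { return }
        let newPosition: AVCaptureDevice.Position =
            (current.device.position == .back) ? .front : .back

        session.beginConfiguration()
        session.removeInput(current)
        if let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                for: .video, position: newPosition),
           let newInput = try? AVCaptureDeviceInput(device: device),
           session.canAddInput(newInput) {
            session.addInput(newInput)
            input = newInput
        }
        session.commitConfiguration()
    }
}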