avfoundation

Making CIContext.render(CIImage, CVPixelBuffer) work with AVAssetWriter

為{幸葍}努か submitted on 2019-12-06 07:47:34
Question: I want to use Core Image to process a bunch of CGImage objects and turn them into a QuickTime movie on macOS. The following code demonstrates what's needed, but the output contains a lot of blank (black) frames: import AppKit import AVFoundation import CoreGraphics import Foundation import CoreVideo import Metal // Video output url. let url: URL = try! FileManager.default.url(for: .downloadsDirectory, in: .userDomainMask, appropriateFor: nil, create: false).appendingPathComponent("av
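A common cause of blank frames in this setup is appending a pixel buffer that Core Image never finished rendering into, or allocating buffers outside the writer's pool. Below is a minimal sketch of the usual pattern, assuming an `AVAssetWriterInputPixelBufferAdaptor` and a `CIContext` have been created elsewhere; the helper name is hypothetical.

```swift
import AVFoundation
import CoreImage

// Hypothetical helper: draw a CIImage into a buffer from the adaptor's
// own pool, then append it. `adaptor` and `ciContext` are assumed to be
// configured elsewhere (the pool exists only after startWriting).
func append(_ image: CIImage, at time: CMTime,
            to adaptor: AVAssetWriterInputPixelBufferAdaptor,
            using ciContext: CIContext) -> Bool {
    guard let pool = adaptor.pixelBufferPool else { return false }
    var buffer: CVPixelBuffer?
    guard CVPixelBufferPoolCreatePixelBuffer(nil, pool, &buffer) == kCVReturnSuccess,
          let pixelBuffer = buffer else { return false }
    // render(_:to:) is synchronous and locks the buffer internally.
    ciContext.render(image, to: pixelBuffer)
    return adaptor.append(pixelBuffer, withPresentationTime: time)
}
```

Buffers drawn from `adaptor.pixelBufferPool` already match the writer's format, which avoids the silent format mismatches that often show up as black frames.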

How to record video and play audio at the same time (swift tutorial)

喜你入骨 submitted on 2019-12-06 05:57:37
Question: So you want to record a video and play music from the user's library at the same time? Look no further; the answer is below. Answer 1: For audio playback you will use AVAudioPlayer. All you have to do is declare the AVAudioPlayer as a global variable (I named it audioPlayer) and implement the code below. Use this after the user chooses the song he/she wants to play: func mediaPicker(mediaPicker: MPMediaPickerController, didPickMediaItems mediaItemCollection: MPMediaItemCollection) { let
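Playing music while the camera records also requires the audio session to allow mixing; by default, activating recording interrupts other audio. A minimal sketch, assuming a modern Swift API level (the option set shown is one plausible combination, not the answer's exact code):

```swift
import AVFoundation

// Configure the shared audio session so music playback and audio/video
// recording can run simultaneously instead of interrupting each other.
func configureSessionForPlaybackWhileRecording() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            options: [.mixWithOthers, .defaultToSpeaker])
    try session.setActive(true)
}
```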

iOS - Automatically resize CVPixelBufferRef

旧时模样 submitted on 2019-12-06 05:51:18
Question: I am trying to crop and scale a CMSampleBufferRef based on the user's input ratio. The code below takes a CMSampleBufferRef, converts it into a CVImageBufferRef, and uses CVPixelBuffer to crop the internal image based on its bytes. The goal of this process is to have a cropped and scaled CVPixelBufferRef to write to the video - (CVPixelBufferRef)modifyImage:(CMSampleBufferRef) sampleBuffer { @synchronized (self) { CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); //
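Manipulating pixel bytes by hand is fragile (stride and plane layout vary per device). A sketch of an alternative using Core Image to do the crop and scale; `context` and `outputBuffer` are assumed to be created elsewhere, with `outputBuffer` already at the target size:

```swift
import AVFoundation
import CoreImage

// Crop and scale a camera frame with Core Image instead of raw bytes.
// Assumes a CIContext and a destination CVPixelBuffer (e.g. from a pool).
func cropAndScale(_ sampleBuffer: CMSampleBuffer,
                  cropRect: CGRect,
                  into outputBuffer: CVPixelBuffer,
                  context: CIContext) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let cropped = CIImage(cvPixelBuffer: imageBuffer).cropped(to: cropRect)
    let scaleX = CGFloat(CVPixelBufferGetWidth(outputBuffer)) / cropRect.width
    let scaleY = CGFloat(CVPixelBufferGetHeight(outputBuffer)) / cropRect.height
    // Shift the crop origin to (0,0), then scale to the destination size.
    let scaled = cropped
        .transformed(by: CGAffineTransform(translationX: -cropRect.origin.x,
                                           y: -cropRect.origin.y))
        .transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))
    context.render(scaled, to: outputBuffer)
}
```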

iOS Swift : Error Domain=NSOSStatusErrorDomain Code=-12792?

╄→гoц情女王★ submitted on 2019-12-06 05:29:54
I'm trying to get video thumbnails with the following code: let asset = AVAsset(URL: url) let imageGenerator = AVAssetImageGenerator(asset: asset) imageGenerator.appliesPreferredTrackTransform = true do { let cgImage = try imageGenerator.copyCGImageAtTime(CMTimeMake(1, 30), actualTime: nil) let uiImage = UIImage(CGImage: cgImage) imageview.image = uiImage } catch let error as NSError { print("Image generation failed with error \(error)") } Sometimes it works and sometimes it doesn't, showing the following error: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed
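The same code in current Swift naming looks like the sketch below; the intermittent -11800/-12792 errors often come from assets that aren't fully downloaded or readable yet, so the failure path is kept explicit rather than assumed away.

```swift
import AVFoundation
import UIKit

// Thumbnail extraction with the post-Swift-3 API names. Returns nil on
// failure so the caller can retry (e.g. once the asset is playable).
func thumbnail(for url: URL) -> UIImage? {
    let asset = AVAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    do {
        let cgImage = try generator.copyCGImage(
            at: CMTimeMake(value: 1, timescale: 30), actualTime: nil)
        return UIImage(cgImage: cgImage)
    } catch {
        print("Image generation failed with error \(error)")
        return nil
    }
}
```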

Objective-C : No Matter what I do CIDetector is always nil

旧巷老猫 submitted on 2019-12-06 05:16:20
Trying to get a simple proof of concept going with Apple's face detection API. I've looked at a couple of other examples, including Apple's SquareCam and https://github.com/jeroentrappers/FaceDetectionPOC . Based on these, it seems like I am following the correct pattern to get the APIs going, but I am stuck. No matter what I do, the CIDetector for my face detector is always nil! I would seriously appreciate any help, clues, hints, or suggestions! -(void)initCamera{ session = [[AVCaptureSession alloc]init]; AVCaptureDevice *device; /* if([self frontCameraAvailable]){ device = [self
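`CIDetector(ofType:context:options:)` returns nil when the type string or options dictionary is malformed, so it's worth checking against the documented constants. A minimal sketch (in Swift, though the question is Objective-C; the constants map one-to-one):

```swift
import CoreImage

// The documented way to build a face detector. Passing nil for the
// context is valid; CIDetectorAccuracy is the documented options key.
let detector = CIDetector(ofType: CIDetectorTypeFace,
                          context: nil,
                          options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
```

If this still yields nil, the problem is usually a misspelled type constant (it must be exactly `CIDetectorTypeFace`) rather than the capture-session setup shown in the question.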

How to apply reverb filter or any other sound effect to a .wav sound file?

霸气de小男生 submitted on 2019-12-06 05:03:47
I need to apply a reverb filter to a sound file in my iPad app. I found the keyword AVMetadataID3MetadataKeyReverb in Apple's documentation, but could not work out how to use it. It has been available since iOS 4.0. The AVMetadataID3MetadataKeyReverb constant represents the RVRB frame of an ID3v2 tag - which is simply a piece of metadata that's part of an audio container file (like MP3). The constant isn't related to applying an actual reverb effect to a piece of audio data, but to identifying different parts of ID3 tags when using AV Foundation to retrieve them from an audio file...
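For actually applying reverb (as opposed to reading ID3 metadata), the modern route is `AVAudioEngine` with an `AVAudioUnitReverb` node; the question predates that API, so this is a sketch under today's frameworks, with illustrative preset and mix values:

```swift
import AVFoundation

// Play a .wav file through a reverb effect node. The engine graph is
// player -> reverb -> main mixer; preset and wetDryMix are illustrative.
func playWithReverb(fileURL: URL) throws {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let reverb = AVAudioUnitReverb()
    reverb.loadFactoryPreset(.largeHall)
    reverb.wetDryMix = 50 // 0 = dry only, 100 = wet only

    engine.attach(player)
    engine.attach(reverb)

    let file = try AVAudioFile(forReading: fileURL)
    engine.connect(player, to: reverb, format: file.processingFormat)
    engine.connect(reverb, to: engine.mainMixerNode, format: file.processingFormat)

    player.scheduleFile(file, at: nil)
    try engine.start()
    player.play()
}
```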

Output Video Size Huge Using HEVC Encoder on iOS

时光毁灭记忆、已成空白 submitted on 2019-12-06 04:57:21
I have a project that currently uses the H.264 encoder to record video on iOS. I wanted to try using the new HEVC encoder in iOS 11 to reduce file sizes, but have found that using the HEVC encoder causes file sizes to balloon enormously. Here's a project on GitHub that shows the issue - it simultaneously writes frames from the camera to files using the H.264 and H.265 (HEVC) encoders, and the resulting file sizes are printed to the console. The AVFoundation classes are set up like this: class VideoWriter { var avAssetWriterInput: AVAssetWriterInput var avAssetWriter: AVAssetWriter init() { if
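One plausible cause is leaving the compression properties unspecified, letting the HEVC encoder choose a high default bit rate. A sketch of explicit output settings pinning the average bit rate (the dimensions and rate below are illustrative, not the project's values):

```swift
import AVFoundation

// Explicit HEVC writer settings with a capped average bit rate, so the
// encoder doesn't fall back to a large default.
let hevcSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.hevc, // iOS 11+
    AVVideoWidthKey: 1920,
    AVVideoHeightKey: 1080,
    AVVideoCompressionPropertiesKey: [
        AVVideoAverageBitRateKey: 6_000_000 // bits per second, illustrative
    ]
]
let input = AVAssetWriterInput(mediaType: .video, outputSettings: hevcSettings)
```

Comparing file sizes is only meaningful when both encoders target a comparable bit rate; at equal rates HEVC should match or beat H.264 quality.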

iOS: Synchronizing frames from camera and motion data

 ̄綄美尐妖づ submitted on 2019-12-06 04:10:32
Question: I'm trying to capture frames from the camera along with the associated motion data. For synchronization I'm using timestamps. Video and motion are written to a file and then processed; in that process I can calculate the motion-frame offset for every video. It turns out that motion data and video data for the same timestamp are offset from each other by a varying interval, from 0.2 s up to 0.3 s. The offset is constant within one video but varies from video to video. If it were the same offset every time I would be able to subtract
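A per-video constant offset often means the two streams are being stamped against different clocks. On iOS, `CMDeviceMotion.timestamp` is seconds since boot, and camera sample buffers are stamped on the host clock, so comparing both in host-time seconds should remove the drift between recordings. A sketch under that assumption (the helper names are hypothetical):

```swift
import AVFoundation
import CoreMotion

// Convert a camera frame's presentation timestamp to seconds on the
// host clock, the same time base CMDeviceMotion.timestamp uses
// (seconds since boot), assuming the capture pipeline stamps buffers
// against the host clock.
func hostTimeSeconds(of sampleBuffer: CMSampleBuffer) -> TimeInterval {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    return CMTimeGetSeconds(pts)
}

// Offset between a motion sample and a frame, both on the same clock.
func motionOffset(motion: CMDeviceMotion,
                  sampleBuffer: CMSampleBuffer) -> TimeInterval {
    motion.timestamp - hostTimeSeconds(of: sampleBuffer)
}
```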

Swift 3 : How to export video with text using AVVideoComposition

穿精又带淫゛_ submitted on 2019-12-06 04:04:55
I am trying to use AVVideoComposition to add some text on top of a video and save the video. This is the code I use: I create an AVMutableComposition and AVVideoComposition var mutableComp = AVMutableComposition() var mutableVidComp = AVMutableVideoComposition() var compositionSize : CGSize? func configureAsset(){ let options = [AVURLAssetPreferPreciseDurationAndTimingKey : "true"] let videoAsset = AVURLAsset(url: Bundle.main.url(forResource: "Car", withExtension: "mp4")! , options : options) let videoAssetSourceTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo).first! as AVAssetTrack
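The standard way to burn text into an exported video is a Core Animation tool attached to the video composition. A sketch, assuming the `AVMutableVideoComposition` and its render size come from setup code like the question's; font size and layer frames are illustrative:

```swift
import AVFoundation
import UIKit

// Attach a CATextLayer over the video via the composition's
// animationTool; the export session then composites text onto frames.
func addText(_ text: String,
             to videoComposition: AVMutableVideoComposition,
             renderSize: CGSize) {
    let textLayer = CATextLayer()
    textLayer.string = text
    textLayer.fontSize = 48
    textLayer.frame = CGRect(x: 0, y: 40, width: renderSize.width, height: 60)

    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: renderSize)
    let parentLayer = CALayer()
    parentLayer.frame = videoLayer.frame
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(textLayer)

    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)
}
```

The composition is then assigned to an `AVAssetExportSession`'s `videoComposition` property before exporting.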

Create a copy of CMSampleBuffer in Swift 2.0

☆樱花仙子☆ submitted on 2019-12-06 03:49:26
Question: This has been asked before, but something must have changed in Swift since it was asked. I am trying to store CMSampleBuffer objects returned from an AVCaptureSession to process later. After some experimentation I discovered that AVCaptureSession must be reusing its CMSampleBuffer references: when I try to keep more than 15, the session hangs. So I thought I would make copies of the sample buffers, but I can't seem to get it to work. Here is what I have written: var allocator: Unmanaged
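Because the capture session's pixel buffer pool is small, retaining the delivered buffers starves the pool (hence the hang past ~15). The usual fix is to deep-copy the pixel data into a buffer you own and release the original. A sketch, assuming a packed non-planar pixel format such as BGRA; error handling is abbreviated:

```swift
import AVFoundation
import CoreVideo

// Deep-copy a frame's pixel data into a new CVPixelBuffer so the
// session's pooled buffer can be recycled immediately. Copies row by
// row because source and destination strides may differ.
func copyPixelBuffer(from sampleBuffer: CMSampleBuffer) -> CVPixelBuffer? {
    guard let src = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    CVPixelBufferLockBaseAddress(src, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(src, .readOnly) }

    var dst: CVPixelBuffer?
    CVPixelBufferCreate(nil,
                        CVPixelBufferGetWidth(src),
                        CVPixelBufferGetHeight(src),
                        CVPixelBufferGetPixelFormatType(src),
                        nil, &dst)
    guard let copy = dst,
          let srcBase = CVPixelBufferGetBaseAddress(src) else { return nil }
    CVPixelBufferLockBaseAddress(copy, [])
    defer { CVPixelBufferUnlockBaseAddress(copy, []) }
    guard let dstBase = CVPixelBufferGetBaseAddress(copy) else { return nil }

    let height = CVPixelBufferGetHeight(src)
    let srcStride = CVPixelBufferGetBytesPerRow(src)
    let dstStride = CVPixelBufferGetBytesPerRow(copy)
    let rowBytes = min(srcStride, dstStride)
    for row in 0..<height {
        memcpy(dstBase.advanced(by: row * dstStride),
               srcBase.advanced(by: row * srcStride),
               rowBytes)
    }
    return copy
}
```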