AVFoundation

AVAssetWriter rotate buffer for video orientation

Submitted by 北慕城南 on 2019-12-11 15:40:40
Question: I'm working on a live-recording app in Swift using AVFoundation, and I have an issue with video orientation. I use AVAssetWriter rather than AVCaptureMovieFileOutput because I need to record in square format (correct me if I'm wrong). I tried to use videoInput.transform, but I heard that it is not supported by all video players. I can't set AVCaptureConnection.videoOrientation based on the device orientation because it stalls the main UI thread. I read that the best solution is to rotate
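One common approach is to avoid per-buffer rotation entirely and instead set videoInput.transform once, to a rotation matching the device orientation captured at recording start. A minimal pure-Swift sketch of the underlying math (the DeviceOrientation enum and the angle mapping below are illustrative assumptions, not AVFoundation API; on iOS you would feed the angle into CGAffineTransform(rotationAngle:)):

```swift
import Foundation

// Hypothetical orientation model standing in for UIDeviceOrientation.
enum DeviceOrientation {
    case portrait, landscapeLeft, landscapeRight, portraitUpsideDown
}

// Rotation (radians) typically applied via videoInput.transform so recorded
// frames play back upright. The exact mapping depends on the camera position.
func recordingRotation(for orientation: DeviceOrientation) -> Double {
    switch orientation {
    case .portrait:           return .pi / 2
    case .landscapeRight:     return .pi
    case .portraitUpsideDown: return 3 * .pi / 2
    case .landscapeLeft:      return 0
    }
}

// Entries [a, b, c, d] of the 2x2 rotation matrix that
// CGAffineTransform(rotationAngle:) would carry.
func rotationMatrix(_ angle: Double) -> (a: Double, b: Double, c: Double, d: Double) {
    (cos(angle), sin(angle), -sin(angle), cos(angle))
}

let m = rotationMatrix(recordingRotation(for: .portrait))
print(m.a, m.b) // ~0.0 1.0 (a 90° rotation)
```

Because the transform is metadata, the writer still stores the buffers unrotated; only playback orientation changes, which is exactly why some players that ignore the track transform show the video sideways.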

iOS CIFaceDetector very slow with Metal

Submitted by 孤人 on 2019-12-11 15:37:21
Question: I've been trying to apply filters to a certain part of a face detected in an image. To apply filters to the whole image, I used Apple's sample code: https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/avcamfilter_applying_filters_to_a_capture_stream If I add just one line detecting faces via CIDetector to the method that sends CVPixelBuffers to the FilterRenderer class and then on to MTKView to render the filtered buffer, the performance is
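A frequent cause of this kind of slowdown is constructing the CIDetector (or running it at full accuracy) inside the per-frame path; the detector is expensive to create and is meant to be built once and reused for every buffer. A pure-Swift sketch of the create-once pattern (the Detector class is a hypothetical stand-in for CIDetector, counting constructions to make the point):

```swift
// Stand-in for an expensive-to-create object such as a CIDetector.
final class Detector {
    static var creations = 0
    init() { Detector.creations += 1 } // expensive setup happens here, once
    func detectFaces(in frame: Int) -> Int { frame % 2 } // dummy result
}

// Create once, outside the render loop, then reuse for every frame —
// instead of constructing a new detector per CVPixelBuffer.
let detector = Detector()
for frame in 0..<100 {
    _ = detector.detectFaces(in: frame)
}
print(Detector.creations) // 1
```

On the real API, pairing the shared detector with CIDetectorAccuracyLow (and detecting on a background queue, not the Metal render queue) is the usual next step if reuse alone isn't enough.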

AVAudioEngine Microphone Crash on Start

Submitted by 女生的网名这么多〃 on 2019-12-11 15:08:14
Question: I'm trying to set up an AudioQueue to stream audio from the microphone on an iPhone. I create my audio engine:

var audioEngine = AVAudioEngine()

And my audio queue:

// Serial dispatch queue used to analyze incoming audio buffers.
let analysisQueue = DispatchQueue(label: "com.apple.AnalysisQueue")

// Install an audio tap on the audio engine's input node.
audioEngine.inputNode.installTap(onBus: 0,
                                 bufferSize: 8192, // 8k buffer
                                 format: inputFormat) { buffer, time in
    // Analyze the current audio
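A common cause of a crash at start is installing the tap with a format that does not match the input node's actual hardware format; using `inputNode.inputFormat(forBus: 0)` as the tap format (or passing `nil` to accept the node's format) avoids the mismatch. A minimal pure-Swift model of that compatibility rule (AudioFormat is a hypothetical stand-in for AVAudioFormat, reduced to the fields that matter here):

```swift
// Hypothetical stand-in for AVAudioFormat.
struct AudioFormat: Equatable {
    var sampleRate: Double
    var channelCount: Int
}

// Models the rule that an installTap format must match the bus format,
// with nil meaning "just use whatever the node provides".
func tapFormatIsValid(requested: AudioFormat?, busFormat: AudioFormat) -> Bool {
    guard let requested = requested else { return true }
    return requested == busFormat
}

let hardware = AudioFormat(sampleRate: 44_100, channelCount: 1)
print(tapFormatIsValid(requested: nil, busFormat: hardware))  // true
print(tapFormatIsValid(requested: AudioFormat(sampleRate: 48_000, channelCount: 1),
                       busFormat: hardware))                  // false: would crash the tap
```

In the question's code, deriving `inputFormat` from anything other than the input node itself is the first thing to rule out.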

Best way to loop and use background sounds on iPhone?

Submitted by 大憨熊 on 2019-12-11 15:08:07
Question: Let's say I am dragging my finger on the screen, and I have a 1-second .caf sound file in my bundle. What is the best way to play the sound file in a loop for as long as I am dragging my finger, and stop it whenever I lift my touch? I know the touch handling; just share your views on using the sound file. Answer 1: See the AVAudioPlayer class; it worked pretty well for me for behaviour similar to what your question describes. Answer 2: This is how I did it. At the moment, I don't know of a more efficient method..
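With AVAudioPlayer, looping is controlled by its `numberOfLoops` property: 0 plays the file once, a positive n replays it n additional times, and -1 loops indefinitely until `stop()` is called — which fits the "loop while dragging, stop on touch-up" pattern here. A small pure-Swift sketch of that semantic (the helper is illustrative, not an AVFoundation API):

```swift
// Total passes AVAudioPlayer makes through the file for a numberOfLoops
// value; nil models "indefinitely" (numberOfLoops = -1).
func totalPlays(numberOfLoops: Int) -> Int? {
    numberOfLoops < 0 ? nil : numberOfLoops + 1
}

print(totalPlays(numberOfLoops: 0)!)        // 1: plays once, no loop
print(totalPlays(numberOfLoops: 3)!)        // 4
print(totalPlays(numberOfLoops: -1) == nil) // true: loops until stop()
```

So for this question: set `numberOfLoops = -1` and call `play()` in touchesBegan, `stop()` in touchesEnded/touchesCancelled.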

AVAssetTrack with alpha channel?

Submitted by 一世执手 on 2019-12-11 14:56:37
Question: I am trying to layer AVAssetTracks in an AVMutableComposition, where the AVAssetTracks have an alpha channel in the video. I have successfully exported a video AVAsset with pixel buffers that have an alpha channel and transparency, but when I try to add that into an AVMutableCompositionTrack and layer it, it doesn't display correctly. Is this possible? Is the magic performed when the AVAsset is created, or in the AVMutableComposition or AVVideoComposition? Any clues would be a huge help. Answer 1:

How to play a video in AVPlayer for a specific duration in Swift 3 (iOS)

Submitted by 走远了吗. on 2019-12-11 14:18:38
Question: I am creating a video application in Swift 3, where we have a list of video files in a TableView. For each video, the user is given a range-slider option to select a range of the video. Now I am trying to play the video only for the specific range selected by the user. I am using the code below to start a video from 4 seconds to 8 seconds using CMTimeMake, but it is not playing correctly.

let targetTime: CMTime = CMTimeMake(4, 1) // video start
self.player?.seek(to: targetTime)
self.player?
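A CMTime built with CMTimeMake(value, timescale) represents value/timescale seconds, so CMTimeMake(4, 1) is indeed the 4-second mark; to stop at 8 seconds, the seek also needs an end boundary (e.g. a boundary-time observer at CMTimeMake(8, 1) that pauses the player). A pure-Swift sketch of the value/timescale arithmetic (cmTimeSeconds is a hypothetical helper, not a CoreMedia API):

```swift
// Models CMTime's value/timescale representation in seconds.
func cmTimeSeconds(value: Int64, timescale: Int32) -> Double {
    Double(value) / Double(timescale)
}

print(cmTimeSeconds(value: 4, timescale: 1))      // 4.0 -> intended range start
print(cmTimeSeconds(value: 8, timescale: 1))      // 8.0 -> intended range end
print(cmTimeSeconds(value: 2400, timescale: 600)) // 4.0 -> same instant at a finer timescale
```

For precise range playback it also helps to pass zero tolerances, i.e. `seek(to:toleranceBefore:toleranceAfter:)` with `kCMTimeZero`, since the plain `seek(to:)` is allowed to land near, not at, the requested time.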

AVAssetExportSession - Join 2 mp4 files in iOS

Submitted by 本小妞迷上赌 on 2019-12-11 13:35:08
Question: I am trying to join two preexisting MPEG-4 videos on an iPad 2 with the following code.

-(void)mergeTestVideos {
    // Set up assets
    NSString *firstassetpath = [NSString stringWithFormat:@"%@mpeg4-1.mp4", NSTemporaryDirectory()];
    NSString *secondassetpath = [NSString stringWithFormat:@"%@mpeg4-2.mp4", NSTemporaryDirectory()];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    AVAsset *firstAsset = [AVAsset assetWithURL:[NSURL fileURLWithPath:firstassetpath]];
    AVAsset *secondAsset =
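The usual continuation is an AVMutableComposition where each asset's full time range is inserted at the running total of the previous durations, so the clips play back-to-back before export. A pure-Swift model of those insertion offsets (durations in seconds; the helper is illustrative, not an AVFoundation API):

```swift
// Given clip durations in seconds, returns the atTime offset at which each
// clip would be inserted into a sequential AVMutableComposition track.
func insertionOffsets(durations: [Double]) -> [Double] {
    var offsets: [Double] = []
    var cursor = 0.0
    for d in durations {
        offsets.append(cursor) // insertTimeRange(..., atTime: cursor)
        cursor += d
    }
    return offsets
}

// e.g. a 10 s clip followed by a 5 s clip:
print(insertionOffsets(durations: [10.0, 5.0])) // [0.0, 10.0]
```

In the real code, the second `insertTimeRange:ofTrack:atTime:` call therefore passes the first asset's duration (not kCMTimeZero) as `atTime`, which is the step most often gotten wrong when joining files.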

Constants for AVCaptureMetadataOutput's metadataObjectTypes property?

Submitted by 不问归期 on 2019-12-11 12:54:11
Question: When using AVFoundation to detect features (e.g. faces or barcodes) in an image, you must call a line such as:

AVCaptureMetadataOutput *metadataOutput = ...;
metadataOutput.metadataObjectTypes = metadataOutput.availableMetadataObjectTypes;

Examining availableMetadataObjectTypes shows the following strings: face, "org.gs1.UPC-E", "org.iso.Code39", "org.iso.Code39Mod43", "org.gs1.EAN-13", "org.gs1.EAN-8", "com.intermec.Code93", "org.iso.Code128", "org.iso.PDF417", "org.iso.QRCode", "org.iso
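Those strings are the raw values of the AVMetadataObjectType string constants declared by AVFoundation (e.g. AVMetadataObjectTypeQRCode), so code should compare against the constants rather than hard-coding "org.iso.QRCode". A sketch of that correspondence as plain strings (the dictionary pairs constant names with raw values the question lists; it is illustrative and deliberately not exhaustive):

```swift
// Constant name -> raw string value, as surfaced by availableMetadataObjectTypes.
let metadataTypeConstants: [String: String] = [
    "AVMetadataObjectTypeFace":      "face",
    "AVMetadataObjectTypeUPCECode":  "org.gs1.UPC-E",
    "AVMetadataObjectTypeEAN13Code": "org.gs1.EAN-13",
    "AVMetadataObjectTypeQRCode":    "org.iso.QRCode",
]

print(metadataTypeConstants["AVMetadataObjectTypeQRCode"]!) // org.iso.QRCode
```

In modern Swift these surface as `AVMetadataObject.ObjectType` values (e.g. `.qr`), but the underlying raw strings are the same ones the question observed.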

Simultaneously downloading and playing a song that is pieced together from multiple URLs

Submitted by 别来无恙 on 2019-12-11 12:49:31
Question: I have an NSArray of strings that are URLs. Each URL in the array points to data associated with one section of the song. I can play the full song with the following code, which fully downloads the song to a single file and then plays it from that file:

// Get file path to store song locally
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *filePath = [NSString stringWithFormat:@"%@/temp.mp3", [paths objectAtIndex:0]];
// Remove
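Since each URL holds one section of the song, one approach is to download the pieces in order, append each chunk's bytes to a single file, and start playback once enough contiguous data exists (for MP3, a byte-wise concatenation of sequential sections is generally playable). A Foundation-only sketch of the append step — the file name and chunk contents below are made up for illustration, and real code would append as each download completes rather than all at once:

```swift
import Foundation

// Appends each downloaded chunk, in order, to one output file.
func writeConcatenated(chunks: [Data], to url: URL) throws {
    var combined = Data()
    for chunk in chunks {
        combined.append(chunk) // order matters: section i precedes section i+1
    }
    try combined.write(to: url)
}

let chunks = ["part1-", "part2-", "part3"].map { Data($0.utf8) }
let out = FileManager.default.temporaryDirectory.appendingPathComponent("temp.mp3")
try! writeConcatenated(chunks: chunks, to: out)
print(String(data: try! Data(contentsOf: out), encoding: .utf8)!) // part1-part2-part3
```

To play while still downloading, a player that reads from the growing file (or an AVAssetResourceLoaderDelegate feeding the pieces to AVPlayer) is the usual next step, since AVAudioPlayer expects a complete file.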

Scale the video up from a smaller size (re-scale)

Submitted by 为君一笑 on 2019-12-11 12:21:45
Question: What I'm actually looking for is not just better quality but to resize the entire video to a greater resolution using AVFoundation. I have MP4 videos at 320x240 and 176x144, and I want to resize them up to 1280x720, but the AVAssetExportSession class does not allow scaling a video up from a smaller size. Answer 1: Try AVMutableVideoCompositionLayerInstruction and CGAffineTransform. This code will help the understanding: https://gist.github.com/zrxq/9817265 Source: https://stackoverflow.com
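The layer-instruction approach amounts to applying CGAffineTransform(scaleX:y:) with factors renderWidth/sourceWidth and renderHeight/sourceHeight on an AVMutableVideoComposition whose renderSize is the target. A pure-Swift sketch of the factor math (the helper names are illustrative; aspectFitScale picks the uniform factor that avoids distortion when the aspect ratios differ, as they do here):

```swift
// Per-axis scale factors to stretch the source to the target (may distort).
func scaleFactors(srcW: Double, srcH: Double,
                  dstW: Double, dstH: Double) -> (x: Double, y: Double) {
    (dstW / srcW, dstH / srcH)
}

// Uniform factor that fits the source inside the target, preserving aspect.
func aspectFitScale(srcW: Double, srcH: Double,
                    dstW: Double, dstH: Double) -> Double {
    min(dstW / srcW, dstH / srcH)
}

// 320x240 -> 1280x720: stretching needs 4x horizontally but only 3x vertically.
print(scaleFactors(srcW: 320, srcH: 240, dstW: 1280, dstH: 720))   // (x: 4.0, y: 3.0)
print(aspectFitScale(srcW: 320, srcH: 240, dstW: 1280, dstH: 720)) // 3.0
```

With the 3.0 aspect-fit factor, the 320x240 source becomes 960x720 inside the 1280x720 render, leaving pillarbox bars; note that upscaling cannot add detail the small source never had.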