avfoundation

Play reminder sound in SwiftUI

孤街醉人 submitted on 2021-01-21 11:47:28
Question: I understand I can play a sound from my own library, as in "How to play a sound using Swift?", but how can I use the default Reminder alert sounds that the iPhone already has? Just to put it into context: on a simple button press? Thanks.

Answer 1: Try this:

    import AVFoundation
    AudioServicesPlaySystemSound(1026)

where the number 1026 is the SystemSound id.

Source: https://stackoverflow.com/questions/61310578/play-reminder-sound-swiftui
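For completeness, a minimal SwiftUI sketch wiring that call to a button press; the view name and the AudioToolbox import are my own additions (AudioServicesPlaySystemSound is declared in AudioToolbox), and 1026 is simply the ID quoted in the answer:

    import SwiftUI
    import AudioToolbox // AudioServicesPlaySystemSound lives here

    struct ReminderSoundButton: View {
        // 1026 is the system sound ID from the answer above; other IDs map to
        // other built-in alert tones, but the mapping is not officially documented.
        private let reminderSoundID: SystemSoundID = 1026

        var body: some View {
            Button("Play reminder sound") {
                AudioServicesPlaySystemSound(reminderSoundID)
            }
        }
    }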

Get the accurate duration of a video

爷，独闯天下 submitted on 2021-01-18 05:01:37
Question: I'm making a player and I want to list all the files and show each video's duration next to it. The only problem is that I'm not getting the right video duration; sometimes it returns a completely wrong value. I've tried the solution below:

    let asset = AVAsset(url: URL(fileURLWithPath: "video.mp4"))
    let duration = asset.duration.seconds

With that, the duration sometimes gives one value and sometimes another. If someone knows a possible solution I'd be glad to hear it. I have updated the code using one
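One likely cause is that AVAsset reports only an approximate duration by default. A minimal sketch, assuming the file URL is known, of requesting precise timing and loading the duration asynchronously before reading it:

    import AVFoundation

    // Ask the asset for precise duration/timing (slower, but accurate) and
    // load the "duration" key asynchronously before reading it.
    let url = URL(fileURLWithPath: "video.mp4") // placeholder path
    let asset = AVURLAsset(url: url,
                           options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])

    asset.loadValuesAsynchronously(forKeys: ["duration"]) {
        var error: NSError?
        guard asset.statusOfValue(forKey: "duration", error: &error) == .loaded else {
            print("Could not load duration: \(String(describing: error))")
            return
        }
        print("Duration: \(asset.duration.seconds) s")
    }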

AVAssetExportSession | Add a UIView with GIFs and drawings over a video

一个人想着一个人 submitted on 2021-01-16 04:23:11
Question: Objective: I have a video over which I have a UIView containing animated GIFs (not stored locally, but fetched through the Giphy API), texts, or hand drawings. I want to export all of this together as a single video. What I did: I created a UIView on which the animations live, converted it to a CALayer, and added it to the video with an AVMutableVideoComposition. Problem: The UIView with the animations is being rendered as a static image instead of a video. How can I solve this? Below is the program for my export
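A minimal sketch of the overlay setup this usually requires: the overlay must be a CALayer animated with Core Animation animations (with beginTime set to AVCoreAnimationBeginTimeAtZero) rather than a snapshot of a UIView, because a snapshot is rendered as a single still image during export. The function name below is mine, not from the question:

    import AVFoundation
    import UIKit

    func makeVideoComposition(for asset: AVAsset, overlay: CALayer) -> AVMutableVideoComposition {
        let videoTrack = asset.tracks(withMediaType: .video)[0]
        let size = videoTrack.naturalSize

        // Layer tree rendered by the export: video at the bottom, overlay on top.
        let videoLayer = CALayer()
        videoLayer.frame = CGRect(origin: .zero, size: size)
        let parentLayer = CALayer()
        parentLayer.frame = videoLayer.frame
        overlay.frame = videoLayer.frame
        parentLayer.addSublayer(videoLayer)
        parentLayer.addSublayer(overlay)

        let composition = AVMutableVideoComposition(propertiesOf: asset)
        composition.animationTool = AVVideoCompositionCoreAnimationTool(
            postProcessingAsVideoLayer: videoLayer, in: parentLayer)
        return composition
    }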

AVAssetWriter First Frames Are Either Blank or Black

故事扮演 submitted on 2021-01-07 01:28:47
Question: Problem: I am recording video frames by getting both audio and video buffers from CMSampleBuffer. Once the AVAssetWriter has finished writing the buffers, the final video's first frame comes out black or blank (presumably because only audio frames exist at the beginning). Although, randomly, the video sometimes comes out totally normal and doesn't have a black frame. What I tried: I tried waiting until I fetch the first video frame and only then start recording, yet I get the same erratic
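A minimal sketch of one common fix, assuming the session was previously started before any video frame arrived: anchor the writer session at the first video buffer's timestamp and drop audio that comes in earlier, so the movie cannot open with audio-only (black) frames. The Recorder type and its properties are placeholders, not code from the question:

    import AVFoundation
    import CoreMedia

    final class Recorder: NSObject {
        let writer: AVAssetWriter          // assumed to have startWriting() called already
        let videoInput: AVAssetWriterInput
        let audioInput: AVAssetWriterInput
        private var sessionStarted = false

        init(writer: AVAssetWriter, videoInput: AVAssetWriterInput, audioInput: AVAssetWriterInput) {
            self.writer = writer
            self.videoInput = videoInput
            self.audioInput = audioInput
            super.init()
        }

        func append(_ sampleBuffer: CMSampleBuffer, isVideo: Bool) {
            let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
            if !sessionStarted {
                guard isVideo else { return }           // ignore audio until the first video frame
                writer.startSession(atSourceTime: pts)  // anchor the timeline at that frame
                sessionStarted = true
            }
            let input = isVideo ? videoInput : audioInput
            if input.isReadyForMoreMediaData {
                input.append(sampleBuffer)
            }
        }
    }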

AVAssetWriter Unable to record audio with video | Crashing

回眸只為那壹抹淺笑 submitted on 2021-01-05 08:55:53
Question: I am trying to capture video/audio frames from CMSampleBuffer but I am completely failing to obtain a proper video recording. Expected output: a video file in .mp4 format that has both audio (from the mic) and video frames. Current output: an empty directory / a video file without audio. Crash at run time: Media type of sample buffer must match receiver's media type ("soun"). I tried almost everything available online to troubleshoot this. I have a deadline coming up and I am just pulling my hair out trying to
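That crash message means an audio sample buffer was appended to an input configured for another media type (or the other way around). A minimal sketch of routing buffers by the capture output that delivered them, reusing the hypothetical Recorder from the previous sketch:

    import AVFoundation

    extension Recorder: AVCaptureVideoDataOutputSampleBufferDelegate,
                        AVCaptureAudioDataOutputSampleBufferDelegate {
        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            // Route on the delivering output: video frames go to the video
            // input, microphone samples go to the audio input.
            if output is AVCaptureVideoDataOutput {
                append(sampleBuffer, isVideo: true)
            } else {
                append(sampleBuffer, isVideo: false)
            }
        }
    }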

Swift 4 - AVFoundation screen and audio recording using AVAssetWriter on macOS - video frozen

天涯浪子 submitted on 2020-12-30 03:14:58
Question: I'm using Aperture to record audio and video from the screen. We need to lower the bitrate of the video, so I'm trying to rewrite it and record the video with AVAssetWriter. My implementation is based on the CustomCamera project and is almost working. The problem is the video: after a few seconds it freezes, although the audio keeps working. Could you help me please? I don't know where the problem is; maybe it is a problem with the buffers, or the garbage collector collects some variable. Thanks. Here is the code: // //
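On the bitrate side, a minimal sketch of writer-input settings with an explicit average bitrate; the resolution numbers are placeholders. expectsMediaDataInRealTime is also worth checking: for live screen capture, leaving it false is a classic cause of the picture freezing while the audio keeps going.

    import AVFoundation

    let videoSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1920,               // placeholder capture size
        AVVideoHeightKey: 1080,
        AVVideoCompressionPropertiesKey: [
            AVVideoAverageBitRateKey: 2_000_000   // ~2 Mbit/s; tune to taste
        ]
    ]

    let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
    videoInput.expectsMediaDataInRealTime = true  // important for live screen/audio capture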

UIImpactFeedbackGenerator Not Working When Audio Device Added to AVCaptureSession

左心房为你撑大大i submitted on 2020-12-29 03:02:04
Question: Adding a microphone audio input to an AVCaptureSession seems to disable UIImpactFeedbackGenerator.

    let audioDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio)
    let audioDeviceInput = try AVCaptureDeviceInput(device: audioDevice)
    if self.session.canAddInput(audioDeviceInput) {
        self.session.addInput(audioDeviceInput)
    }

Once the audio device is removed, feedback resumes. Is this normal behavior? Is there a way around this? I notice the stock iOS Camera app in video mode and long
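One workaround worth trying, as a sketch assuming iOS 13 or later: the shared audio session can be told to keep haptics and system sounds enabled while recording, which lets UIImpactFeedbackGenerator fire even with a microphone input in the session.

    import AVFoundation

    do {
        // iOS 13+: allow haptics and system sounds while audio is being recorded.
        try AVAudioSession.sharedInstance().setAllowHapticsAndSystemSoundsDuringRecording(true)
    } catch {
        print("Could not enable haptics during recording: \(error)")
    }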

AVMutableComposition rotates recorded video

爷,独闯天下 submitted on 2020-12-27 06:18:22
Question: I am building an iOS app using the AVFoundation framework, Objective-C, and Xcode 7.1. What I am doing is take a recorded video, add a text layer, and export it. My problem is that the exported video comes out in landscape when I recorded it in portrait! I've been struggling for hours and can't solve it. I found similar questions but none of the answers helped me. Here is my code:

    AVURLAsset* videoAsset = [[AVURLAsset alloc] initWithURL:self.videoURL options:nil];
    AVMutableComposition* mixComposition =
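A sketch of the usual fix, written in Swift here although the question is Objective-C: copy the source track's preferredTransform into the layer instruction for the composition track (and swap the render size's width and height for portrait footage). The helper name is mine:

    import AVFoundation

    func layerInstruction(for compositionTrack: AVMutableCompositionTrack,
                          matching sourceTrack: AVAssetTrack) -> AVMutableVideoCompositionLayerInstruction {
        let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: compositionTrack)
        // Without this, the recorded orientation metadata is ignored and
        // portrait footage is exported in landscape.
        instruction.setTransform(sourceTrack.preferredTransform, at: .zero)
        return instruction
    }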

Exporting time lapse with AVAssetExportSession results in black video

佐手、 submitted on 2020-12-25 01:11:50
Question: I need to be able to merge videos taken with the time-lapse function in the Camera app on iOS and export them as a single video. However, even if I try to export a single, unchanged time-lapse video to the Photo Library, it saves as a completely black video (with the correct duration). Here is the sample code I wrote to just export a single, unchanged video (most of which is adapted from a Ray Wenderlich tutorial):

    @IBAction func saveVideo(_ sender: UIBarButtonItem) {
        // 1 - Early exit if there's
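For reference, a minimal sketch of a passthrough export of one asset through an AVMutableComposition; if even this comes out black for a time-lapse clip, the problem lies in the export setup itself rather than in the merging code. All names here are placeholders:

    import AVFoundation

    func exportUnchanged(asset: AVAsset, to outputURL: URL,
                         completion: @escaping (Error?) -> Void) {
        let composition = AVMutableComposition()
        guard
            let sourceTrack = asset.tracks(withMediaType: .video).first,
            let track = composition.addMutableTrack(withMediaType: .video,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid)
        else { completion(nil); return }

        do {
            try track.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                      of: sourceTrack, at: .zero)
        } catch { completion(error); return }

        // Keep the recorded orientation.
        track.preferredTransform = sourceTrack.preferredTransform

        guard let session = AVAssetExportSession(asset: composition,
                                                 presetName: AVAssetExportPresetHighestQuality)
        else { completion(nil); return }
        session.outputURL = outputURL
        session.outputFileType = .mp4
        session.exportAsynchronously { completion(session.error) }
    }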