avfoundation

AVAudioSession - How to switch between speaker and headphones output

北城余情 submitted on 2020-06-27 09:06:06
Question: I'm trying to mimic the behaviour of the Phone app during a call: you can easily switch the output between the speaker and the headphones. I know I can force the speaker as the output while headphones are connected by calling: try! audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord) try! audioSession.overrideOutputAudioPort(.speaker) However, when I do that, I don't see any way to detect whether headphones are still connected to the device. I initially thought outputDataSources on AVAudioSession would
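A minimal sketch of one common approach (not taken from the truncated post above): keep using overrideOutputAudioPort to toggle the route, and check availableInputs for a wired headset instead of the current outputs, since the speaker override changes the output route itself. The headsetMic check only covers headsets that expose a microphone, so treat it as a simplifying assumption.

```swift
import AVFoundation

// Returns true if a wired headset still appears among the available inputs.
// availableInputs keeps listing the headset even while the speaker override
// is active, unlike currentRoute.outputs. (Headphones without a mic will not
// show up here - this is a simplifying assumption.)
func headsetIsPluggedIn() -> Bool {
    let session = AVAudioSession.sharedInstance()
    return session.availableInputs?.contains(where: { $0.portType == .headsetMic }) ?? false
}

// Toggle between the built-in speaker and the default route (headphones when
// they are plugged in).
func routeOutput(toSpeaker useSpeaker: Bool) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .voiceChat, options: [.allowBluetooth])
    try session.setActive(true)
    try session.overrideOutputAudioPort(useSpeaker ? .speaker : .none)
}
```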

Swift 4 switch to new observe API

百般思念 submitted on 2020-06-26 04:30:47
Question: I am having trouble with the new observe API in Swift 4. player = AVPlayer() player?.observe(\.currentItem.status, options: [.new], changeHandler: { [weak self] (player, newValue) in if let status = AVPlayer.Status(rawValue: (newValue as! NSNumber).intValue) { } } But I get an error: Type of expression is ambiguous without more context. How do I fix it? I am not sure about the key path syntax. There is also a warning when extracting AVPlayerStatus in the closure above: Cast from 'NSKeyValueObservedChange'
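A sketch of the usual fix, assuming the goal is simply to react to the item's status: currentItem is optional, so the key path needs optional chaining, and the handler can read the status straight from the player instead of casting the change value.

```swift
import AVFoundation

let player = AVPlayer()

// `currentItem` is Optional, so the key path uses `?.`; the change handler is
// already strongly typed, so no NSNumber casting is needed.
let statusObservation = player.observe(\.currentItem?.status, options: [.new]) { player, _ in
    guard let status = player.currentItem?.status else { return }
    switch status {
    case .readyToPlay:
        player.play()
    case .failed:
        print("Item failed:", player.currentItem?.error as Any)
    default:
        break
    }
}

// Store `statusObservation` (e.g. in a property); observation stops when it is
// deallocated or invalidate() is called.
```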

How to play AVPlayerItems immediately

為{幸葍}努か submitted on 2020-06-25 05:19:08
Question: I'm using AVPlayer , which I connect to a remote URL via an AVPlayerItem . The problem is that I want to play the sound from the URL immediately and not make the user wait for the AVPlayer to buffer. If the remote URL's asset is very short it doesn't buffer for long at all, but if the media is a bit longer it takes a while. Is there a way to skip the buffering process, or at least shorten it substantially? Answer 1: Swift 5 You just need to use the
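The accepted answer is cut off above; it most likely refers to AVPlayer's automaticallyWaitsToMinimizeStalling property (an assumption based on the "Swift 5" hint). A minimal sketch with a placeholder URL:

```swift
import AVFoundation

// Placeholder URL - substitute the real remote asset.
let url = URL(string: "https://example.com/clip.mp3")!
let player = AVPlayer(playerItem: AVPlayerItem(url: url))

// Don't wait for the player's buffering heuristics before starting...
player.automaticallyWaitsToMinimizeStalling = false

// ...and start playback right away, accepting the risk of a stall mid-stream.
player.playImmediately(atRate: 1.0)
```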

Build a simple Equalizer

旧巷老猫 submitted on 2020-06-25 00:57:26
Question: I would like to make a 5-band audio equalizer (60 Hz, 230 Hz, 910 Hz, 4 kHz, 14 kHz) using AVAudioEngine . I would like to have the user set the gain per band through a vertical slider and adjust the playing audio accordingly. I tried using AVAudioUnitEQ to do this, but I hear no difference when playing the audio. I tried hard-coding values to specify a gain at each frequency, but it still does not work. Here is the code I have: var audioEngine: AVAudioEngine = AVAudioEngine() var
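A minimal working sketch of a 5-band EQ with AVAudioUnitEQ, assuming a local file named song.mp3; a frequently missed detail is that each band starts out bypassed, so hard-coded gains have no audible effect until bypass is set to false.

```swift
import AVFoundation

let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
let eq = AVAudioUnitEQ(numberOfBands: 5)

// Configure the five parametric bands; drive `gain` (in dB) from the sliders.
let frequencies: [Float] = [60, 230, 910, 4_000, 14_000]
for (index, band) in eq.bands.enumerated() {
    band.filterType = .parametric
    band.frequency = frequencies[index]
    band.bandwidth = 1.0      // in octaves
    band.gain = 0
    band.bypass = false       // bands are bypassed by default
}

do {
    let file = try AVAudioFile(forReading: URL(fileURLWithPath: "song.mp3"))
    engine.attach(playerNode)
    engine.attach(eq)
    // Player -> EQ -> main mixer, so the EQ actually sits in the signal chain.
    engine.connect(playerNode, to: eq, format: file.processingFormat)
    engine.connect(eq, to: engine.mainMixerNode, format: file.processingFormat)
    try engine.start()
    playerNode.scheduleFile(file, at: nil)
    playerNode.play()
} catch {
    print("Audio setup failed:", error)
}
```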

I want to make a sound play starting from the 15th second [closed]

牧云@^-^@ submitted on 2020-06-18 11:59:10
Question (closed for lack of details or clarity; closed 5 years ago): I want to make a sound play starting from the 15th second. How can I do it programmatically? Thanks! Answer 1: As I understand it, you are trying to start your audio somewhere other than the beginning. In that case you need to use seekToTime . import UIKit import AVFoundation class
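A short sketch of the seek approach the answer refers to (the file name is a placeholder); in current Swift the method is seek(to:) rather than seekToTime:

```swift
import AVFoundation

let player = AVPlayer(url: URL(fileURLWithPath: "sound.mp3"))

// Build the 15-second mark and jump there before playing. Zero tolerances ask
// for an exact position instead of the nearest keyframe.
let start = CMTime(seconds: 15, preferredTimescale: 600)
player.seek(to: start, toleranceBefore: .zero, toleranceAfter: .zero) { finished in
    if finished {
        player.play()
    }
}
```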

AVAssetImageGenerator returns sometimes same image from 2 successive frames

时间秒杀一切 submitted on 2020-06-09 12:52:27
Question: I'm currently extracting every frame from a video with AVAssetImageGenerator , but sometimes it successively returns almost the same image twice (even though the two requests do not have the same "frame time"). The funny thing is that in my test video it always happens every 5 frames. Here and here are the two images (open each in a new tab and switch between the tabs to see the differences). Here's my code: //setting up generator & compositor self.generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
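The usual cause is that AVAssetImageGenerator is allowed to snap to a nearby, already-decoded frame unless the time tolerances are zeroed. A Swift sketch of that fix (the Objective-C setup above is truncated, so the surrounding code is assumed):

```swift
import AVFoundation

// Extract the exact frame at `time` instead of whatever nearby frame is
// cheapest to decode, which is what produces duplicated images.
func exactFrame(from asset: AVAsset, at time: CMTime) throws -> CGImage {
    let generator = AVAssetImageGenerator(asset: asset)
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero
    generator.appliesPreferredTrackTransform = true
    return try generator.copyCGImage(at: time, actualTime: nil)
}
```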

AVFoundation: Fit Video to CALayer correctly when exporting

徘徊边缘 submitted on 2020-05-28 03:29:32
Question: Problem: I'm having issues getting videos I create with AVFoundation to show in the VideoLayer, a CALayer , with the correct dimensions. Example: Here is what the video should look like (as it's displayed to the user in the app). However, here's the resulting video when it's exported: Details: As you can see, it's meant to be a square video with a green background, with the video fitting into a specified frame. However, the resulting video doesn't fit the CALayer used to contain it (see the black
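Without the original composition code it is hard to be definitive, but the mismatch usually comes from the video track being rendered at its natural size rather than scaled into the rect the layer occupies on screen. A hedged sketch of computing an aspect-fit transform for an AVMutableVideoCompositionLayerInstruction (all sizes and names are placeholders, and the target rect is expressed in the video composition's render-size coordinates):

```swift
import AVFoundation

// Build a transform that scales a video of `videoSize` to fit inside `target`
// and centres it there, mirroring what .resizeAspect does in a preview layer.
func aspectFitTransform(videoSize: CGSize, into target: CGRect) -> CGAffineTransform {
    let scale = min(target.width / videoSize.width, target.height / videoSize.height)
    let scaledSize = CGSize(width: videoSize.width * scale, height: videoSize.height * scale)
    let origin = CGPoint(x: target.midX - scaledSize.width / 2,
                         y: target.midY - scaledSize.height / 2)
    return CGAffineTransform(scaleX: scale, y: scale)
        .concatenating(CGAffineTransform(translationX: origin.x, y: origin.y))
}

// Hypothetical usage on the layer instruction used for export:
// let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: compositionVideoTrack)
// instruction.setTransform(aspectFitTransform(videoSize: videoTrack.naturalSize,
//                                             into: CGRect(x: 100, y: 100, width: 800, height: 800)),
//                          at: .zero)
```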

Close AVPlayer when movie is complete

一曲冷凌霜 submitted on 2020-05-25 17:23:15
Question: I am making a simple iPad app that plays a movie when a button is pressed. The movie plays, and when it is finished I want to close the player view so the app goes back to the main screen. Currently, when the video finishes it stays on the last frame. Here is my ViewController.swift at the moment: import UIKit import AVKit import AVFoundation class ViewController: UIViewController { //MARK : Properties override func viewDidLoad() { super.viewDidLoad() // Do any additional setup after loading the view,
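The code above is cut off, but the missing piece is typically an observer for AVPlayerItemDidPlayToEndTime that dismisses the player. A sketch using AVPlayerViewController (the movie file name is a placeholder):

```swift
import UIKit
import AVKit
import AVFoundation

class ViewController: UIViewController {

    @IBAction func playMovie(_ sender: UIButton) {
        // Placeholder file name bundled with the app.
        guard let url = Bundle.main.url(forResource: "movie", withExtension: "mp4") else { return }
        let player = AVPlayer(url: url)
        let playerController = AVPlayerViewController()
        playerController.player = player

        // Fires when the AVPlayerItem reaches its end.
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(playerDidFinish),
                                               name: .AVPlayerItemDidPlayToEndTime,
                                               object: player.currentItem)

        present(playerController, animated: true) {
            player.play()
        }
    }

    @objc private func playerDidFinish(_ notification: Notification) {
        // Dismiss the presented AVPlayerViewController and return to the main screen.
        dismiss(animated: true)
        NotificationCenter.default.removeObserver(self,
                                                  name: .AVPlayerItemDidPlayToEndTime,
                                                  object: nil)
    }
}
```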