avaudioplayer

kSystemSoundID_Vibrate not working with AVAudioPlayer Play

Submitted by 微笑、不失礼 on 2021-02-19 02:04:18
Question: I am working on a half-duplex VOIP call app, in which I am trying to play tones along with vibration during floor exchanges (floor exchanges change the talker from A to B, or vice versa, on a half-duplex call). If I play tones using AudioServicesPlayAlertSound, the sound plays when the user receives the call, but not during floor exchanges, possibly because of mixing issues with the call audio. So I replaced AudioServicesPlayAlertSound with AVAudioPlayer's play function. So, now I am
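One common approach (a sketch of my own, not the asker's confirmed fix) is to configure the shared AVAudioSession so the tone mixes with the live call audio, and trigger vibration separately with AudioServicesPlaySystemSound. The resource name "floorTone" is hypothetical:

```swift
import AVFoundation
import AudioToolbox

// Keep a strong reference; a locally scoped AVAudioPlayer is deallocated
// before it produces any sound.
var tonePlayer: AVAudioPlayer?

func playFloorExchangeTone() {
    let session = AVAudioSession.sharedInstance()
    do {
        // .mixWithOthers lets the tone play on top of the VOIP audio
        // instead of being suppressed by the active call category.
        try session.setCategory(.playAndRecord,
                                mode: .voiceChat,
                                options: [.mixWithOthers, .defaultToSpeaker])
        try session.setActive(true)
    } catch {
        print("Audio session error: \(error)")
        return
    }

    // Vibration is independent of the audio session; note that
    // kSystemSoundID_Vibrate only works on iPhone hardware.
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)

    guard let url = Bundle.main.url(forResource: "floorTone", withExtension: "caf")
    else { return }
    tonePlayer = try? AVAudioPlayer(contentsOf: url)
    tonePlayer?.prepareToPlay()
    tonePlayer?.play()
}
```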

Play audio from CMSampleBuffer

Submitted by 百般思念 on 2021-02-18 10:47:29
Question: I have created a video chat app for groups in iOS. I have been searching for a way to control the audio volume for each participant separately. I found a way to mute and unmute using isPlaybackEnabled in RemoteAudioTrack , but not to control volume. I also wondered whether AVAudioPlayer could be used for this. I found addSink . This is what I tried from here: class Audio: NSObject, AudioSink { var a = 1 func renderSample(_ audioSample: CMSampleBuffer!) { print("audio found", a) a += 1 var
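A sketch of one possible direction: route each participant's sink samples through a dedicated AVAudioPlayerNode, whose volume can be set independently. The `AudioSink` protocol and `renderSample(_:)` come from the video SDK in the question; the CMSampleBuffer-to-PCM conversion below is my assumption:

```swift
import AVFoundation
import CoreMedia

// One instance per remote participant; volume is per-node, so each
// participant can be attenuated independently.
final class ParticipantAudio: NSObject, AudioSink {
    private let engine = AVAudioEngine()
    private let playerNode = AVAudioPlayerNode()
    private var started = false

    var volume: Float {
        get { playerNode.volume }
        set { playerNode.volume = newValue }   // 0.0 ... 1.0
    }

    func renderSample(_ audioSample: CMSampleBuffer!) {
        guard let sample = audioSample,
              let desc = CMSampleBufferGetFormatDescription(sample),
              let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(desc),
              let format = AVAudioFormat(streamDescription: asbd)
        else { return }

        let frames = AVAudioFrameCount(CMSampleBufferGetNumSamples(sample))
        guard let pcm = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames)
        else { return }
        pcm.frameLength = frames

        // Copy the raw PCM samples out of the CMSampleBuffer.
        CMSampleBufferCopyPCMDataIntoAudioBufferList(
            sample, at: 0, frameCount: Int32(frames),
            into: pcm.mutableAudioBufferList)

        if !started {
            engine.attach(playerNode)
            engine.connect(playerNode, to: engine.mainMixerNode, format: format)
            try? engine.start()
            playerNode.play()
            started = true
        }
        playerNode.scheduleBuffer(pcm, completionHandler: nil)
    }
}
```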

iOS: trimming audio files with Swift?

Submitted by 自闭症网瘾萝莉.ら on 2021-02-17 15:00:12
Question: I have to merge an audio file and a recorded voice. For example, the recorded voice is 47 seconds long. I have to cut or trim the 4-minute audio song to 47 seconds, then merge it with the audio file. var url:NSURL? if self.audioRecorder != nil { url = self.audioRecorder!.url } else { url = self.soundFileURL! } print("playing \(url)") do { self.newplayer = try AVPlayer(URL: url!) let avAsset = AVURLAsset(URL: url!, options: nil) print("\(avAsset)") let audioDuration = avAsset.duration let totalSeconds =
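The usual tool for trimming is AVAssetExportSession with a timeRange. A minimal sketch in current Swift (the asker's code is Swift 2 era); `outputURL` and the 47-second duration are taken from the question's scenario:

```swift
import AVFoundation

// Export only the first `seconds` of `asset` to `outputURL` as an .m4a,
// so the song's length matches the recorded voice (e.g. 47 s).
func trim(asset: AVURLAsset, to seconds: Double, outputURL: URL,
          completion: @escaping (Bool) -> Void) {
    guard let exporter = AVAssetExportSession(asset: asset,
                                              presetName: AVAssetExportPresetAppleM4A)
    else { completion(false); return }

    exporter.outputURL = outputURL
    exporter.outputFileType = .m4a
    // Keep 0 s ... `seconds` and drop the rest of the track.
    let end = CMTime(seconds: seconds, preferredTimescale: 600)
    exporter.timeRange = CMTimeRange(start: .zero, end: end)

    exporter.exportAsynchronously {
        completion(exporter.status == .completed)
    }
}
```

The trimmed file can then be combined with the recording via an AVMutableComposition, which is the typical follow-up step for the merge half of the question.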

Buffer size for AVPlayer / AVPlayerItem

Submitted by 耗尽温柔 on 2021-02-05 13:11:36
Question: I'm creating a streaming radio application for iOS and I would like to tweak the properties of AVPlayer and AVPlayerItem to give me more reliable playback in lossy connectivity conditions. I would like to increase the buffer size. The only answer I could find is here. Is there any way to achieve this without going to OpenAL? Answer 1: Add the piece of code below in your observer method. NSArray *timeRanges = (NSArray *)[change objectForKey:NSKeyValueChangeNewKey]; CMTimeRange timerange = [timeRanges
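Since iOS 10, AVPlayerItem exposes buffer tuning directly, which may be simpler than the Objective-C KVO arithmetic in the answer. A sketch (the 60-second figure is an arbitrary example, and the system treats it as a hint, not a guarantee):

```swift
import AVFoundation

// Ask the item to buffer further ahead of the playhead for lossy networks.
func makeBufferedPlayer(url: URL) -> AVPlayer {
    let item = AVPlayerItem(url: url)
    // Hint: try to keep up to 60 s of media buffered ahead.
    item.preferredForwardBufferDuration = 60
    let player = AVPlayer(playerItem: item)
    // Let the player pause and rebuffer automatically through stalls.
    player.automaticallyWaitsToMinimizeStalling = true
    return player
}

// The Swift equivalent of the loadedTimeRanges snippet in the answer:
// how many seconds from the start of the stream are currently buffered.
func bufferedSeconds(of item: AVPlayerItem) -> Double {
    guard let range = item.loadedTimeRanges.first?.timeRangeValue else { return 0 }
    return CMTimeGetSeconds(CMTimeAdd(range.start, range.duration))
}
```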

AVAudioPlayer Swift 3 not playing sound [duplicate]

Submitted by 瘦欲@ on 2021-02-04 21:00:26
Question: This question already has an answer here: iOS Swift: Sound not playing (1 answer). Closed 3 years ago. I added the AVFoundation.framework to my project. In my project navigator I added the file "Horn.mp3", a sound of 1 second. When a button (with an image of a horn) is pressed, the sound should play and a label should also change its text. The label is changing its text, but the sound isn't playing. This is my code: import UIKit import AVFoundation class ViewController: UIViewController
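The most common cause of this symptom is declaring the AVAudioPlayer inside the button action, so ARC deallocates it before any sound comes out. A sketch of the fix, assuming an outlet named `statusLabel` (hypothetical, the question does not show the label's name):

```swift
import UIKit
import AVFoundation

class ViewController: UIViewController {
    @IBOutlet weak var statusLabel: UILabel!
    // Stored property: the strong reference outlives the tap handler,
    // so the player is not deallocated mid-playback.
    private var player: AVAudioPlayer?

    @IBAction func hornTapped(_ sender: UIButton) {
        statusLabel.text = "Honk!"
        guard let url = Bundle.main.url(forResource: "Horn", withExtension: "mp3") else {
            print("Horn.mp3 not found in bundle"); return
        }
        do {
            player = try AVAudioPlayer(contentsOf: url)
            player?.prepareToPlay()
            player?.play()
        } catch {
            print("Playback failed: \(error)")
        }
    }
}
```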

Swift: AudioPlayerDidFinished will not be called

Submitted by 二次信任 on 2021-01-28 07:30:44
Question: My audioPlayerDidFinishPlaying will not be called after the audio has finished. I know it has something to do with my delegate, but I can't fix it on my own. Can somebody give me some tips? I Googled a lot and found other questions here with the same issue, but their fixes didn't work for me. Thanks. import UIKit import Parse import AVFoundation class ViewControllerMies: UIViewController, AVAudioPlayerDelegate { var timer = NSTimer() var player: AVAudioPlayer = AVAudioPlayer() var currentStateAudio = ""
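The two usual requirements (a sketch, since the question's full code is cut off): the player's `delegate` must be assigned, and the player must stay strongly referenced until the callback fires.

```swift
import AVFoundation

// audioPlayerDidFinishPlaying(_:successfully:) only fires when the delegate
// is set AND the player survives until the end of the audio.
final class PlaybackController: NSObject, AVAudioPlayerDelegate {
    private var player: AVAudioPlayer?

    func play(url: URL) throws {
        let p = try AVAudioPlayer(contentsOf: url)
        p.delegate = self      // without this line the callback never runs
        p.play()
        player = p             // strong reference until playback finishes
    }

    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        print("finished, success: \(flag)")
        self.player = nil
    }
}
```

Note the method name must match the delegate protocol exactly (lowercase `audioPlayerDidFinishPlaying`); a capitalized variant like the one in the title compiles but is never called.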

AVAudioPlayer NOT playing through speaker after recording with AVAudioRecorder

Submitted by 可紊 on 2021-01-07 03:26:58
Question: I'm working on an app that does audio recording and playback of recorded audio. I'm using AVAudioSession to change the category to playAndRecord , as well as passing the defaultToSpeaker option. My problem is: if I launch the app and play an earlier recording, it plays through the bottom (louder) speaker as I want and expect, BUT if I launch the app, record a new memo, and then play it back, no matter what I do it will always use the quieter (phone-call) speaker that's next to front
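A sketch of the workaround that usually resolves this: after recording, the play-and-record session tends to route output to the receiver (earpiece), so re-apply `.defaultToSpeaker` and explicitly override the output port before each playback.

```swift
import AVFoundation

// Force playback through the loud bottom speaker even right after a
// recording session rerouted output to the earpiece.
func playThroughSpeaker(url: URL) throws -> AVAudioPlayer {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.defaultToSpeaker])
    try session.setActive(true)
    // Belt and braces: override the current route as well, since
    // .defaultToSpeaker only applies when no other route is active.
    try session.overrideOutputAudioPort(.speaker)

    let player = try AVAudioPlayer(contentsOf: url)
    player.play()
    return player          // caller must keep a strong reference
}
```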