AVFoundation

Switching between headphone & speaker on iPhone

Submitted 2019-12-11 04:42:19
Question: I am trying to set up the audio routing for an iPhone app's output. I am using a route change listener to detect when the audio route has changed. The listener detects the changes, such as when the headphones are plugged in and unplugged. By default the speaker plays audio; when I plug my headphones in, the audio comes through the headphones fine. From there on, no further changes take effect, even though the route change listener is detecting them. Any help would be really appreciated. NSError
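The question's own listener code is truncated above, so the following is only a minimal sketch of the usual pattern, not the asker's implementation: observe AVAudioSession's route-change notification and explicitly override the output port when a device appears or disappears. The class name RouteObserver and the chosen category/options are assumptions.

import AVFoundation

final class RouteObserver {
    init() {
        let session = AVAudioSession.sharedInstance()
        // Assumed session setup; .defaultToSpeaker only applies to .playAndRecord.
        try? session.setCategory(.playAndRecord, mode: .default,
                                 options: [.defaultToSpeaker, .allowBluetooth])
        try? session.setActive(true)

        NotificationCenter.default.addObserver(self,
                                               selector: #selector(routeChanged(_:)),
                                               name: AVAudioSession.routeChangeNotification,
                                               object: nil)
    }

    @objc private func routeChanged(_ note: Notification) {
        guard let raw = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt,
              let reason = AVAudioSession.RouteChangeReason(rawValue: raw) else { return }

        let session = AVAudioSession.sharedInstance()
        switch reason {
        case .newDeviceAvailable:
            // Headphones plugged in: clear any speaker override so audio follows them.
            try? session.overrideOutputAudioPort(.none)
        case .oldDeviceUnavailable:
            // Headphones unplugged: push output back to the built-in speaker.
            try? session.overrideOutputAudioPort(.speaker)
        default:
            break
        }
    }
}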

Xcode 9/Swift 4 AVCaptureMetadataOutput setMetadataObjectTypes use availableMetadataObjectTypes

Submitted 2019-12-11 04:06:25
Question: There seem to be a lot of issues similar to what I am experiencing: "AVmetadata changes with swift 4 xcode 9", "AVCaptureMetadataOutput setMetadataObjectTypes unsupported type found", and an Apple bug that deals with AVFoundation: https://forums.developer.apple.com/thread/86810#259270. But none of those seem to actually be the answer for me. I have code that runs great in Swift 3, but only errors out in Swift 4. Using the solutions in the links above results in no change at all. Code:
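The asker's code is cut off above. As a hedged sketch of the fix that the linked threads converge on (assumed here, since the original code is not visible): assign metadataObjectTypes only after the output has been added to the session, because availableMetadataObjectTypes is empty until then, and filter the requested types against it. The function name and the chosen barcode types are illustrative.

import AVFoundation

func configureMetadataOutput(on session: AVCaptureSession,
                             delegate: AVCaptureMetadataOutputObjectsDelegate) {
    let output = AVCaptureMetadataOutput()
    guard session.canAddOutput(output) else { return }
    session.addOutput(output)                       // must happen before setting the types

    output.setMetadataObjectsDelegate(delegate, queue: .main)

    let wanted: [AVMetadataObject.ObjectType] = [.qr, .ean13, .code128]
    // Keep only the types the output actually reports as available,
    // which avoids the "unsupported type found" exception in Swift 4.
    output.metadataObjectTypes = wanted.filter { output.availableMetadataObjectTypes.contains($0) }
}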

Load audio in DOCUMENTS when UITableView “cell” is pressed

Submitted 2019-12-11 04:04:09
Question: I have a UITableView displaying a list of files in the Documents directory, and I want it to play the selected audio file from the Documents directory when a row is pressed. It runs with no errors but doesn't play the audio when pressed. - (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath { NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); NSString *documentsDirectory = [paths objectAtIndex:0]; NSString *fileName =
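The question's Objective-C is truncated, so here is only a minimal Swift sketch of the same idea, under the assumption that the silent playback comes from the AVAudioPlayer being a local variable that is released before it can play. The fileNames array and class name are placeholders.

import UIKit
import AVFoundation

class DocumentsAudioViewController: UITableViewController {
    var fileNames: [String] = []        // assumed backing data for the table
    var player: AVAudioPlayer?          // strong reference keeps playback alive

    override func tableView(_ tableView: UITableView, didSelectRowAt indexPath: IndexPath) {
        // Build the file URL from the Documents directory.
        let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
        let fileURL = documents.appendingPathComponent(fileNames[indexPath.row])
        do {
            player = try AVAudioPlayer(contentsOf: fileURL)
            player?.prepareToPlay()
            player?.play()
        } catch {
            print("Could not load \(fileURL.lastPathComponent): \(error)")
        }
    }
}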

Xcode 6 - Press button to play sound

Submitted 2019-12-11 03:51:48
Question: Looking for some help trying to get my button to play a sound. I've looked at over 50 tutorials, but all of them are outdated and do not work with the new Xcode. Does anyone have a good resource to learn from? I'm fairly new to Objective-C. I've looked through the Apple documentation for the audio frameworks and my brain cells committed suicide. Any help or pointers in the right direction would be greatly appreciated. Answer 1: in viewController.h #import <AudioToolbox/AudioToolbox.h> #import <AVFoundation
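The accepted answer above continues in Objective-C but is truncated. As a hedged modern equivalent (not the answerer's code), the same idea in Swift: bundle a short sound file and play it from a button action. The file name "tap.mp3" and class name are placeholders.

import UIKit
import AVFoundation

class SoundButtonViewController: UIViewController {
    var player: AVAudioPlayer?   // strong reference so the sound isn't deallocated mid-play

    @IBAction func playTapped(_ sender: UIButton) {
        guard let url = Bundle.main.url(forResource: "tap", withExtension: "mp3") else {
            print("tap.mp3 not found in the bundle")
            return
        }
        player = try? AVAudioPlayer(contentsOf: url)
        player?.play()
    }
}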

Get meta data displayed in MPNowPlayingInfoCenter's nowPlayingInfo(lock screen and remote control)

Submitted 2019-12-11 03:46:05
Question: Thanks for noticing this question. I want to do something with music recommendation, and what I am doing now is leveraging MPNowPlayingInfoCenter's nowPlayingInfo, like this: NSDictionary *metaData = [[MPNowPlayingInfoCenter defaultCenter] nowPlayingInfo]; NSString *songTitle = metaData[MPMediaItemPropertyTitle]; NSString *albumnTitle = metaData[MPMediaItemPropertyAlbumTitle]; NSString *artist = metaData[MPMediaItemPropertyArtist]; But it always returns nil when the "Music" app is playing
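MPNowPlayingInfoCenter's nowPlayingInfo reflects only what the current app itself has published, which is why it reads nil while the Music app is playing. A hedged alternative sketch (an assumption about what the asker needs, not part of the original thread): query the system music player instead, which on recent iOS requires the NSAppleMusicUsageDescription key and user consent.

import MediaPlayer

// Reads what the built-in Music app is currently playing, if anything.
func currentMusicAppTrack() -> (title: String?, album: String?, artist: String?) {
    let item = MPMusicPlayerController.systemMusicPlayer.nowPlayingItem
    return (item?.title, item?.albumTitle, item?.artist)
}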

AVAudioUnitEQ / .BandPass filter doesn't work

Submitted 2019-12-11 03:26:41
Question: I can't get the AVAudioUnitEQ to work. Here's a piece of code that should filter out everything except 659.255 Hz +/- 0.05 octaves: // Create Audio Engine var audioEngine = AVAudioEngine() // Create Equalizer Node var equalizerNode = AVAudioUnitEQ(numberOfBands: 1) var equalizerParameters: AVAudioUnitEQFilterParameters = equalizerNode.bands.first as AVAudioUnitEQFilterParameters equalizerParameters.filterType = .BandPass equalizerParameters.frequency = 659.255 equalizerParameters.bandwidth = 0
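The snippet above is cut off before the engine wiring, so the following is only a sketch of a complete setup under two assumptions that commonly explain a "silent" EQ: each band starts out bypassed and must be enabled explicitly, and the node only filters audio that is actually routed through it. The 0.1-octave bandwidth is an assumed value for the +/- 0.05 octave requirement.

import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let eq = AVAudioUnitEQ(numberOfBands: 1)

let band = eq.bands[0]
band.filterType = .bandPass
band.frequency = 659.255          // E5, as in the question
band.bandwidth = 0.1              // width in octaves (assumed)
band.bypass = false               // bands are bypassed by default

engine.attach(player)
engine.attach(eq)
// Route the player through the EQ into the main mixer so the filter is applied.
engine.connect(player, to: eq, format: nil)
engine.connect(eq, to: engine.mainMixerNode, format: nil)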

AVComposition doesn't play via Airplay Video

Submitted 2019-12-11 03:18:41
Question: My AVMutableComposition, which contains two locally stored video files, plays fine on the iPad but cannot play via AirPlay Video. My AVPlayer implementation works fine for regular AVURLAssets over AirPlay Video, just not for AVComposition assets. Both locally stored video files play fine via AirPlay Video when loaded as individual AVURLAssets. Is this a limitation of AirPlay Video? Or is there something I need to do to the composition? Source: https://stackoverflow.com/questions/9692564
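The thread has no visible answer here. As a hedged sketch only: these are the AVPlayer flags that govern AirPlay Video behaviour, and if a composition still refuses to stream, falling back to AirPlay mirroring or pre-exporting the composition to a single file (e.g. with AVAssetExportSession) are the usual workarounds. The composition variable below is a stand-in for the asker's two-clip composition.

import AVFoundation

let composition = AVMutableComposition()   // placeholder for the asker's composition
let playerItem = AVPlayerItem(asset: composition)
let player = AVPlayer(playerItem: playerItem)
player.allowsExternalPlayback = true
player.usesExternalPlaybackWhileExternalScreenIsActive = true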

Remote mp3 file taking a lot of time to play in swift ios

Submitted 2019-12-11 02:54:35
Question: I'm in trouble. I want to play a remote MP3 file in my app, but the MP3 file takes a long time (approximately 5-6 minutes) to start playing. Why? Can anyone suggest what I should do? import UIKit import AVFoundation class TestViewController: UIViewController, AVAudioPlayerDelegate { var player:AVAudioPlayer = AVAudioPlayer() override func viewDidLoad() { super.viewDidLoad() } @IBAction func play(sender: AnyObject) { let url = "http://www.example.com/song.mp3" let fileURL = NSURL(string: url
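AVAudioPlayer has no streaming support, so it cannot start until the entire remote file has been downloaded, which explains the long delay. A minimal sketch of the streaming alternative with AVPlayer, reusing the placeholder URL from the question:

import AVFoundation

var streamPlayer: AVPlayer?   // keep a strong reference for the lifetime of playback

func playRemoteMP3() {
    guard let url = URL(string: "http://www.example.com/song.mp3") else { return }
    streamPlayer = AVPlayer(url: url)   // streams and starts playback much sooner
    streamPlayer?.play()
}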

AVAudioPlayer doesn't load sound

Submitted 2019-12-11 02:42:50
Question: For whatever reason, I'm having trouble loading a sound with the AVAudioPlayer. I am testing it on my iPad device. My code is fairly straightforward, nothing fancy. The framework is imported and the delegate is implemented. In my viewDidLoad method: NSString *path = [[NSBundle mainBundle] pathForResource:@"sound" ofType:@"mp3"]; AVAudioPlayer *theAudio = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL]; [theAudio setDelegate:self]; [theAudio prepareToPlay];
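Two likely culprits in the snippet above are that theAudio is a local variable, so ARC releases it before it plays, and that passing error:NULL hides whatever failed. A hedged Swift sketch of the same setup with a retained player and a surfaced error (class and resource names match the question; everything else is assumed):

import UIKit
import AVFoundation

class SoundViewController: UIViewController, AVAudioPlayerDelegate {
    var audioPlayer: AVAudioPlayer?   // property, so the player outlives viewDidLoad

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let url = Bundle.main.url(forResource: "sound", withExtension: "mp3") else {
            print("sound.mp3 is missing from the bundle")
            return
        }
        do {
            audioPlayer = try AVAudioPlayer(contentsOf: url)
            audioPlayer?.delegate = self
            audioPlayer?.prepareToPlay()
        } catch {
            print("AVAudioPlayer failed to load: \(error)")
        }
    }
}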

Convert a UIImage to a CIImage to crop to a CGRect. AVFoundation

Submitted 2019-12-11 01:53:04
Question: So I have a UIImage which I want to crop. I looked around and found the imageByCroppingToRect method on CIImage. So I converted the data to a CIImage instead of a UIImage, cropped it using that method, converted the resulting CIImage back to a UIImage, and then displayed it in a UIImageView. My code is: NSData *data = [[NSData alloc]initWithData:[def objectForKey:@"imageData"]]; //UIImage *normalImage = [[UIImage alloc]initWithData:data]; CIImage *originalImage = [CIImage imageWithData:data];
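The rest of the asker's code is cut off, so the following is only a hedged Swift sketch of the full round trip: crop the CIImage, then render it back through a CIContext, since wrapping the cropped CIImage directly in UIImage(ciImage:) often shows nothing in a UIImageView because no bitmap is ever rendered. The function name is illustrative; cropped(to:) is the Swift spelling of imageByCroppingToRect.

import UIKit
import CoreImage

func croppedImage(from data: Data, to rect: CGRect) -> UIImage? {
    guard let ciImage = CIImage(data: data) else { return nil }
    let cropped = ciImage.cropped(to: rect)
    // Render the cropped CIImage into a CGImage so UIImageView can display it.
    let context = CIContext()
    guard let cgImage = context.createCGImage(cropped, from: cropped.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}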