AVFoundation

How to create an App using the Single View App template where the main window does not rotate but the rest does?

倖福魔咒の submitted on 2020-01-05 05:29:07
Question: How do you create an app from the Single View App template where the main window does not rotate, but its rootViewController and everything else autorotates? Apple does this in CIFunHouse, but the relevant code is so sparsely explained that it is hard to see how they did it. If you run that sample you will see that the camera preview does not autorotate, because it was added directly to the window, while everything else does rotate. Apple uses the same technique in the native iPad camera app.
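
A minimal sketch of the approach the question describes, assuming the preview view is inserted directly into the UIWindow below the root view controller's view; the helper and property names are illustrative, not taken from CIFunHouse:

    import UIKit

    // Hypothetical helper: call once the app's window exists, e.g. from the app
    // delegate. `previewView` would host an AVCaptureVideoPreviewLayer.
    func installNonRotatingPreview(_ previewView: UIView, in window: UIWindow) {
        previewView.frame = window.bounds
        previewView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        // Inserted below the rootViewController's view, the preview is not part
        // of the view controller hierarchy and so sits outside its rotation
        // handling, while everything managed by rootViewController autorotates.
        window.insertSubview(previewView, at: 0)
    }

Because rotation is driven by the root view controller's hierarchy, a view parked directly in the window this way stays fixed while the content under rootViewController rotates normally.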

AVAssetReader reads audio, then fails

流过昼夜 submitted on 2020-01-05 00:52:29
Question: My app reads audio and plays it back in a producer/consumer setup. The consumer thread requests new samples to render to the hardware. The producer thread reads audio data from disk into its buffer using AVAssetReader, running in a loop and checking whether more samples need to be read. The producer's buffer holds 4 seconds of audio. When I instruct my app to buffer audio, samples are read successfully without error. When I trigger my producer thread to begin rendering audio …
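
For context, a minimal sketch of a producer-side read loop with AVAssetReader, assuming linear-PCM output; the hand-off to the consumer's buffer is omitted, since none of that plumbing appears in the question:

    import AVFoundation

    func readAudio(from asset: AVAsset) throws {
        guard let track = asset.tracks(withMediaType: .audio).first else { return }
        let reader = try AVAssetReader(asset: asset)
        let output = AVAssetReaderTrackOutput(track: track,
                                              outputSettings: [AVFormatIDKey: kAudioFormatLinearPCM])
        reader.add(output)
        guard reader.startReading() else { return }

        // Pull sample buffers until the reader finishes or fails.
        while reader.status == .reading {
            guard let sampleBuffer = output.copyNextSampleBuffer() else { break }
            _ = sampleBuffer   // hand the buffer to the consumer's ring buffer here
        }
        if reader.status == .failed {
            // In the scenario above, this is where the failure would surface once
            // playback starts; reader.error carries the underlying reason.
            print("AVAssetReader failed: \(String(describing: reader.error))")
        }
    }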

Using AVAudioPlayer across multiple scenes in Swift and being able to adjust the volume

自作多情 submitted on 2020-01-04 17:59:51
Question: Hi, I was wondering whether there is a way to use an AVAudioPlayer across multiple scenes and change its volume. I'm currently placing it in my gameViewController, which is the controller for all of my application's scenes:

    override func viewDidAppear(animated: Bool) {
        let backgroundMusicURL = NSBundle.mainBundle().URLForResource("tempMusic.mp3", withExtension: nil)
        backgroundMusicPlayer = AVAudioPlayer(contentsOfURL: backgroundMusicURL, error: nil)
        backgroundMusicPlayer.numberOfLoops = -1
    …
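
One common way to do this, sketched below as an assumption rather than a verified fix for the asker's project: keep the player in a shared object that outlives any single scene and expose a volume setter (the class name and music file name are illustrative):

    import AVFoundation

    final class BackgroundMusic {
        static let shared = BackgroundMusic()
        private var player: AVAudioPlayer?

        func start() {
            guard player == nil,
                  let url = Bundle.main.url(forResource: "tempMusic", withExtension: "mp3") else { return }
            player = try? AVAudioPlayer(contentsOf: url)
            player?.numberOfLoops = -1   // loop indefinitely
            player?.play()
        }

        func setVolume(_ volume: Float) {
            player?.volume = volume      // 0.0 ... 1.0
        }
    }

Any scene can then call BackgroundMusic.shared.setVolume(0.3) without owning the player itself.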

Exporting a mirrored video with AVMutableComposition causes resizing issues

心已入冬 submitted on 2020-01-04 06:14:56
Question: Everything works as expected if I turn off mirroring on the front camera. However, if I turn it on, my final exported video has serious resizing problems. This is how I currently manage mirroring for my videos:

    if currentDevice == frontCamera {
        if let connection = output.connections.first {
            if connection.isVideoMirroringSupported {
                connection.automaticallyAdjustsVideoMirroring = false
                connection.isVideoMirrored = true // if true, this bug occurs
            }
        }
    } else {
        // disabling photo mirroring
    …
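
A hedged sketch of one alternative: leave the capture connection unmirrored and bake the mirroring into the video composition's layer instruction at export time instead. This is not confirmed as the asker's fix, and the transform assumes the track's natural size is already upright:

    import AVFoundation
    import CoreGraphics

    func mirroredInstruction(for sourceTrack: AVAssetTrack,
                             in compositionTrack: AVMutableCompositionTrack) -> AVMutableVideoCompositionLayerInstruction {
        let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: compositionTrack)
        let size = sourceTrack.naturalSize
        // Flip horizontally within the track's own width.
        let mirror = CGAffineTransform(scaleX: -1, y: 1).translatedBy(x: -size.width, y: 0)
        instruction.setTransform(sourceTrack.preferredTransform.concatenating(mirror), at: .zero)
        return instruction
    }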

How to get the system volume on iOS?

夙愿已清 submitted on 2020-01-04 04:18:07
Question: I found an example here, but it does not work on iOS 6.1.3 (iPhone 4S); it always returns 0.187500. Code:

    Float32 volume;
    UInt32 dataSize = sizeof(Float32);
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    AudioSessionGetProperty(kAudioSessionProperty_CurrentHardwareOutputVolume, &dataSize, &volume);
    NSLog(@"%f", volume);

[AVAudioSession sharedInstance].outputVolume also returns 0.187500.

Answer 1: You need to initialize the audio session with AudioSessionInitialize(NULL, NULL, NULL, NULL); in …
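
The C AudioSession API used above has long since been deprecated; a minimal modern sketch is to activate the shared AVAudioSession and read outputVolume, which is also KVO-observable if the app needs to react to volume changes:

    import AVFoundation

    func currentSystemVolume() -> Float {
        let session = AVAudioSession.sharedInstance()
        // Reading outputVolume before the session is active can return a stale value.
        try? session.setActive(true)
        return session.outputVolume
    }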

AVAssetExportSession fails on iOS 13 when muxing together audio and video

∥☆過路亽.° submitted on 2020-01-04 00:30:09
Question: This code works (and still does) on all pre-iOS 13 devices. Currently, however, I am getting this error after the exportAsynchronously call runs:

    Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed"
    UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12735),
    NSLocalizedDescription=The operation could not be completed,
    NSUnderlyingError=0x282e194a0 {Error Domain=NSOSStatusErrorDomain Code=-12735 "(null)"}}

Unsure if iOS 13 adds/changes some …
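
For reference, a generic sketch of the kind of audio/video mux the question describes; this is not the asker's code, and the comments only name common culprits worth ruling out, not a confirmed cause of the -11800/-12735 error:

    import AVFoundation

    func mux(video videoAsset: AVAsset, audio audioAsset: AVAsset, to outputURL: URL) {
        let composition = AVMutableComposition()
        guard
            let videoTrack = videoAsset.tracks(withMediaType: .video).first,
            let audioTrack = audioAsset.tracks(withMediaType: .audio).first,
            let compVideo = composition.addMutableTrack(withMediaType: .video,
                                                        preferredTrackID: kCMPersistentTrackID_Invalid),
            let compAudio = composition.addMutableTrack(withMediaType: .audio,
                                                        preferredTrackID: kCMPersistentTrackID_Invalid)
        else { return }

        let range = CMTimeRange(start: .zero, duration: videoAsset.duration)
        try? compVideo.insertTimeRange(range, of: videoTrack, at: .zero)
        try? compAudio.insertTimeRange(range, of: audioTrack, at: .zero)

        guard let export = AVAssetExportSession(asset: composition,
                                                presetName: AVAssetExportPresetHighestQuality) else { return }
        export.outputURL = outputURL          // exporting fails if this file already exists
        export.outputFileType = .mp4
        export.exportAsynchronously {
            if export.status == .failed {
                print("Export failed: \(String(describing: export.error))")
            }
        }
    }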

Losing “Now Playing” status from MPRemoteCommandCenter

主宰稳场 submitted on 2020-01-03 21:08:22
Question: I am creating an iOS application that can be controlled using MPRemoteCommandCenter. This works fine. When I change the application's AVAudioSession category from plain AVAudioSessionCategoryPlayback to AVAudioSessionCategoryPlayback with the .MixWithOthers option, it stops receiving remote control events; that is expected. But when I change the category back to plain AVAudioSessionCategoryPlayback, I do not receive events from MPRemoteCommandCenter as I would expect. How can I reclaim "Now Playing" status …
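
A hedged sketch of the usual steps for getting remote events back after a category switch: re-set the category without the mixing option, re-activate the session, and make sure Now Playing info and commands are still published. Whether this alone resolves the asker's case is not confirmed by the question:

    import AVFoundation
    import MediaPlayer
    import UIKit

    func reclaimNowPlaying(title: String) throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .default, options: [])  // drop .mixWithOthers
        try session.setActive(true)

        UIApplication.shared.beginReceivingRemoteControlEvents()
        MPNowPlayingInfoCenter.default().nowPlayingInfo = [MPMediaItemPropertyTitle: title]
        MPRemoteCommandCenter.shared().playCommand.isEnabled = true
    }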

AVAssetWriter failing to encode video with AVVideoProfileLevelKey

こ雲淡風輕ζ submitted on 2020-01-03 20:05:56
Question: Can anyone help me out with this problem? I am able to encode a video using AVAssetWriter with the following output settings:

    NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        AVVideoCodecH264, AVVideoCodecKey,
        [NSNumber numberWithInteger:dimensions.width], AVVideoWidthKey,
        [NSNumber numberWithInteger:dimensions.height], AVVideoHeightKey,
        [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInteger:bitsPerSecond], AVVideoAverageBitRateKey,
            [NSNumber …
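
A Swift sketch of equivalent writer settings for comparison; the comment reflects an assumption about this class of failure rather than a confirmed diagnosis of the question:

    import AVFoundation

    func makeVideoWriterInput(width: Int, height: Int, bitsPerSecond: Int) -> AVAssetWriterInput {
        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height,
            AVVideoCompressionPropertiesKey: [
                AVVideoAverageBitRateKey: bitsPerSecond,
                // The profile/level must be able to carry the chosen dimensions,
                // bitrate, and frame rate, or the writer will fail to encode.
                AVVideoProfileLevelKey: AVVideoProfileLevelH264HighAutoLevel
            ]
        ]
        return AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    }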

AVFoundation - Videos merge but only the last video plays

瘦欲@ submitted on 2020-01-03 17:28:00
Question: I have an array of [AVAsset](). Whenever I record several videos of different durations, the code below merges all the durations into one video, but it only plays the last video in a loop. For example, video1 is 1 minute and shows a dog walking, video2 is 1 minute and shows a bird flying, and video3 is 1 minute and shows a horse running. The merged video plays for 3 minutes, but it shows only the horse running, three consecutive times for 1 minute each. Where am I going wrong?

    var …
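
One frequent cause of this symptom (an assumption here, since the asker's code is cut off) is mishandling the insertion time when appending clips to the composition track. The sketch below keeps a running insertion time so each clip occupies its own segment of the timeline:

    import AVFoundation

    func merge(_ assets: [AVAsset]) -> AVMutableComposition {
        let composition = AVMutableComposition()
        let videoTrack = composition.addMutableTrack(withMediaType: .video,
                                                     preferredTrackID: kCMPersistentTrackID_Invalid)
        var insertTime = CMTime.zero
        for asset in assets {
            if let track = asset.tracks(withMediaType: .video).first {
                let range = CMTimeRange(start: .zero, duration: asset.duration)
                try? videoTrack?.insertTimeRange(range, of: track, at: insertTime)
            }
            insertTime = CMTimeAdd(insertTime, asset.duration)   // advance past this clip
        }
        return composition
    }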