avfoundation

AVVideoCompositionCoreAnimationTool not adding all CALayers

Submitted by 烂漫一生 on 2019-12-12 01:27:12
Question: Okay, this one has me completely stumped. I'm happy to post other code if you need it, but I think this is enough. I cannot for the life of me figure out why things are going wrong. I'm adding CALayers, which contain images, to a composition using AVVideoCompositionCoreAnimationTool. I create an NSArray of all the annotations (see interface below) I want to add and then add them to the animation layer with an enumerator. No matter how many annotations are in the array, as far as I can tell, …
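
The snippet is cut off above, but the usual shape of this setup is worth sketching. Below is a minimal, hypothetical Swift sketch (names such as annotationImages and videoSize are assumptions, not from the question) of adding several image CALayers on top of a video via AVVideoCompositionCoreAnimationTool; a common cause of "missing" layers is a reused layer instance or an unset frame, which leaves layers invisible rather than absent:

    import AVFoundation
    import UIKit

    func makeAnimationTool(annotationImages: [UIImage],
                           videoSize: CGSize) -> AVVideoCompositionCoreAnimationTool {
        let videoLayer = CALayer()
        videoLayer.frame = CGRect(origin: .zero, size: videoSize)

        let parentLayer = CALayer()
        parentLayer.frame = videoLayer.frame
        parentLayer.addSublayer(videoLayer)

        // Each annotation gets its own layer instance with an explicit frame.
        for (index, image) in annotationImages.enumerated() {
            let annotationLayer = CALayer()
            annotationLayer.contents = image.cgImage
            annotationLayer.frame = CGRect(x: CGFloat(index) * 50, y: 50,
                                           width: 100, height: 100)
            parentLayer.addSublayer(annotationLayer)
        }

        return AVVideoCompositionCoreAnimationTool(
            postProcessingAsVideoLayer: videoLayer, in: parentLayer)
    }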

Why can't I rely on NSNotificationCenter using AVFoundation to loop video?

Submitted by 余生颓废 on 2019-12-11 23:19:31
Question: I loop my video, played with AVFoundation, with the help of NSNotificationCenter and playerItemDidReachEnd: AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil]; self.playerItem = [AVPlayerItem playerItemWithAsset:asset]; self.avPlayer = [AVPlayer playerWithPlayerItem:self.playerItem]; [self.avPlayer addObserver:self forKeyPath:@"status" options:0 context:AVMoviePlayerViewControllerStatusObservationContext]; [[NSNotificationCenter defaultCenter] addObserver:self selector: …
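
For reference, a minimal Swift sketch of the looping pattern the question describes: observe AVPlayerItemDidPlayToEndTime and seek back to zero. The LoopingPlayer class name is an assumption; one frequent pitfall is passing nil as the notification's object, which delivers end-of-play notifications from every player item:

    import AVFoundation

    final class LoopingPlayer: NSObject {
        let player: AVPlayer

        init(url: URL) {
            let item = AVPlayerItem(asset: AVURLAsset(url: url))
            player = AVPlayer(playerItem: item)
            super.init()
            // The notification is posted per item; passing the item as `object`
            // filters out end-of-play notifications from other items.
            NotificationCenter.default.addObserver(
                self,
                selector: #selector(playerItemDidReachEnd(_:)),
                name: .AVPlayerItemDidPlayToEndTime,
                object: item)
        }

        @objc private func playerItemDidReachEnd(_ note: Notification) {
            player.seek(to: .zero)
            player.play()
        }

        deinit {
            NotificationCenter.default.removeObserver(self)
        }
    }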

Resume AVPlayer after forwardPlaybackEndTime

Submitted by 余生长醉 on 2019-12-11 21:02:21
Question: I've created an AVPlayer and set the forwardPlaybackEndTime to make a local video stop at a given timestamp. Sure enough, the video stops at the time I've requested. All good. Now I want the video to continue when triggered by a user action (touching a button, for example). Unfortunately, I can't seem to make that happen without the video restarting from the beginning. I'll spare you all of the AVPlayer setup code (which is mostly taken from the AV Foundation Programming Guide), but given …
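
A minimal sketch, assuming an AVPlayer named player, of one way to resume: once forwardPlaybackEndTime is reached the item is treated as finished, so a plain play() restarts from the beginning; moving (or clearing) the end time and seeking back first avoids that:

    import AVFoundation

    func resume(_ player: AVPlayer, newEndTime: CMTime) {
        guard let item = player.currentItem else { return }

        let current = item.currentTime()
        // Push the end time out (or assign .invalid to clear the limit);
        // while the old end time is in effect the item counts as finished.
        item.forwardPlaybackEndTime = newEndTime

        // Seek back to where playback stopped, then continue.
        player.seek(to: current, toleranceBefore: .zero, toleranceAfter: .zero) { _ in
            player.play()
        }
    }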

PSA: [[AVAudioPlayer alloc] init] crashes in iOS 13.1+ (and AVAudioPlayer())

Submitted by 狂风中的少年 on 2019-12-11 19:10:31
Question: One of our apps uses an AudioManager class which handles playback of various audio files and creates two AVAudioPlayer instances on init. Many of our methods work under the assumption that these two instances will never be nil, so we use the generic init method to create them when the manager class's singleton is initialized, like this: - (id)init { self = [super init]; if (self) { _pushPlayer = [[AVAudioPlayer alloc] init]; _queuePlayer = [[AVAudioPlayer alloc] init]; } …
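
A minimal Swift sketch of one workaround for the crash described above: avoid the bare AVAudioPlayer() initializer entirely and create the players from a real (possibly silent) asset instead. The silence.caf resource name is an assumption:

    import AVFoundation

    final class AudioManager {
        static let shared = AudioManager()

        private(set) var pushPlayer: AVAudioPlayer?
        private(set) var queuePlayer: AVAudioPlayer?

        private init() {
            // Initialize from an actual asset so the empty designated
            // initializer (which throws/crashes on iOS 13.1+) is never used.
            if let silenceURL = Bundle.main.url(forResource: "silence",
                                                withExtension: "caf") {
                pushPlayer = try? AVAudioPlayer(contentsOf: silenceURL)
                queuePlayer = try? AVAudioPlayer(contentsOf: silenceURL)
            }
        }
    }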

Interpreting AudioBuffer.mData to display audio visualization

Submitted by 馋奶兔 on 2019-12-11 18:31:58
Question: I am trying to process audio data in real time so that I can display an on-screen spectrum analyzer/visualization based on sound input from the microphone. I am using AVFoundation's AVCaptureAudioDataOutputSampleBufferDelegate to capture the audio data, which triggers the delegate function captureOutput. Function below: func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) { autoreleasepool { guard captureOutput != nil …
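
A minimal sketch, assuming 16-bit interleaved PCM input, of pulling the samples out of the CMSampleBuffer's AudioBuffer.mData and reducing them to an RMS level that could drive a visualization:

    import AVFoundation
    import Accelerate

    func rmsLevel(from sampleBuffer: CMSampleBuffer) -> Float? {
        var bufferList = AudioBufferList()
        var blockBuffer: CMBlockBuffer?

        // Expose the sample buffer's audio through an AudioBufferList whose
        // mData we can read directly.
        let status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
            sampleBuffer,
            bufferListSizeNeededOut: nil,
            bufferListOut: &bufferList,
            bufferListSize: MemoryLayout<AudioBufferList>.size,
            blockBufferAllocator: nil,
            blockBufferMemoryAllocator: nil,
            flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
            blockBufferOut: &blockBuffer)
        guard status == noErr, let data = bufferList.mBuffers.mData else { return nil }

        let sampleCount = Int(bufferList.mBuffers.mDataByteSize) / MemoryLayout<Int16>.size
        let samples = data.assumingMemoryBound(to: Int16.self)

        // Convert Int16 -> Float, then take the root-mean-square with vDSP.
        var floats = [Float](repeating: 0, count: sampleCount)
        vDSP_vflt16(samples, 1, &floats, 1, vDSP_Length(sampleCount))
        var rms: Float = 0
        vDSP_rmsqv(floats, 1, &rms, vDSP_Length(sampleCount))
        return rms
    }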

Add CGImage to CGImageDestination at specific index

Submitted by 爱⌒轻易说出口 on 2019-12-11 18:11:40
Question: Is there an option to add a CGImage at a specific index inside a CGImageDestination? We tried using CGImageDestinationAddImage, but it adds images in incremental order. We are creating some expensive CGImages that we need to add at multiple indexes inside the destination. We thought about adding them to all needed "indexes" at once, saving the recreation. generator.generateCGImagesAsynchronously(forTimes: <BigArrayOfTimes>, completionHandler: { (requestedTime, expensiveCgImage, actualTime, …
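
CGImageDestination only appends, so a common workaround is to collect the asynchronously generated images keyed by their requested index and add them to the destination in one ordered pass at the end. A sketch follows; times, generator, and outputURL are illustrative assumptions:

    import AVFoundation
    import ImageIO
    import UniformTypeIdentifiers

    func writeOrderedGIF(generator: AVAssetImageGenerator,
                         times: [NSValue],
                         outputURL: URL,
                         completion: @escaping (Bool) -> Void) {
        var imagesByIndex = [Int: CGImage]()
        var remaining = times.count
        let queue = DispatchQueue(label: "gif.collect") // serializes state

        generator.generateCGImagesAsynchronously(forTimes: times) {
            requestedTime, cgImage, _, _, _ in
            queue.async {
                if let cgImage = cgImage,
                   let index = times.firstIndex(of: NSValue(time: requestedTime)) {
                    imagesByIndex[index] = cgImage
                }
                remaining -= 1
                guard remaining == 0 else { return }

                // All callbacks are in: append in index order.
                guard let destination = CGImageDestinationCreateWithURL(
                    outputURL as CFURL, UTType.gif.identifier as CFString,
                    imagesByIndex.count, nil) else { completion(false); return }
                for index in imagesByIndex.keys.sorted() {
                    CGImageDestinationAddImage(destination, imagesByIndex[index]!, nil)
                }
                completion(CGImageDestinationFinalize(destination))
            }
        }
    }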

Converting AVAudioPCMBuffer to NSData

Submitted by 吃可爱长大的小学妹 on 2019-12-11 17:13:36
Question: I'm currently trying to convert the audio samples from an AVAudioPCMBuffer to NSData. I had taken a look at the accepted answer on this SO post and this code from GitHub, but it appears some of the AVFAudio APIs have changed... Below is the extension I have for AVAudioPCMBuffer: private extension AVAudioPCMBuffer { func toNSData() -> NSData { let channels = UnsafeBufferPointer(start: int16ChannelData, count: 1) let ch0Data = NSData(bytes: channels[0], length: Int(frameCapacity * format …
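
A minimal sketch of the conversion, assuming interleaved 16-bit PCM (int16ChannelData is nil for float formats); note that frameLength, the count of valid frames, is what matters, not frameCapacity:

    import AVFoundation

    extension AVAudioPCMBuffer {
        func toData() -> Data? {
            guard let channels = int16ChannelData else { return nil }
            // For interleaved data every frame lives in channel 0;
            // mBytesPerFrame already accounts for the channel count.
            let byteCount = Int(frameLength) *
                Int(format.streamDescription.pointee.mBytesPerFrame)
            return Data(bytes: channels[0], count: byteCount)
        }
    }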

local notification not appearing (Swift4)

Submitted by 无人久伴 on 2019-12-11 16:57:02
Question: My code below is supposed to act as an alarm clock. When the date and time match, nothing happens, but I do see my print statement in the log. Please try this code and tell me what I am doing wrong. It seemed like it worked before. The alarm is supposed to still go off in the background if the user is not using the app. import UIKit; import AVFoundation; import UserNotifications class ViewController: UIViewController, UNUserNotificationCenterDelegate { var timer = Timer(); var isGrantedAccess = …
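
One likely culprit, sketched below as an assumption about the truncated code: a Timer that fires at the matching date is suspended once the app leaves the foreground, whereas a UNCalendarNotificationTrigger is delivered by the system even when the app is not running:

    import UserNotifications

    func scheduleAlarm(at date: Date) {
        let content = UNMutableNotificationContent()
        content.title = "Alarm"
        content.body = "Time to wake up"
        content.sound = .default

        // A calendar trigger fires when these components next match,
        // with no Timer (and no running app) required.
        let components = Calendar.current.dateComponents(
            [.year, .month, .day, .hour, .minute], from: date)
        let trigger = UNCalendarNotificationTrigger(dateMatching: components,
                                                    repeats: false)
        let request = UNNotificationRequest(identifier: "alarm",
                                            content: content,
                                            trigger: trigger)
        UNUserNotificationCenter.current().add(request) { error in
            if let error = error { print("Scheduling failed: \(error)") }
        }
    }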

Pausing/muting background music in iOS

Submitted by 拟墨画扇 on 2019-12-11 16:52:21
Question: I've got an app that uses text-to-speech to occasionally give verbal cues to the user. Currently I've set it up so the TTS mixes over any music that's playing in another app. Is there any way I could temporarily pause/mute music that's playing in another app while a verbal cue plays? Answer 1: You can use the MPMusicPlayerController class and its static method iPodMusicPlayer to get an instance of the system (Music app) MPMusicPlayerController object. Then you can pause, play, or anything else. The main …
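
As an alternative to the MPMusicPlayerController approach in the answer, a sketch of AVAudioSession ducking: the .duckOthers option lowers (rather than stops) the other app's audio while a spoken cue plays, restoring it afterwards:

    import AVFoundation

    final class CueSpeaker: NSObject, AVSpeechSynthesizerDelegate {
        private let synthesizer = AVSpeechSynthesizer()

        override init() {
            super.init()
            synthesizer.delegate = self
        }

        func speak(_ text: String) {
            // Activating a .duckOthers session lowers other apps' audio.
            let session = AVAudioSession.sharedInstance()
            try? session.setCategory(.playback, options: [.duckOthers])
            try? session.setActive(true)
            synthesizer.speak(AVSpeechUtterance(string: text))
        }

        // Un-duck the other app's audio once the cue has finished.
        func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer,
                               didFinish utterance: AVSpeechUtterance) {
            try? AVAudioSession.sharedInstance()
                .setActive(false, options: [.notifyOthersOnDeactivation])
        }
    }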

What kind of server is required for live streaming of video?

Submitted by 自作多情 on 2019-12-11 15:59:18
Question: I am making an iPhone application which will send video to a server for live streaming, and I wanted to know: do we require a media server for this? Answer 1: Yeah, you need to create a media server. You can send your streams to the server from the mobile device using one of the many SDKs available. For the media server: there are many ways that you can set up a server. For now, let's look at an RTMP server, which can be used with nginx. You can use HLS (HTTP Live Streaming), as stated above, with this package. Here, the …
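
A minimal sketch of the kind of nginx-rtmp configuration the answer describes: the app publishes over RTMP, and the hls block repackages the stream as HTTP Live Streaming segments that iOS clients can play back. All paths and names are illustrative only:

    # Hypothetical nginx.conf fragment for the nginx-rtmp module.
    rtmp {
        server {
            listen 1935;
            application live {
                live on;
                # Repackage incoming RTMP as HLS for playback on iOS.
                hls on;
                hls_path /tmp/hls;
                hls_fragment 3s;
            }
        }
    }

    http {
        server {
            listen 8080;
            # Serve the generated .m3u8 playlist and .ts segments.
            location /hls {
                types { application/vnd.apple.mpegurl m3u8; video/mp2t ts; }
                root /tmp;
            }
        }
    }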