avfoundation

iOS AVFoundation - Show a time display over a video and export

Submitted by Deadly on 2019-12-20 09:07:42
Question: I want to show a display overlay over a video and export the video including this overlay. I had a look at the AVFoundation framework (AVComposition, AVAsset, etc.), but I still have no idea how to achieve this. There is a class called AVSynchronizedLayer which lets you animate things synchronously with the video, but I do not want to animate; I just want to overlay the time display onto every single frame of the video. Any advice? Regards Answer 1: Something like this... (NB: culled from a much
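A common approach to burning an overlay into an exported video is AVVideoCompositionCoreAnimationTool. The sketch below is a minimal illustration, not the answer quoted above; the asset, output URL, and overlay text are placeholders.

```swift
import AVFoundation
import UIKit

// Sketch: composite a CALayer overlay into an exported video.
// A real time readout would drive `textLayer` with keyframe animations.
func exportWithOverlay(asset: AVAsset, outputURL: URL, completion: @escaping () -> Void) {
    guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }
    let size = videoTrack.naturalSize

    let textLayer = CATextLayer()
    textLayer.string = "00:00:00"        // static placeholder for the time display
    textLayer.fontSize = 36
    textLayer.frame = CGRect(x: 20, y: 20, width: 300, height: 50)

    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: size)
    let parentLayer = CALayer()
    parentLayer.frame = videoLayer.frame
    parentLayer.addSublayer(videoLayer)   // video renders here
    parentLayer.addSublayer(textLayer)    // overlay drawn on top

    let composition = AVMutableVideoComposition(propertiesOf: asset)
    composition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetHighestQuality) else { return }
    export.videoComposition = composition
    export.outputURL = outputURL
    export.outputFileType = .mp4
    export.exportAsynchronously(completionHandler: completion)
}
```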

Is it possible to cache HLS segments with AVPlayer?

Submitted by 好久不见. on 2019-12-20 08:28:58
Question: Root problem: our video buffers a lot when seeking on iOS. It buffers quite a bit more than our web player, which saves copies of already-watched segments in temporary storage. Desired solution: cache the video segments locally on the device's disk. We're fine with caching a single quality and always replaying it. Blocker: we can't find a way to perform caching within AVFoundation/AVPlayer. What we've tried: two ways to intercept networking requests with AVPlayer. Conforming to
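One commonly attempted interception technique (likely one of the two the asker alludes to) is AVAssetResourceLoaderDelegate with a custom URL scheme. Note that for HLS this delegate is only invoked for non-HTTP URLs (playlists, keys), not for segment requests AVPlayer makes itself, which is exactly the blocker described. The sketch below shows only the wiring; URLs and the scheme name are placeholders, and the cache logic is omitted.

```swift
import AVFoundation

// Route asset loading through a delegate by giving the URL a custom scheme.
final class CachingLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // Look up loadingRequest.request.url in a local cache here;
        // respond with cached bytes, or fetch, store, and then respond.
        return true  // we will fulfil the request asynchronously
    }
}

let original = URL(string: "https://example.com/stream.m3u8")!  // placeholder URL
var comps = URLComponents(url: original, resolvingAgainstBaseURL: false)!
comps.scheme = "cachingscheme"  // non-HTTP scheme makes AVPlayer consult the delegate
let asset = AVURLAsset(url: comps.url!)
let loader = CachingLoader()
asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "loader"))
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
```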

AVPlayer and MPMoviePlayerController differences [closed]

Submitted by ↘锁芯ラ on 2019-12-20 08:21:33
Question: Closed. This question is opinion-based and is not currently accepting answers. Want to improve this question? Update it so it can be answered with facts and citations by editing this post. Closed 6 years ago. I am developing an iPhone application that needs to play videos. So far, I have learned that there are at least two APIs for achieving this: AVPlayer and MPMoviePlayerController. What are the main differences? Answer 1: NOTE: as of iOS 9, Apple has deprecated the
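A key practical difference, illustrated in the minimal sketch below: AVPlayer has no UI of its own, so you pair it with AVPlayerViewController (or an AVPlayerLayer), whereas the now-deprecated MPMoviePlayerController bundled the playback view. The URL is a placeholder.

```swift
import AVKit
import AVFoundation

// AVPlayer is view-less; AVPlayerViewController supplies the playback UI.
let url = URL(string: "https://example.com/video.mp4")!  // placeholder URL
let player = AVPlayer(url: url)
let controller = AVPlayerViewController()
controller.player = player

// From a presenting UIViewController:
// present(controller, animated: true) { player.play() }
```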

iOS endless video recording

Submitted by 那年仲夏 on 2019-12-20 08:05:04
Question: I'm trying to develop an iPhone app that will use the camera to record only the last few minutes/seconds. For example, you record a movie for 5 minutes, click "save", and only the last 30 s are saved. I don't want to actually record five minutes and then chop off the last 30 s (that won't work for me). This idea is called "loop recording": the recording is endless, but you keep only the last part. The Precorder app does what I want to do. (I want to use this feature in another context.) I
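One way to approximate loop recording, sketched below under stated assumptions: record fixed-length segment files with AVCaptureMovieFileOutput, retain only the newest few, and stitch them together on save. The class name, segment length, and file management are illustrative; session setup and the periodic timer are omitted.

```swift
import AVFoundation

// Rotate short segment files; keep only enough for the last ~30 s.
final class LoopRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    let output = AVCaptureMovieFileOutput()
    private var segments: [URL] = []
    private let maxSegments = 6  // e.g. 6 segments x 5 s ≈ last 30 s

    // Call every 5 s from a timer; starting a new recording finishes the previous file.
    func startNextSegment() {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString + ".mov")
        output.startRecording(to: url, recordingDelegate: self)
    }

    func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection], error: Error?) {
        segments.append(outputFileURL)
        // Drop segments older than the window we want to keep.
        while segments.count > maxSegments {
            try? FileManager.default.removeItem(at: segments.removeFirst())
        }
    }
}
```

On "save", the retained segments could be concatenated with AVMutableComposition and trimmed to exactly 30 s before export.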

Cannot stop background music from within game scenes, Swift 3/SpriteKit

Submitted by 白昼怎懂夜的黑 on 2019-12-20 07:22:06
Question: With Xcode 8 / Swift 3 and SpriteKit, I am playing background music (a 5-minute song) from GameViewController's viewDidLoad (the parent of all the scenes, not a specific GameScene), because I want it to keep playing across scene changes without stopping. That works without a problem. But how do I stop the background music at will from inside a scene, say when the user reaches a specific score in the 3rd scene? I cannot access the methods of the
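A common pattern for this situation is to move the player out of GameViewController into a shared singleton that any scene can reach. The sketch below is illustrative; the class name, file name, and format are assumptions.

```swift
import AVFoundation

// Singleton audio manager: any SKScene can start or stop the music.
final class MusicManager {
    static let shared = MusicManager()
    private var player: AVAudioPlayer?

    func play(fileNamed name: String) {
        guard let url = Bundle.main.url(forResource: name, withExtension: "mp3") else { return }
        player = try? AVAudioPlayer(contentsOf: url)
        player?.numberOfLoops = -1  // loop indefinitely across scene transitions
        player?.play()
    }

    func stop() {
        player?.stop()
    }
}

// From any scene, e.g. when the score threshold is reached:
// MusicManager.shared.stop()
```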

Is there a way to intercept audio output from within your app to display back an audio visualizer on iOS?

Submitted by 末鹿安然 on 2019-12-20 06:48:25
Question: We're currently using the Linphone library to make VoIP calls, and it has its own solution for audio playback. However, we would like to display a visualizer in our own app for the audio Linphone is outputting. Is there a way to intercept this data (maybe through sample buffering) in order to draw audio waves / a volume meter in the user interface? AVAudioPlayer or AVPlayer is out of the question since we do not have access to those objects. Is there a solution in place
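For reference, this is what the buffer-metering technique looks like when you do control the audio path: an AVAudioEngine tap hands you PCM buffers to compute levels from. With Linphone's internal playback this tap is not directly available; the sketch only illustrates the pattern the visualizer would need if the library exposed its audio stream.

```swift
import AVFoundation
import Accelerate

// Tap a node you own and compute a dB level per buffer for a volume meter.
let engine = AVAudioEngine()
let node = engine.mainMixerNode
node.installTap(onBus: 0, bufferSize: 1024, format: node.outputFormat(forBus: 0)) { buffer, _ in
    guard let samples = buffer.floatChannelData?[0] else { return }
    var rms: Float = 0
    vDSP_rmsqv(samples, 1, &rms, vDSP_Length(buffer.frameLength))  // root-mean-square level
    let db = 20 * log10(max(rms, .leastNonzeroMagnitude))          // convert to dBFS
    // Dispatch `db` to the main queue to update the meter / waveform view.
}
```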

AVAudioSequencer Causes Crash on Deinit/Segue: 'required condition is false: outputNode'

Submitted by 非 Y 不嫁゛ on 2019-12-20 06:36:31
Question: The code below causes a crash with the following errors whenever the object is deinitialized (e.g. when performing an unwind segue back to another ViewController): required condition is false: [AVAudioEngineGraph.mm:4474:GetDefaultMusicDevice: (outputNode)] Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: outputNode' The AVAudioSequencer is the root of the issue, because the error ceases if it is removed. How can this crash be
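A mitigation sometimes reported for this class of crash (an assumption, not a confirmed fix from Apple) is to stop the sequencer and engine explicitly before the owning object is released, rather than relying on deinit ordering. The class and method names below are illustrative.

```swift
import AVFoundation

final class SequencerHolder {
    let engine = AVAudioEngine()
    lazy var sequencer = AVAudioSequencer(audioEngine: engine)

    // Call before the unwind segue, while everything is still alive.
    func teardown() {
        sequencer.stop()  // stop playback first
        engine.stop()     // then stop the engine, so teardown sees a consistent graph
    }

    deinit {
        // Relying on deinit alone to tear down the sequencer/engine pair is
        // what appears to trigger the "outputNode" assertion in some setups.
    }
}
```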

Composing Video and Audio - Video's audio is gone

Submitted by |▌冷眼眸甩不掉的悲伤 on 2019-12-20 03:14:35
Question: I am using the function below to compose a video and an audio file. I want to keep the video's original sound, but it somehow goes away, and I have no clue why. I got this function from this answer. I tried changing volumes right after appending the AVMutableCompositionTracks, but it did not work. For instance: mutableVideoCompositionTrack.preferredVolume = 1.0 mutableAudioCompositionTrack.preferredVolume = 0.05 But still, all you can hear is the audio file. The function: private func
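The usual cause of this symptom is that the video's own audio track was never inserted into the composition, and that per-track volume belongs in an AVMutableAudioMix rather than preferredVolume. The sketch below shows both steps under those assumptions; the function name and volume values are illustrative.

```swift
import AVFoundation

// Compose video + external music while keeping the video's original audio,
// then balance the two audio tracks with an AVMutableAudioMix.
func mix(videoAsset: AVAsset, musicAsset: AVAsset) throws -> (AVMutableComposition, AVMutableAudioMix) {
    let comp = AVMutableComposition()
    let range = CMTimeRange(start: .zero, duration: videoAsset.duration)

    let videoTrack = comp.addMutableTrack(withMediaType: .video,
                                          preferredTrackID: kCMPersistentTrackID_Invalid)!
    try videoTrack.insertTimeRange(range, of: videoAsset.tracks(withMediaType: .video)[0], at: .zero)

    // The step that is easy to miss: the video's ORIGINAL audio as its own track.
    let originalAudio = comp.addMutableTrack(withMediaType: .audio,
                                             preferredTrackID: kCMPersistentTrackID_Invalid)!
    try originalAudio.insertTimeRange(range, of: videoAsset.tracks(withMediaType: .audio)[0], at: .zero)

    let music = comp.addMutableTrack(withMediaType: .audio,
                                     preferredTrackID: kCMPersistentTrackID_Invalid)!
    try music.insertTimeRange(range, of: musicAsset.tracks(withMediaType: .audio)[0], at: .zero)

    let originalParams = AVMutableAudioMixInputParameters(track: originalAudio)
    originalParams.setVolume(1.0, at: .zero)
    let musicParams = AVMutableAudioMixInputParameters(track: music)
    musicParams.setVolume(0.05, at: .zero)  // duck the music under the original sound

    let audioMix = AVMutableAudioMix()
    audioMix.inputParameters = [originalParams, musicParams]
    return (comp, audioMix)
}
```

The returned audioMix must then be set on the AVPlayerItem (audioMix) or AVAssetExportSession (audioMix) for the volumes to take effect.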

Why does averagePowerForChannel always return -160?

Submitted by て烟熏妆下的殇ゞ on 2019-12-20 03:09:33
Question: I have this code, and both method calls succeed: AVAudioSession *audioSession = [AVAudioSession sharedInstance]; [audioSession setCategory: AVAudioSessionCategoryPlayAndRecord error: NULL]; [audioSession setActive: YES error: NULL]; And this code to start recording: [self.recorder prepareToRecord]; [self.recorder recordForDuration: 60]; I have a timer function to update the meters: - (void)updateMeters { [self.recorder updateMeters]; float peakPower = [self.recorder averagePowerForChannel: 0];
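The usual cause of a constant -160 dB reading is that metering is disabled by default on AVAudioRecorder; it must be enabled before updateMeters() is useful. A minimal Swift sketch of a correct setup (recording URL and format settings are placeholders):

```swift
import AVFoundation

// Configure a recorder whose averagePower(forChannel:) actually reflects the input.
func makeRecorder(url: URL) throws -> AVAudioRecorder {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord)
    try session.setActive(true)

    let recorder = try AVAudioRecorder(url: url, settings: [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 1,
    ])
    recorder.isMeteringEnabled = true  // without this, the meter stays pinned at -160
    recorder.prepareToRecord()
    return recorder
}

// In the timer callback:
// recorder.updateMeters()
// let level = recorder.averagePower(forChannel: 0)  // now reflects the signal
```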