avfoundation

How to resume background audio in Swift 2 / AVPlayer?

拈花ヽ惹草 submitted on 2019-12-03 05:17:50
Question: I am learning Swift as my first programming language. I've struggled for many hours to resume background audio playback after an interruption (e.g. a call). What should happen:
1. Audio keeps playing when the app goes to the background (works)
2. When interrupted by a call, get the notification that the interruption began (works)
3. When the call ends, get the notification that the interruption ended (works)
4. Resume playing the audio (does NOT work - hear nothing)
Really appreciate any help! Thanks. Notes: The app is registered for…
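The truncated question doesn't show the asker's handler, but the usual cause of step 4 failing is not reactivating the audio session and not explicitly calling play() when the interruption ends. A minimal sketch in current Swift (the question targets Swift 2, where these API names differ); the stream URL is a placeholder:

```swift
import AVFoundation

final class PlaybackController {
    // Placeholder URL; any playable asset works.
    let player = AVPlayer(url: URL(string: "https://example.com/stream.m4a")!)

    init() {
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(handleInterruption(_:)),
            name: AVAudioSession.interruptionNotification,
            object: AVAudioSession.sharedInstance())
    }

    @objc private func handleInterruption(_ note: Notification) {
        guard let info = note.userInfo,
              let rawType = info[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: rawType) else { return }

        switch type {
        case .began:
            break // the system has already paused playback
        case .ended:
            // AVPlayer does not restart itself: reactivate the session,
            // then resume, but only if the system says resuming is appropriate.
            let rawOptions = info[AVAudioSessionInterruptionOptionKey] as? UInt ?? 0
            if AVAudioSession.InterruptionOptions(rawValue: rawOptions).contains(.shouldResume) {
                try? AVAudioSession.sharedInstance().setActive(true)
                player.play()
            }
        @unknown default:
            break
        }
    }
}
```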

Using Apple's new AudioEngine to change Pitch of AudioPlayer sound

大兔子大兔子 submitted on 2019-12-03 05:13:30
Question: I am currently trying to get Apple's new audio engine working with my current audio setup. Specifically, I am trying to change the pitch with Audio Engine, which apparently is possible according to this post. I have also looked into other pitch-changing solutions, including Dirac and ObjectAL, but unfortunately both seem to be pretty messed up in terms of working with Swift, which I am using. My question is: how do I change the pitch of an audio file using Apple's new audio engine? I am able to…
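For reference, the standard AVAudioEngine setup routes an AVAudioPlayerNode through an AVAudioUnitTimePitch effect into the main mixer. A minimal sketch in current Swift; the class name and file URL are assumptions:

```swift
import AVFoundation

final class PitchPlayer {
    // Keep the nodes alive for the duration of playback.
    private let engine = AVAudioEngine()
    private let playerNode = AVAudioPlayerNode()
    private let pitchUnit = AVAudioUnitTimePitch()

    func play(fileAt url: URL, pitchCents: Float) throws {
        pitchUnit.pitch = pitchCents // measured in cents; +1200 is one octave up

        engine.attach(playerNode)
        engine.attach(pitchUnit)

        let file = try AVAudioFile(forReading: url)
        // player -> time/pitch effect -> main mixer
        engine.connect(playerNode, to: pitchUnit, format: file.processingFormat)
        engine.connect(pitchUnit, to: engine.mainMixerNode, format: file.processingFormat)

        playerNode.scheduleFile(file, at: nil)
        try engine.start()
        playerNode.play()
    }
}
```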

AVFoundation - Adding blur background to video

做~自己de王妃 submitted on 2019-12-03 05:10:58
I am working on a video editing app in Swift. In my case my output video looks like the following. I am trying to fill the black portion with a blur effect, exactly like this. I searched but didn't get any working solution. Any assistance would be a great help.

Swift 4 - Adding blur background to video
1. Single video support ❤️
2. Multiple videos merging support ❤️
3. Support any canvas in any ratio ❤️
4. Save final video to camera roll ❤️
5. Manage all video orientations ❤️
Maybe I'm late with this answer, but I still didn't find any other solution for this requirement, so I'm sharing my work: 👍👍 ⭐ Download…
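The linked project isn't reproduced here, but one way to get this effect is a Core Image video composition: blur a scaled-to-fill copy of each frame for the background and composite the scaled-to-fit original on top. A sketch, with `asset` and `renderSize` as assumptions:

```swift
import AVFoundation
import CoreImage

func blurBackgroundComposition(for asset: AVAsset, renderSize: CGSize) -> AVVideoComposition {
    let composition = AVMutableVideoComposition(asset: asset) { request in
        let source = request.sourceImage

        // Background: scale to fill the canvas, center, blur, then crop.
        let fill = max(renderSize.width / source.extent.width,
                       renderSize.height / source.extent.height)
        let filled = source.transformed(by: CGAffineTransform(scaleX: fill, y: fill))
        let background = filled
            .transformed(by: CGAffineTransform(
                translationX: (renderSize.width - filled.extent.width) / 2,
                y: (renderSize.height - filled.extent.height) / 2))
            .clampedToExtent() // avoid transparent edges after blurring
            .applyingGaussianBlur(sigma: 30)
            .cropped(to: CGRect(origin: .zero, size: renderSize))

        // Foreground: scale to fit and center over the blurred background.
        let fit = min(renderSize.width / source.extent.width,
                      renderSize.height / source.extent.height)
        let fitted = source.transformed(by: CGAffineTransform(scaleX: fit, y: fit))
        let centered = fitted.transformed(by: CGAffineTransform(
            translationX: (renderSize.width - fitted.extent.width) / 2,
            y: (renderSize.height - fitted.extent.height) / 2))

        request.finish(with: centered.composited(over: background), context: nil)
    }
    composition.renderSize = renderSize
    return composition
}
```

The returned composition is assigned to an AVPlayerItem's or AVAssetExportSession's videoComposition property.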

What does shouldOptimizeForNetworkUse actually do?

╄→尐↘猪︶ㄣ submitted on 2019-12-03 04:54:18
Question: The Apple documentation just says: "When the value of this property is YES, the output file will be written in such a way that playback can start after only a small amount of the file is downloaded." But what is actually happening?
Answer 1: When shouldOptimizeForNetworkUse is set to YES, calling finishWriting will move the MP4 moov atom (movie atom) from the end of the file to the beginning of the file. The moov atom contains information about the movie file, like its timescale and duration. The…
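For context, here is where the flag is set in practice. A short sketch (the property also exists on AVAssetWriter); `asset` and `outputURL` are assumed to exist:

```swift
import AVFoundation

func exportForStreaming(asset: AVAsset, to outputURL: URL) {
    guard let export = AVAssetExportSession(
        asset: asset, presetName: AVAssetExportPresetHighestQuality) else { return }
    export.outputURL = outputURL
    export.outputFileType = .mp4
    // Writes the moov atom at the start of the file so playback can begin
    // before the whole file has downloaded ("fast start").
    export.shouldOptimizeForNetworkUse = true
    export.exportAsynchronously {
        print("export finished with status: \(export.status.rawValue)")
    }
}
```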

How to Convert CMSampleBuffer/UIImage into ffmpeg's AVPicture?

自作多情 submitted on 2019-12-03 04:36:47
Question: I'm trying to encode the iPhone camera's frames into an H.264 video using ffmpeg's libav* libraries. I found in this Apple article how to convert a CMSampleBuffer to a UIImage, but how can I convert it to ffmpeg's AVPicture? Thanks.
Answer 1: Answering my own question:
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
// access the data
int width = CVPixelBufferGetWidth(pixelBuffer);
int height = CVPixelBufferGetHeight(pixelBuffer);…
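As a Swift companion to the answer's Objective-C snippet, the sketch below covers only the CoreVideo side: lock the buffer and hand out the raw pixel bytes and stride, which is what ffmpeg's C-side avpicture_fill()/sws_scale() would consume. The helper name and the BGRA assumption are hypothetical:

```swift
import AVFoundation
import CoreVideo

// Assumes a packed single-plane format such as kCVPixelFormatType_32BGRA.
func withPixelData(of sampleBuffer: CMSampleBuffer,
                   _ body: (UnsafeMutableRawPointer, Int, Int, Int) -> Void) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
    // On the C side these map to AVPicture.data[0] and AVPicture.linesize[0]
    // for AV_PIX_FMT_BGRA.
    body(base,
         CVPixelBufferGetWidth(pixelBuffer),
         CVPixelBufferGetHeight(pixelBuffer),
         CVPixelBufferGetBytesPerRow(pixelBuffer))
}
```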

Does anyone know how to implement the AVAssetResourceLoaderDelegate methods correctly?

断了今生、忘了曾经 submitted on 2019-12-03 04:36:37
Question: I am trying to coax AVFoundation into reading from a custom URL. The custom URL stuff works. The code below creates an NSData with a movie file:
NSData* movieData = [NSData dataWithContentsOfURL:[NSURL URLWithString:@"memory://video"]];
I've set up an AVAssetResourceLoader object using the following code:
NSURL* url = [NSURL URLWithString:@"memory://video"];
AVURLAsset* asset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVAssetResourceLoader* loader = [asset resourceLoader];
[loader setDelegate:self queue…
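A minimal sketch of the delegate side in Swift, assuming the whole movie is already in memory and is an MP4. Note the scheme must be one AVFoundation cannot load itself (hence "memory://"), or the delegate is never consulted:

```swift
import AVFoundation

final class MemoryLoader: NSObject, AVAssetResourceLoaderDelegate {
    let movieData: Data

    init(movieData: Data) { self.movieData = movieData }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource request: AVAssetResourceLoadingRequest) -> Bool {
        if let info = request.contentInformationRequest {
            info.contentType = "public.mpeg-4" // a UTI, not a MIME type
            info.contentLength = Int64(movieData.count)
            info.isByteRangeAccessSupported = true
        }
        if let dataRequest = request.dataRequest {
            let offset = Int(dataRequest.requestedOffset)
            guard offset < movieData.count else {
                request.finishLoading(with: NSError(domain: NSURLErrorDomain,
                                                    code: NSURLErrorDataLengthExceedsMaximum))
                return true
            }
            let length = min(dataRequest.requestedLength, movieData.count - offset)
            dataRequest.respond(with: movieData.subdata(in: offset ..< offset + length))
        }
        request.finishLoading()
        return true
    }
}

// Usage sketch:
// let asset = AVURLAsset(url: URL(string: "memory://video")!)
// asset.resourceLoader.setDelegate(loader, queue: .main)
```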

Image/Text overlay in video swift

时光总嘲笑我的痴心妄想 submitted on 2019-12-03 04:32:56
Question: I am working on an image overlay for a watermark effect in video using Swift. I am using AVFoundation for this, but somehow I am not succeeding. Following is my code for overlaying image/text:
let path = NSBundle.mainBundle().pathForResource("sample_movie", ofType:"mp4")
let fileURL = NSURL(fileURLWithPath: path!)
let composition = AVMutableComposition()
var vidAsset = AVURLAsset(URL: fileURL, options: nil)
// get video track
let vtrack = vidAsset.tracksWithMediaType(AVMediaTypeVideo)
let videoTrack…
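The usual tool for this is AVVideoCompositionCoreAnimationTool: the video renders into one CALayer and the watermark sits in a sublayer above it. A sketch in current Swift (the question uses Swift 2 names); the frame and opacity values are arbitrary, and exporting with AVAssetExportSession is assumed to happen elsewhere:

```swift
import AVFoundation
import UIKit

func watermarkComposition(for asset: AVAsset, watermark: UIImage) -> AVMutableVideoComposition? {
    guard let videoTrack = asset.tracks(withMediaType: .video).first else { return nil }
    let size = videoTrack.naturalSize

    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: size)

    // Video-layer space has a bottom-left origin, so this sits near the
    // bottom-right corner of the frame.
    let overlayLayer = CALayer()
    overlayLayer.contents = watermark.cgImage
    overlayLayer.frame = CGRect(x: size.width - 120, y: 20, width: 100, height: 100)
    overlayLayer.opacity = 0.7

    let parentLayer = CALayer()
    parentLayer.frame = CGRect(origin: .zero, size: size)
    parentLayer.addSublayer(videoLayer)   // the video frames render here
    parentLayer.addSublayer(overlayLayer) // watermark composited on top

    let videoComposition = AVMutableVideoComposition(propertiesOf: asset)
    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)
    return videoComposition
}
```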

How do I add a still image to an AVComposition?

筅森魡賤 submitted on 2019-12-03 04:30:50
Question: I have an AVMutableComposition with a video track, and I would like to add a still image into the video track, to be displayed for some given time. The still image is simply a PNG. I can load the image as an asset, but that's about it, because the resulting asset does not have any tracks and therefore cannot simply be inserted using the insertTimeRange… methods. Is there a way to add still images to a composition? It looks like the answer is somewhere in Core Animation, but the whole thing…
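One commonly suggested workaround, in the Core Animation direction the question hints at: keep the PNG out of the track entirely and show it in an overlay layer for the desired time range via an opacity animation. A sketch; the function and parameter names are hypothetical, and `videoComposition` is assumed to come from AVMutableVideoComposition(propertiesOf:):

```swift
import AVFoundation
import UIKit

func overlayImage(_ image: UIImage, from start: CFTimeInterval, duration: CFTimeInterval,
                  on videoComposition: AVMutableVideoComposition, renderSize: CGSize) {
    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: renderSize)

    let imageLayer = CALayer()
    imageLayer.contents = image.cgImage
    imageLayer.frame = CGRect(origin: .zero, size: renderSize)
    imageLayer.opacity = 0 // hidden except while the animation below is active

    // In an exported composition, "time zero" is AVCoreAnimationBeginTimeAtZero,
    // not 0 (0 means "now" to Core Animation).
    let show = CABasicAnimation(keyPath: "opacity")
    show.fromValue = 1
    show.toValue = 1
    show.beginTime = start == 0 ? AVCoreAnimationBeginTimeAtZero : start
    show.duration = duration
    show.isRemovedOnCompletion = false
    imageLayer.add(show, forKey: "show")

    let parentLayer = CALayer()
    parentLayer.frame = CGRect(origin: .zero, size: renderSize)
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(imageLayer)

    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)
}
```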

Audio playback progress as UISlider in Swift

时光总嘲笑我的痴心妄想 submitted on 2019-12-03 04:24:46
Question: I've seen some posts about accomplishing this in Objective-C, but I've been unable to do the same in Swift. Specifically, I can't figure out how to implement addPeriodicTimeObserverForInterval in the below.
var player : AVAudioPlayer! = nil
@IBAction func playAudio(sender: AnyObject) {
    playButton.selected = !(playButton.selected)
    if playButton.selected {
        let fileURL = NSURL(string: toPass)
        player = AVAudioPlayer(contentsOfURL: fileURL, error: nil)
        player.numberOfLoops = -1 // play…
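Part of the trouble here is that addPeriodicTimeObserverForInterval is AVPlayer API, not AVAudioPlayer, so the question's player would need to change type. A sketch in current Swift, assuming an AVPlayer and a UISlider outlet:

```swift
import AVFoundation
import UIKit

var timeObserver: Any?

func observeProgress(of player: AVPlayer, slider: UISlider) {
    // Fire roughly twice per second on the main queue.
    let interval = CMTime(seconds: 0.5, preferredTimescale: 600)
    timeObserver = player.addPeriodicTimeObserver(forInterval: interval,
                                                  queue: .main) { [weak player, weak slider] time in
        guard let duration = player?.currentItem?.duration,
              duration.isNumeric, duration.seconds > 0 else { return }
        slider?.value = Float(time.seconds / duration.seconds)
    }
}

// When finished, the observer must be removed:
// if let observer = timeObserver { player.removeTimeObserver(observer) }
```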

AVFoundation - Reverse an AVAsset and output video file

早过忘川 submitted on 2019-12-03 04:16:14
Question: I've seen this question asked a few times, but none of the answers seem to work. The requirement is to reverse a video file and output it (not just play it in reverse), keeping the same compression, format, and frame rate as the source video. Ideally, the solution would do this all in memory or a buffer and avoid generating the frames into image files (for example, using AVAssetImageGenerator) and then recompiling them (resource intensive, unreliable timing results, changes in…