AVFoundation

Extending an AVMutableComposition

荒凉一梦 submitted on 2019-12-12 13:29:22

Question: I would like to render some graphics at the end of my AVMutableComposition, like credits at the end of a movie. How can I create a blank asset that extends the composition's duration, giving me some blank space I can render to?

Answer 1: I found the answer. It lies in the insertEmptyTimeRange: method. An example:

    // comp is an AVMutableComposition
    float secondsToExtend = 5.0f;
    long long timescale = comp.duration.timescale;
    CMTime endTime = CMTimeMake(comp.duration.value - 1, timescale);
    CMTime
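The answer's snippet cuts off before the actual insertion, so here is a hedged Swift sketch of the same idea. `insertEmptyTimeRange(_:)` is the real AVMutableComposition API; the choice to start the empty range exactly at the composition's current duration is an assumption about how the original continued.

```swift
import AVFoundation

// Extend a mutable composition with a few seconds of empty time,
// leaving blank space at the end to render credits into.
func appendBlankSpace(to comp: AVMutableComposition, seconds: Double = 5.0) {
    let timescale = comp.duration.timescale
    let emptyDuration = CMTime(seconds: seconds, preferredTimescale: timescale)
    // Insert the empty range starting where the current content ends.
    let emptyRange = CMTimeRange(start: comp.duration, duration: emptyDuration)
    comp.insertEmptyTimeRange(emptyRange)
}
```

After this call, `comp.duration` is longer by the inserted amount, and a video compositor or Core Animation tool can draw into that tail segment.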

“CIImage initWithCVPixelBuffer:options:” failed because its pixel format p422 is not supported in iOS 10

喜欢而已 submitted on 2019-12-12 13:15:42

Question: In iOS 10, I want to grab video frames for capture, but I get the error "[CIImage initWithCVPixelBuffer:options:] failed because its pixel format p422 is not supported." My code is this:

    func previewImage(handle: (image: UIImage?) -> Void) {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), { () -> Void in
            dispatch_async(dispatch_get_main_queue(), { () -> Void in
                guard let time = self.player?.currentTime() else { return }
                guard let pixelBuffer = self.output?

Using Swift with AVPlayer, how do you add and remove a video via code?

不羁岁月 submitted on 2019-12-12 12:50:23

Question: I'm new to Swift and am trying to add a video to the view and then remove it when my "stopScreenSaver" notification is dispatched. All seems to work well except when I go to remove the video layer (playerLayer.removeFromSuperlayer()). Any guidance would be appreciated; I feel like I'm missing some basic concept for adding and removing the layer!

    import UIKit
    import AVFoundation
    import QuartzCore
    import CoreMedia

    class ViewController: UIViewController {
        let contentURL = NSBundle

Use avfoundation to capture image, but can not capture too quickly

邮差的信 submitted on 2019-12-12 12:28:36

Question: I use AVFoundation to capture images, but I cannot capture too quickly (I set the interval to 0.1 s). It says "NULL sample buffer". What is the problem? Thank you.

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
        completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
            CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer,
                kCGImagePropertyExifDictionary, NULL);
            if (exifAttachments) {
                // Do something with the attachments.
                //

How to change sample rate properly in Avfoundation

放肆的年华 submitted on 2019-12-12 11:21:15

Question: I have written this simple program. All it does is record and play back the buffers simultaneously. Everything works fine if the sample rate is 44100 Hz, but if I change the sample rate to 16000 or 8000, it doesn't produce any sound at all, or perhaps some white noise that isn't audible. Why is this happening? How can I record at a different sample rate? This is the code I tried:

    import UIKit
    import AVFoundation

    class ViewController: UIViewController {
        var engine = AVAudioEngine()

AVPlayerLayer as SCNMaterial not rendered, audio playing fine

╄→гoц情女王★ submitted on 2019-12-12 10:43:30

Question: I'm trying to use an AVPlayerLayer as an SCNMaterial to be assigned to an SCNSphere. Based on: https://developer.apple.com/library/mac/samplecode/SceneKit_Slides_WWDC2013/Listings/Scene_Kit_Session_WWDC_2013_Sources_Slides_ASCSlideMaterialLayer_m.html I create a player, create a player layer, and have tried with and without a background layer to assign as the material for my SCNSphere. The issue is, I get the same result reported here: SCNMaterialProperty not rendering layer. Audio plays, but the video is not rendered.

Swift - Stop avaudioplayer

情到浓时终转凉″ submitted on 2019-12-12 10:14:00

Question: I am trying to build a soundboard into an app and have figured out an efficient way of using tags to control playing the sounds. However, I am now trying to integrate a pause button using the .stop() method on the AVAudioPlayer, and I get an error with my current code: EXC_BAD_ACCESS. This is what I am using at the moment; any ideas?

    import UIKit
    import AVFoundation

    let soundFilenames = ["sound", "sound2", "sound3"]
    var audioPlayers = [AVAudioPlayer]()

    class

AVQueuePlayer playing several audio tracks in background iOS5

强颜欢笑 submitted on 2019-12-12 09:18:55

Question: I used AVQueuePlayer to play several items in the background, and the code worked perfectly on iOS 4. In iOS 5, AVQueuePlayer changed its behavior, so the player stops after the first item ends. Matt Gallagher wrote a hint in this post: "As of iOS 5, it appears that AVQueuePlayer no longer pre-buffers. It did pre-buffer the next track in iOS 4." So my question is: how do I play several items in the background using AVPlayer or AVQueuePlayer on iOS 5?

Answer 1: Matt Gallagher's answer in his blog: "You must

AVPlayer loading AVAsset from file that is appended simultaneously by external source (for macOS and iOS)

六月ゝ 毕业季﹏ submitted on 2019-12-12 08:23:44

Question: I have a question concerning the use of AVFoundation's AVPlayer (probably applicable to both iOS and macOS). I am trying to play back audio (uncompressed WAV) data that comes from a channel other than standard HTTP Live Streaming. The case: audio data packets arrive compressed in a channel along with other data the app needs to work with. For example, video and audio come in the same channel and are separated by a header. After filtering, I get the audio data and decompress it to a WAV

Playing stream with bad internet in AVPlayer

拈花ヽ惹草 submitted on 2019-12-12 08:06:12

Question: When AVPlayer plays an asset streamed over the network, it pauses when it reaches the end of the downloaded content. So my question is: how do I know that it stopped because of a bad network? And how do I resume playback after it downloads, say, the next 10 seconds of the asset?

Answer 1: You can add an observer for when the AVPlayer's buffer becomes empty:

    [[self.tracksPlayer currentItem] addObserver:self
                                      forKeyPath:@"playbackBufferEmpty"
                                         options:NSKeyValueObservingOptionNew
                                         context:nil];

And an observer so you can