avfoundation

Create a CMSampleBuffer from a CVPixelBuffer

丶灬走出姿态 submitted on 2019-12-01 06:25:35
I get a CVPixelBuffer from ARSessionDelegate: func session(_ session: ARSession, didUpdate frame: ARFrame) { frame.capturedImage // CVPixelBufferRef } But another part of my app (that I can't change) uses a CMSampleBuffer. CMSampleBuffer is a container of CVPixelBuffer. In order to create a CMSampleBuffer I can use this function: func CMSampleBufferCreateReadyWithImageBuffer(_ allocator: CFAllocator?, _ imageBuffer: CVImageBuffer, _ formatDescription: CMVideoFormatDescription, _ sampleTiming: UnsafePointer<CMSampleTimingInfo>, _ sBufOut: UnsafeMutablePointer<CMSampleBuffer?>) -> OSStatus The
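The wrapping step the question is reaching for can be sketched like this, assuming the renamed Swift signatures from recent SDKs (older SDKs use the positional form quoted above) and using `frame.timestamp` for the presentation time:

```swift
import AVFoundation

// Sketch, not battle-tested: wrap a CVPixelBuffer in a CMSampleBuffer.
func makeSampleBuffer(from pixelBuffer: CVPixelBuffer,
                      timestamp: TimeInterval) -> CMSampleBuffer? {
    // 1. Describe the pixel buffer's format.
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: pixelBuffer,
        formatDescriptionOut: &formatDescription)
    guard let format = formatDescription else { return nil }

    // 2. Timing: only the presentation timestamp is known here.
    var timing = CMSampleTimingInfo(
        duration: .invalid,
        presentationTimeStamp: CMTime(seconds: timestamp,
                                      preferredTimescale: 1_000_000_000),
        decodeTimeStamp: .invalid)

    // 3. Create the ready-to-use sample buffer.
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: pixelBuffer,
        formatDescription: format,
        sampleTiming: &timing,
        sampleBufferOut: &sampleBuffer)
    return sampleBuffer
}
```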

iOS Frame by Frame video playback forward/backward

淺唱寂寞╮ submitted on 2019-12-01 06:20:35
Question: I'd like to show a video on an iOS device in slow motion. My view contains a video (~2 seconds long) and a slider. The user can move the slider and step (forwards and backwards) through the movie frame by frame. MPMoviePlayerController lacks the ability to step frame by frame. I read about AVAssetReader, but I have no concrete idea how to use it. I don't have a fixed frame rate, so it should take this information out of the metadata of the video. I really need to show every frame. Can
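One possible approach (a sketch, not taken from the question): drive `AVPlayerItem.step(byCount:)` from the slider, and read the frame rate from the track's `nominalFrameRate` instead of hard-coding it; `videoURL` is a placeholder:

```swift
import AVFoundation

// Sketch: frame-by-frame stepping with AVPlayerItem.
let item = AVPlayerItem(url: videoURL)
let player = AVPlayer(playerItem: item)

// Frame rate comes from the asset's metadata, not a hard-coded constant.
let fps = item.asset.tracks(withMediaType: .video).first?.nominalFrameRate ?? 30

func step(forward: Bool) {
    player.pause()                        // stepping requires a paused player
    if forward, item.canStepForward {
        item.step(byCount: 1)             // one frame forward
    } else if !forward, item.canStepBackward {
        item.step(byCount: -1)            // one frame backward
    }
}
```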

How do I call CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer?

生来就可爱ヽ(ⅴ<●) submitted on 2019-12-01 06:02:07
I'm trying to figure out how to call this AVFoundation function in Swift. I've spent a ton of time fiddling with declarations and syntax, and got this far. The compiler is mostly happy, but I'm left with one last quandary. public func captureOutput( captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection! ) { let samplesInBuffer = CMSampleBufferGetNumSamples(sampleBuffer) var audioBufferList: AudioBufferList var buffer: Unmanaged<CMBlockBuffer>? = nil CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
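With the renamed signatures in recent SDKs the call can look like the sketch below; on the older SDK the question targets, the arguments are positional and the block buffer comes back as `Unmanaged<CMBlockBuffer>` that must be released manually. This also assumes the modern delegate method name:

```swift
import AVFoundation

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    var audioBufferList = AudioBufferList()
    var blockBuffer: CMBlockBuffer?

    // Fills audioBufferList and retains the backing block buffer for us.
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        bufferListSizeNeededOut: nil,
        bufferListOut: &audioBufferList,
        bufferListSize: MemoryLayout<AudioBufferList>.size,
        blockBufferAllocator: nil,
        blockBufferMemoryAllocator: nil,
        flags: 0,
        blockBufferOut: &blockBuffer)

    // Walk the buffers via the UnsafeMutableAudioBufferListPointer helper.
    for buffer in UnsafeMutableAudioBufferListPointer(&audioBufferList) {
        _ = buffer.mData    // raw audio samples live here
    }
}
```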

didOutputSampleBuffer delegate not called

你说的曾经没有我的故事 submitted on 2019-12-01 05:56:48
The didOutputSampleBuffer function in my code was never called, and I don't know why. Here's the code: import UIKit import AVFoundation import Accelerate class ViewController: UIViewController { var captureSession: AVCaptureSession? var dataOutput: AVCaptureVideoDataOutput? var customPreviewLayer: AVCaptureVideoPreviewLayer? @IBOutlet weak var camView: UIView! override func viewWillAppear(animated: Bool) { super.viewDidAppear(animated) captureSession?.startRunning() //setupCameraSession() } override func viewDidLoad() { super.viewDidLoad() // Do any additional setup after loading the view,
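Two things are worth double-checking in a setup like this (a sketch with names assumed from the snippet, using modern API and assuming the view controller adopts `AVCaptureVideoDataOutputSampleBufferDelegate`): the session must actually be configured before `startRunning()` — note the `setupCameraSession()` call above is commented out — and the data output needs both a delegate and a dispatch queue:

```swift
func setupCameraSession() {
    let session = AVCaptureSession()
    session.sessionPreset = .medium

    guard let camera = AVCaptureDevice.default(for: .video),
          let input = try? AVCaptureDeviceInput(device: camera),
          session.canAddInput(input) else { return }
    session.addInput(input)

    let output = AVCaptureVideoDataOutput()
    // Without a delegate and queue, didOutputSampleBuffer never fires.
    output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoQueue"))
    if session.canAddOutput(output) { session.addOutput(output) }

    captureSession = session
    dataOutput = output
}
```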

Xcode - Add AVFoundation Framework

时光怂恿深爱的人放手 submitted on 2019-12-01 05:48:53
I'm creating an iPhone app using Xcode 4.2 and trying to use the AVFoundation framework to play a radio stream. When I import it into the project's frameworks and then build, I get the following warning: ld: warning: ignoring file /Users/xanthos/Documents/tabbartest/AVFoundation.framework/AVFoundation, file was built for unsupported file format which is not the architecture being linked (i386) and of course when using anything from the framework (e.g. AVAudioSession) I get errors like: Undefined symbols for architecture i386: "_OBJC_CLASS_$_AVAudioSession", referenced from: objc-class-ref in
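A warning like this usually means a device-only copy of the framework was dragged into the project instead of linking the one inside the SDK. The architectures inside a framework binary can be checked from the command line (the path here is illustrative):

```shell
# A Simulator build needs an i386 (or x86_64) slice; a framework copied
# from a device SDK contains only ARM slices and fails to link for i386.
lipo -info AVFoundation.framework/AVFoundation
```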

Error '!dat' trying to set the (null) audio devices' sample rate

守給你的承諾、 submitted on 2019-12-01 05:42:38
I am trying to play an audio clip (using AVAudioPlayer) and a video clip (using MPMoviePlayerController), both of which were working. I then checked the files into SVN and pulled them down on another Mac. Now when the app (an iPad app) tries to play either the audio or the video, both give the error: Error '!dat' trying to set the (null) audio devices' sample rate Figuring that SVN had corrupted the files (even though QuickLook on the Mac plays them fine), I replaced them with the versions from the Mac where they still work. However, I am still getting the error. All code is exactly the
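One thing worth ruling out (a guess, since the files themselves appear intact): explicitly activate the shared audio session before playback, because "(null) audio device" errors often point at the session rather than the media files:

```swift
import AVFoundation

// Sketch: activate the audio session before AVAudioPlayer /
// MPMoviePlayerController start playing.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(AVAudioSessionCategoryPlayback)
    try session.setActive(true)
} catch {
    print("Audio session setup failed: \(error)")
}
```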

AVMutableCompositionTrack - insertTimeRange - insertEmptyTimeRange issue

守給你的承諾、 submitted on 2019-12-01 05:38:36
I have a strange problem: I want to generate a new sound file out of two sound files and silence. sound1: 2 seconds long + silence: 2 seconds + sound2: 2 seconds long. When I try the code below, I get a 6-second sound file with all the parts, but in a different order! The order is: sound1, sound2, silence. I am not able to put the silence in the middle of the composition (nor at the beginning). Is this typical behavior, or am I doing something wrong? Here is the code for putting the segments together: [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [audio1
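A Swift sketch of the intended layout (the question's code is Objective-C; `audioTrack1` and `audioTrack2` are assumed to be the source `AVAssetTrack`s). Inserting in timeline order and giving the empty range its explicit 2–4 s position keeps the silence in the middle:

```swift
import AVFoundation

let composition = AVMutableComposition()
let track = composition.addMutableTrack(
    withMediaType: .audio,
    preferredTrackID: kCMPersistentTrackID_Invalid)!

let two = CMTime(seconds: 2, preferredTimescale: 600)
let clipRange = CMTimeRange(start: .zero, duration: two)

do {
    try track.insertTimeRange(clipRange, of: audioTrack1, at: .zero)   // 0–2 s
    track.insertEmptyTimeRange(
        CMTimeRange(start: two, duration: two))                        // 2–4 s silence
    try track.insertTimeRange(clipRange, of: audioTrack2,
        at: CMTime(seconds: 4, preferredTimescale: 600))               // 4–6 s
} catch {
    print("Composition failed: \(error)")
}
```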

AVCaptureSession addInput causing glitch in background audio

╄→尐↘猪︶ㄣ submitted on 2019-12-01 05:32:41
I'm making a video-capturing iOS app and I want to be able to record audio from the microphone while allowing background music to play. I can do all of this, but the background audio skips (pauses briefly) whenever the view with the camera enters or exits the foreground. I have isolated the bug to AVCaptureSession addInput: AVCaptureSession *session = [[AVCaptureSession alloc] init]; session.automaticallyConfiguresApplicationAudioSession = NO; AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio]; AVCaptureDeviceInput *audioDeviceInput =
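Since `automaticallyConfiguresApplicationAudioSession` is already NO, the app owns the audio session; the piece that's commonly missing is configuring that session for mixing before the audio input is added. A sketch (in Swift, unlike the Objective-C above):

```swift
import AVFoundation

// Sketch: configure the app's audio session for mixing before the
// capture session's audio input is added, so iOS does not interrupt
// the background music when capture starts.
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord,
                                 with: [.mixWithOthers])
    try audioSession.setActive(true)
} catch {
    print("Audio session configuration failed: \(error)")
}
```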
