AVFoundation

Generating thumbnails from videos created using AVAssetWriter

隐身守侯 submitted on 2020-01-14 10:13:07
Question: I'm recording video and audio using AVAssetWriter and the captureOutput callback. The problem is that in about 5% of cases I cannot generate a thumbnail from second zero (if I try a point further into the video, there is no problem). The error I'm getting is: Error Domain=AVFoundationErrorDomain Code=-11832 "Cannot Open" UserInfo=0x4b31b0 {NSLocalizedFailureReason=This media cannot be used., NSUnderlyingError=0x4effb0 "The operation couldn’t be completed. (OSStatus error -12431.)", …
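
A workaround that often helps with this class of failure is to let AVAssetImageGenerator snap to the nearest decodable frame instead of demanding an exact frame at time zero. A minimal Swift sketch, where the function name and the 1/30 s request time are illustrative:

    import AVFoundation
    import UIKit

    func thumbnail(for url: URL) -> UIImage? {
        let asset = AVAsset(url: url)
        let generator = AVAssetImageGenerator(asset: asset)
        generator.appliesPreferredTrackTransform = true
        // Let the generator pick the nearest frame it can actually decode,
        // rather than failing hard on the very first sample.
        generator.requestedTimeToleranceBefore = .positiveInfinity
        generator.requestedTimeToleranceAfter = .positiveInfinity
        let time = CMTime(value: 1, timescale: 30) // just past second zero
        do {
            let cgImage = try generator.copyCGImage(at: time, actualTime: nil)
            return UIImage(cgImage: cgImage)
        } catch {
            print("Thumbnail failed: \(error)")
            return nil
        }
    }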

Converting a Swift UnsafePointer<AudioStreamBasicDescription> to a Dictionary?

半世苍凉 submitted on 2020-01-14 05:34:11
Question: I want to know how to create a dictionary of [String: AnyObject] from an UnsafePointer<AudioStreamBasicDescription>. I guess I don't understand how to work with an UnsafePointer<T> in Swift. Here's where I'm starting from: the AVAudioFile class has a fileFormat property, which is of type AVAudioFormat. AVAudioFormat has a streamDescription property, which returns an UnsafePointer<AudioStreamBasicDescription> as a read-only property. I'd like to see what the values are in this struct and …
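
For reference, an UnsafePointer's value is reached through its pointee property; from there the struct's fields can be copied into a dictionary by hand. A sketch, with the function name and choice of keys being mine:

    import AVFoundation

    func dictionary(from format: AVAudioFormat) -> [String: AnyObject] {
        // pointee dereferences the UnsafePointer and yields the struct by value.
        let asbd = format.streamDescription.pointee
        // Numeric fields bridge to NSNumber via `as AnyObject`.
        return [
            "mSampleRate": asbd.mSampleRate as AnyObject,
            "mFormatID": asbd.mFormatID as AnyObject,
            "mFormatFlags": asbd.mFormatFlags as AnyObject,
            "mBytesPerPacket": asbd.mBytesPerPacket as AnyObject,
            "mFramesPerPacket": asbd.mFramesPerPacket as AnyObject,
            "mBytesPerFrame": asbd.mBytesPerFrame as AnyObject,
            "mChannelsPerFrame": asbd.mChannelsPerFrame as AnyObject,
            "mBitsPerChannel": asbd.mBitsPerChannel as AnyObject,
        ]
    }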

How to play multiple sounds from buffer simultaneously using nodes connected to AVAudioEngine's mixer

若如初见. submitted on 2020-01-14 04:28:07
Question: I am making a basic music app for iOS, where pressing notes causes the corresponding sound to play. I am trying to get multiple sounds stored in buffers to play simultaneously with minimal latency, but I can only get one sound to play at any time. I initially set up my sounds using multiple AVAudioPlayer objects, assigning a sound to each player. While that did play multiple sounds simultaneously, it didn't seem capable of starting two sounds at exactly the same time (it seemed like it …
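
The usual pattern for this is one AVAudioPlayerNode per sound, all attached to a single AVAudioEngine and connected to its main mixer, so each buffer can be triggered independently and overlap freely. A sketch along those lines; the class shape and names are illustrative:

    import AVFoundation

    final class NotePlayer {
        private let engine = AVAudioEngine()
        private var players: [AVAudioPlayerNode] = []
        private var buffers: [AVAudioPCMBuffer] = []

        init(noteURLs: [URL]) throws {
            for url in noteURLs {
                let file = try AVAudioFile(forReading: url)
                guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                                    frameCapacity: AVAudioFrameCount(file.length))
                else { continue }
                try file.read(into: buffer)

                let player = AVAudioPlayerNode()
                engine.attach(player)
                engine.connect(player, to: engine.mainMixerNode, format: buffer.format)
                players.append(player)
                buffers.append(buffer)
            }
            try engine.start()
            // Keep every node running so a scheduled buffer starts immediately.
            players.forEach { $0.play() }
        }

        // Trigger a note; buffers on different nodes overlap freely.
        func playNote(at index: Int) {
            guard players.indices.contains(index) else { return }
            players[index].scheduleBuffer(buffers[index], at: nil, options: .interrupts)
        }
    }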

How to instantly see new video layer after letting go of button?

元气小坏坏 submitted on 2020-01-13 11:29:10
Question: I have an app where a user can hold a button to take a video. However, when they do so, the new layer with the video playback does not appear instantly. Instead, there is a very short delay during which you can still see the live camera feed after the user has let go of the button. When the delay finishes, the video instantly shows up and starts playing. How can I instead make the first frame of the video appear before it's ready to play, so that it's there just for a …
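
One way to hide the gap is to cover the preview with a still of the clip's first frame immediately, then remove it once AVPlayerLayer reports isReadyForDisplay. A sketch under that assumption; previewContainer stands in for whatever view hosts the playback layer:

    import AVFoundation
    import UIKit

    func showPlayback(of url: URL, in previewContainer: UIView) {
        let asset = AVAsset(url: url)

        // Show the clip's first frame right away, on top of the camera preview.
        let generator = AVAssetImageGenerator(asset: asset)
        generator.appliesPreferredTrackTransform = true
        let firstFrame = UIImageView(frame: previewContainer.bounds)
        if let cg = try? generator.copyCGImage(at: .zero, actualTime: nil) {
            firstFrame.image = UIImage(cgImage: cg)
        }
        previewContainer.addSubview(firstFrame)

        // Start the real playback underneath and swap once it can render.
        let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = previewContainer.bounds
        previewContainer.layer.insertSublayer(playerLayer, below: firstFrame.layer)

        var observation: NSKeyValueObservation?
        observation = playerLayer.observe(\.isReadyForDisplay, options: [.new]) { layer, _ in
            if layer.isReadyForDisplay {
                firstFrame.removeFromSuperview()
                player.play()
                observation = nil   // one-shot: stop observing after the swap
            }
        }
    }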

BGRA on iPhone glTexImage2D and glReadPixels

别说谁变了你拦得住时间么 submitted on 2020-01-13 07:04:06
Question: Looking at the docs, I should be able to use BGRA for the internal format of a texture. I am supplying the texture with BGRA data (using GL_RGBA8_OES for glRenderbufferStorage, since BGRA doesn't seem to be allowed there). However, the following does not work: glTexImage2D(GL_TEXTURE_2D, 0, GL_BGRA, w, h, 0, GL_BGRA, GL_UNSIGNED_BYTE, buffer); ... glReadPixels(0, 0, w, h, GL_BGRA, GL_UNSIGNED_BYTE, buffer); While this gives me a black frame: glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL …
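
For what it's worth, OpenGL ES only accepts GL_RGBA as the internalformat argument here; BGRA is valid only as the external format (via Apple's BGRA8888 extension), and glReadPixels is only guaranteed to support GL_RGBA with GL_UNSIGNED_BYTE. A Swift sketch of the working combination, with placeholder dimensions and GL_BGRA_EXT spelled out explicitly in case the extension header isn't visible:

    import OpenGLES

    // Value from the Apple/EXT BGRA8888 extension headers.
    let GL_BGRA_EXT = GLenum(0x80E1)
    let w: GLsizei = 256, h: GLsizei = 256
    var buffer = [UInt8](repeating: 0, count: Int(w) * Int(h) * 4)

    // Assumes a current GL ES context and a bound texture.
    glTexImage2D(GLenum(GL_TEXTURE_2D), 0,
                 GL_RGBA,                     // internal format: must be RGBA
                 w, h, 0,
                 GL_BGRA_EXT,                 // external format: BGRA is fine here
                 GLenum(GL_UNSIGNED_BYTE), &buffer)

    glReadPixels(0, 0, w, h,
                 GLenum(GL_RGBA),             // BGRA readback is not guaranteed
                 GLenum(GL_UNSIGNED_BYTE), &buffer)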

Switching Camera with a button in Swift

假装没事ソ submitted on 2020-01-13 03:51:28
Question: This seems to work for switching the camera from the back to the front, but I'm trying to come up with an if statement so that I can switch it back too. Any ideas or advice? @IBAction func didTouchSwitchButton(sender: UIButton) { let camera = getDevice(.Front) let cameraBack = getDevice(.Back) do { input = try AVCaptureDeviceInput(device: camera) } catch let error as NSError { print(error) input = nil } if(captureSession?.canAddInput(input) == true){ captureSession?.addInput(input) …
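
An if statement keyed off the current input's device position is enough to toggle in both directions; the other essential step is removing the old input before adding the new one, since a session won't hold both cameras at once. A sketch in current Swift syntax, with a stand-in for the question's getDevice helper:

    import AVFoundation
    import UIKit

    final class CameraViewController: UIViewController {
        var captureSession: AVCaptureSession?
        var input: AVCaptureDeviceInput?

        // Stand-in for the question's helper.
        func getDevice(_ position: AVCaptureDevice.Position) -> AVCaptureDevice? {
            AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: position)
        }

        @IBAction func didTouchSwitchButton(sender: UIButton) {
            guard let session = captureSession, let current = input else { return }
            // Pick the opposite of whatever camera currently feeds the session.
            let newPosition: AVCaptureDevice.Position =
                (current.device.position == .back) ? .front : .back
            guard let newDevice = getDevice(newPosition),
                  let newInput = try? AVCaptureDeviceInput(device: newDevice) else { return }

            session.beginConfiguration()
            session.removeInput(current)   // a session can't hold both cameras
            if session.canAddInput(newInput) {
                session.addInput(newInput)
                input = newInput
            }
            session.commitConfiguration()
        }
    }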

AVCaptureOutput didOutputSampleBuffer stops getting called

梦想与她 submitted on 2020-01-13 03:12:49
Question: I have an issue with the delegate method didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection of AVCaptureOutput. It stops getting called within a second or two when I add the sampleBuffer to a CFArray. If I remove the CFArray code, the delegate method continues to get called, so I have no idea why the CFArray code causes it to stop. I'd appreciate any help. @property CFMutableArrayRef sampleBufferArray; - (void)captureOutput: …
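
The likely culprit: AVCaptureSession vends sample buffers from a small fixed pool, and keeping them alive in the CFArray starves that pool, at which point delivery stops (the AVCaptureVideoDataOutput documentation warns about holding onto these buffers). The fix is to copy out what you need and let each buffer return to the pool. A Swift sketch, with the class name and the CGImage copy as illustrative choices:

    import AVFoundation
    import CoreImage

    final class FrameCollector: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        private let context = CIContext()
        private(set) var frames: [CGImage] = []

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
            // createCGImage copies the pixel data, so nothing keeps a reference
            // to the pooled buffer after this method returns.
            if let copy = context.createCGImage(ciImage, from: ciImage.extent) {
                frames.append(copy)
            }
        }
    }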

CMBlockBufferCreate memory management

别说谁变了你拦得住时间么 submitted on 2020-01-12 17:35:28
Question: I have some code that creates CMBlockBuffers and then creates a CMSampleBuffer and passes it to an AVAssetWriterInput. What's the deal with memory management here? According to the Apple documentation, anything you create with 'Create' in the name should be released with CFRelease. However, if I use CFRelease, my app aborts with 'malloc: *** error for object 0xblahblah: pointer being freed was not allocated'. CMBlockBufferRef tmp_bbuf = NULL; CMBlockBufferRef bbuf = NULL; CMSampleBufferRef sbuf = …
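
The Create rule does apply here: CMSampleBufferCreate retains the block buffer it is given, so releasing your own reference afterwards is normally correct, and the malloc abort usually points at a release of memory Core Media never owned, or a second release elsewhere. Worth noting: from Swift these objects are managed automatically and CFRelease must not be called at all. A hedged Swift sketch of the same pipeline, with the function shape and parameters being mine:

    import CoreMedia

    func makeSampleBuffer(data: Data, formatDesc: CMFormatDescription,
                          timing: CMSampleTimingInfo) -> CMSampleBuffer? {
        var blockBuffer: CMBlockBuffer?
        var status = CMBlockBufferCreateWithMemoryBlock(
            allocator: kCFAllocatorDefault,
            memoryBlock: nil,                // let Core Media allocate and own it
            blockLength: data.count,
            blockAllocator: kCFAllocatorDefault,
            customBlockSource: nil,
            offsetToData: 0,
            dataLength: data.count,
            flags: 0,
            blockBufferOut: &blockBuffer)
        guard status == kCMBlockBufferNoErr, let bbuf = blockBuffer else { return nil }

        status = data.withUnsafeBytes {
            CMBlockBufferReplaceDataBytes(with: $0.baseAddress!,
                                          blockBuffer: bbuf,
                                          offsetIntoDestination: 0,
                                          dataLength: data.count)
        }
        guard status == kCMBlockBufferNoErr else { return nil }

        var sampleBuffer: CMSampleBuffer?
        var timingInfo = timing
        var sampleSize = data.count
        status = CMSampleBufferCreate(
            allocator: kCFAllocatorDefault,
            dataBuffer: bbuf,                // the sample buffer retains bbuf
            dataReady: true,
            makeDataReadyCallback: nil,
            refcon: nil,
            formatDescription: formatDesc,
            sampleCount: 1,
            sampleTimingEntryCount: 1,
            sampleTimingArray: &timingInfo,
            sampleSizeEntryCount: 1,
            sampleSizeArray: &sampleSize,
            sampleBufferOut: &sampleBuffer)
        // No CFRelease anywhere: Swift releases bbuf and sampleBuffer automatically.
        return status == noErr ? sampleBuffer : nil
    }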

Accessing multiple audio hardware outputs/channels using AVFoundation and Swift

喜欢而已 submitted on 2020-01-12 11:12:39
Question: How do I access additional audio hardware outputs other than 1-2 using AVFoundation? I'm writing Swift code for a macOS app which plays mp3 files through various output devices (USB interface, Dante, Soundflower), and it looks like the following: myPlayer = AVPlayer(URL: myFilePathURL) myPlayer.audioOutputDeviceUniqueID = myAudioOutputDevices[1].deviceUID() myPlayer.play() But I'm not sure how to play the audio file to channels other than just 1-2. For instance, I'd like to play an mp3 to …
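
AVPlayer exposes device selection but not channel routing, so reaching outputs beyond 1-2 needs a different path. One approach is AVAudioEngine with a channel map set on its output unit; the sketch below, under that assumption, routes a stereo file to hardware channels 3 and 4 (in real code the engine and player would live in properties so playback survives the call):

    import AVFoundation
    import AudioToolbox

    func playOnChannels3and4(fileURL: URL) throws {
        let engine = AVAudioEngine()
        let player = AVAudioPlayerNode()
        let file = try AVAudioFile(forReading: fileURL)

        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)

        // One entry per hardware output channel; each value names the engine
        // output channel feeding it (-1 = silence). Here: ch 1/2 silent,
        // ch 3 <- left (0), ch 4 <- right (1). To target a specific device,
        // kAudioOutputUnitProperty_CurrentDevice can also be set on this unit.
        if let outputUnit = engine.outputNode.audioUnit {
            var channelMap: [Int32] = [-1, -1, 0, 1]
            AudioUnitSetProperty(outputUnit,
                                 kAudioOutputUnitProperty_ChannelMap,
                                 kAudioUnitScope_Output, 0,
                                 &channelMap,
                                 UInt32(MemoryLayout<Int32>.stride * channelMap.count))
        }

        try engine.start()
        player.scheduleFile(file, at: nil, completionHandler: nil)
        player.play()
    }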