AVFoundation

AVPlayer loading an AVAsset from a file that is simultaneously appended to by an external source (macOS and iOS)

大城市里の小女人 submitted on 2019-12-04 01:24:24
I have a question concerning the use of AVFoundation's AVPlayer (probably applicable to both iOS and macOS). I am trying to play back audio (uncompressed WAV) data that comes from a channel other than standard HTTP Live Streaming. The case: audio data packets arrive compressed in a channel along with other data the app needs to work with. For example, video and audio come in the same channel and get separated by a header. After filtering, I get the audio data and decompress it to WAV format (it does not contain headers at this stage). Once the data packets are ready (9600 bytes each for 24k,
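
A workable approach here (a sketch, not the asker's final code) is to skip AVPlayer entirely and hand each decompressed PCM packet to an AVAudioPlayerNode as it arrives. The 24 kHz, 16-bit mono format and the 9600-byte packet size are assumptions taken from the question's figures; the class and method names are placeholders.

```swift
import AVFoundation

final class PCMStreamPlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    // Standard (Float32, deinterleaved) format: mixer inputs reject Int16.
    private let format = AVAudioFormat(standardFormatWithSampleRate: 24_000, channels: 1)!

    init() throws {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try engine.start()
        player.play()
    }

    /// Call once per decoded packet (e.g. 9600 bytes of 16-bit samples).
    func enqueue(_ packet: Data) {
        let frameCount = AVAudioFrameCount(packet.count / MemoryLayout<Int16>.size)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { return }
        buffer.frameLength = frameCount
        let dst = buffer.floatChannelData![0]
        packet.withUnsafeBytes { raw in
            let src = raw.bindMemory(to: Int16.self)
            for i in 0..<Int(frameCount) {
                dst[i] = Float(src[i]) / Float(Int16.max)   // Int16 -> Float32
            }
        }
        // Scheduled buffers play back-to-back, so a steady supply of packets
        // gives gapless playback without any file-level trickery.
        player.scheduleBuffer(buffer, completionHandler: nil)
    }
}
```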

AVMutableComposition resizing issue

时间秒杀一切 submitted on 2019-12-04 00:57:53
I'm trying to render an image into a video captured with the front camera using AVMutableComposition. The size of the resulting video (including the image) is perfectly fine. However, the initial video gets resized, as shown in this picture: I'm using NextLevelSessionExporter, and this is my code snippet: // * MARK - Creating composition /// Create AVMutableComposition object. This object will hold the AVMutableCompositionTrack instances. let mainMutableComposition = AVMutableComposition() /// Creating an empty video track let videoTrack = mainMutableComposition.addMutableTrack
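
The usual culprit for this kind of squeeze is a renderSize that ignores the track's preferredTransform: front-camera portrait clips report a landscape naturalSize. A minimal sketch of the common fix, assuming a single video track (the `asset` parameter and 30 fps frame duration are placeholder choices):

```swift
import AVFoundation

func makeVideoComposition(for asset: AVAsset) -> AVMutableVideoComposition? {
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }

    // Apply preferredTransform first: a portrait clip's naturalSize is
    // reported in landscape, which is what causes the resize.
    let transformed = track.naturalSize.applying(track.preferredTransform)
    let renderSize = CGSize(width: abs(transformed.width),
                            height: abs(transformed.height))

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)

    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    layerInstruction.setTransform(track.preferredTransform, at: .zero)
    instruction.layerInstructions = [layerInstruction]

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = renderSize
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
    videoComposition.instructions = [instruction]
    return videoComposition
}
```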

AVAudioEngine crashes when I unplug headphones during a call

蓝咒 submitted on 2019-12-04 00:13:49
Here is what I see in the log:

16:33:20.236: Call is Dialing
16:33:21.088: AVAudioSessionInterruptionNotification
16:33:21.450: AVAudioSessionRouteChangeNotification
16:33:21.450: ....change reason CategoryChange
16:33:21.539: AVAudioEngineConfigurationChangeNotification
16:33:21.542: Starting Audio Engine
16:33:23.863: AVAudioSessionRouteChangeNotification
16:33:23.863: ....change reason OldDeviceUnavailable
16:33:23.860 ERROR: [0x100a70000] AVAudioIONodeImpl.mm:317: ___ZN13AVAudioIOUnit11GetHWFormatEjPj_block_invoke: required condition is false: hwFormat *** Terminating app due to uncaught
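
The common workaround for this crash is to stop the engine when the route changes and reconnect with the freshly queried hardware input format before restarting, since unplugging headphones during a call can change the sample rate out from under a running engine. A sketch, under the assumption that the engine graph is simply input node to main mixer:

```swift
import AVFoundation

/// Returns the observation token; keep it alive for as long as you observe.
func observeRouteChanges(for engine: AVAudioEngine) -> NSObjectProtocol {
    return NotificationCenter.default.addObserver(
        forName: AVAudioSession.routeChangeNotification,
        object: nil,
        queue: .main
    ) { note in
        guard let raw = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt,
              let reason = AVAudioSession.RouteChangeReason(rawValue: raw),
              reason == .oldDeviceUnavailable else { return }

        engine.stop()
        // The hardware format may have changed (e.g. a 44.1 kHz headset mic
        // back to the 48 kHz built-in mic), so reconnect before restarting.
        let newFormat = engine.inputNode.inputFormat(forBus: 0)
        engine.connect(engine.inputNode, to: engine.mainMixerNode, format: newFormat)
        try? engine.start()
    }
}
```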

How do I use AVAssetWriter?

五迷三道 submitted on 2019-12-03 22:49:56
I’d like to take some video frames and encode them into a video. It looks like that’s exactly what AVAssetWriter was meant for, but no matter how I eyeball the docs and Google, I can’t find any way to actually use it. From the docs it looks like I need an input (AVAssetWriterInput) to feed the writer from. Fine. But the AVAssetWriterInput class is abstract, and the only subclass that I know of in 4.1 is AVAssetWriterInputPixelBufferAdaptor, which requires an AVAssetWriterInput in its initializer…? Am I missing something obvious here? zoul: Ah yes, I have to acquire an instance using +
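
For reference, a minimal sketch of the pipeline the answer points at: AVAssetWriterInput is obtained from its initializer rather than subclassed, and the pixel-buffer adaptor wraps that input. The output URL, codec settings, and .mov container here are placeholder choices.

```swift
import AVFoundation

/// Build a writer for raw video frames.
func makeWriter(outputURL: URL, size: CGSize) throws
        -> (AVAssetWriter, AVAssetWriterInputPixelBufferAdaptor) {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

    // AVAssetWriterInput is created directly, not subclassed.
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: Int(size.width),
        AVVideoHeightKey: Int(size.height)
    ])
    // The adaptor *wraps* the input; it is not an AVAssetWriterInput itself.
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input, sourcePixelBufferAttributes: nil)

    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)
    // Per frame: adaptor.append(pixelBuffer, withPresentationTime: time),
    // then input.markAsFinished() and writer.finishWriting { ... } at the end.
    return (writer, adaptor)
}
```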

Split CMSampleBufferRef containing Audio

大兔子大兔子 submitted on 2019-12-03 22:25:03
Question: I'm splitting the recording into different files while recording... The problem is that the captureOutput video and audio sample buffers don't correspond 1:1 (which is logical): - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

AUDIO START: 36796.833236847 | DURATION: 0.02321995464852608 | END: 36796.856456802
VIDEO START: 36796.842089239 | DURATION: nan | END: nan
AUDIO START: 36796
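
One way to cut an audio buffer cleanly at the file boundary is Core Media's CMSampleBufferCopySampleBufferForRange, which copies a sub-range of samples into a new buffer. A sketch; the frame offset at which to split is assumed to be computed elsewhere from the presentation timestamps:

```swift
import CoreMedia

/// Split an audio CMSampleBuffer at a sample offset: the head finishes the
/// current file, the tail starts the next one.
func split(_ buffer: CMSampleBuffer, atSample offset: CMItemCount)
        -> (head: CMSampleBuffer?, tail: CMSampleBuffer?) {
    let total = CMSampleBufferGetNumSamples(buffer)
    guard offset > 0, offset < total else { return (buffer, nil) }

    var head: CMSampleBuffer?
    var tail: CMSampleBuffer?
    CMSampleBufferCopySampleBufferForRange(
        allocator: kCFAllocatorDefault, sampleBuffer: buffer,
        sampleRange: CFRange(location: 0, length: offset),
        sampleBufferOut: &head)
    CMSampleBufferCopySampleBufferForRange(
        allocator: kCFAllocatorDefault, sampleBuffer: buffer,
        sampleRange: CFRange(location: offset, length: total - offset),
        sampleBufferOut: &tail)
    return (head, tail)
}
```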

CALayer - Place sublayer below storyboard UIButtons?

ε祈祈猫儿з submitted on 2019-12-03 22:10:49
I've got a view controller in my storyboard with several UIButtons. One of them activates an AVFoundation camera preview layer shown in a sublayer: captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session]; captureVideoPreviewLayer.frame = self.view.bounds; [self.view.layer addSublayer:captureVideoPreviewLayer]; This works correctly except that the preview layer is rendered on top of my buttons, so even though the buttons are still clickable, the user can't see them. Is there an easy way to place the sublayer below the buttons? Or an
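
A sketch of the usual one-line fix: insert the preview layer at the back of the layer hierarchy instead of appending it on top. Shown here in Swift; the Objective-C equivalent is insertSublayer:atIndex: with index 0.

```swift
// Insert at index 0 so the storyboard buttons, which sit above it in the
// view's layer tree, render on top of the camera preview.
captureVideoPreviewLayer.frame = view.bounds
view.layer.insertSublayer(captureVideoPreviewLayer, at: 0)
```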

Generating a tone in iOS with 16-bit PCM, AudioEngine.connect() throws AUSetFormat: error -10868

这一生的挚爱 submitted on 2019-12-03 21:34:30
I have the following code for generating an audio tone of a given frequency and duration. It's loosely based on this answer for doing the same thing on Android (thanks: @Steve Pomeroy): https://stackoverflow.com/a/3731075/973364 import Foundation import CoreAudio import AVFoundation import Darwin class AudioUtil { class func play(frequency: Int, durationMs: Int) -> Void { let sampleRateHz: Double = 8000.0 let numberOfSamples = Int((Double(durationMs) / 1000 * sampleRateHz)) let factor: Double = 2 * M_PI / (sampleRateHz/Double(frequency)) // Generate an array of Doubles. var samples = [Double]
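
Error -10868 (kAudioUnitErr_FormatNotSupported) is typically triggered by connecting the player node with a 16-bit PCM format; the mixer input wants the standard Float32 format. A sketch of the fix, assuming the same 8 kHz mono setup as the question (class and method names are placeholders):

```swift
import AVFoundation

final class TonePlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    // Standard format is Float32 deinterleaved; connecting with
    // pcmFormatInt16 here is exactly what raises -10868.
    private let format = AVAudioFormat(standardFormatWithSampleRate: 8_000, channels: 1)!

    init() throws {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try engine.start()
        player.play()
    }

    func play(frequency: Double, durationMs: Int) {
        let sampleRate = format.sampleRate
        let frameCount = AVAudioFrameCount(Double(durationMs) / 1000 * sampleRate)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { return }
        buffer.frameLength = frameCount
        let samples = buffer.floatChannelData![0]
        let phaseStep = 2 * Double.pi * frequency / sampleRate
        for i in 0..<Int(frameCount) {
            samples[i] = Float(sin(Double(i) * phaseStep))  // sine wave in Float32
        }
        player.scheduleBuffer(buffer, completionHandler: nil)
    }
}
```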

Can't play audio recorded from voice using AVCaptureAudioDataOutputSampleBufferDelegate

空扰寡人 submitted on 2019-12-03 21:13:42
Question: I have been googling and researching for days, but I can't seem to get this to work and can't find any solution on the internet. I am trying to capture my voice using the microphone and then play it through the speakers. Here is my code: class ViewController: UIViewController, AVAudioRecorderDelegate, AVCaptureAudioDataOutputSampleBufferDelegate { var recordingSession: AVAudioSession! var audioRecorder: AVAudioRecorder! var captureSession: AVCaptureSession! var microphone:
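
A simpler route to mic-to-speaker monitoring is a sketch that swaps the question's AVCaptureSession approach for AVAudioEngine, which avoids converting sample buffers by hand: tap the input node and schedule the tapped buffers on a player node.

```swift
import AVFoundation

final class Monitor {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()

    func start() throws {
        // Same category as the question, routed to the loudspeaker.
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .default,
                                options: [.defaultToSpeaker])
        try session.setActive(true)

        let format = engine.inputNode.inputFormat(forBus: 0)
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)

        // Every captured buffer is immediately queued for playback.
        engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) {
            [weak self] buffer, _ in
            self?.player.scheduleBuffer(buffer, completionHandler: nil)
        }
        try engine.start()
        player.play()
    }
}
```

Expect audible feedback if the device is near the speaker; headphones or echo cancellation are the usual remedies.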

AVCaptureSession with multiple orientations issue

半城伤御伤魂 submitted on 2019-12-03 21:10:04
I am attempting to implement a barcode scanner. I have an AVCaptureSession that takes in video from an AVCaptureDevice. I want to support all orientations. With the following code, when I run the app, everything is fine in portrait orientation. However, in landscape orientation the view rotates but the video input does not, so I end up with a 90-degree-rotated video. When I implement the -(NSUInteger)supportedInterfaceOrientations method, everything gets locked to portrait. Can anyone tell me how I can fix this issue? - (void)viewDidLoad { [super viewDidLoad]; // Do any
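
A sketch of the common fix: rather than locking supportedInterfaceOrientations, update the preview layer connection's videoOrientation whenever the interface rotates. The helper below assumes you call it from your view controller's rotation callback (e.g. viewWillTransition(to:with:)) with the current interface orientation:

```swift
import AVFoundation
import UIKit

/// Keep the camera preview upright as the interface rotates.
func syncPreviewOrientation(_ previewLayer: AVCaptureVideoPreviewLayer,
                            to orientation: UIInterfaceOrientation) {
    guard let connection = previewLayer.connection,
          connection.isVideoOrientationSupported else { return }
    switch orientation {
    case .landscapeLeft:      connection.videoOrientation = .landscapeLeft
    case .landscapeRight:     connection.videoOrientation = .landscapeRight
    case .portraitUpsideDown: connection.videoOrientation = .portraitUpsideDown
    default:                  connection.videoOrientation = .portrait
    }
}
```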

How to detect max dB in Swift

徘徊边缘 submitted on 2019-12-03 21:08:53
I'm trying to detect dB on an iOS device; however, I am new to AVFoundation audio and can't really figure it out. I have come across this post: iOS - Detect Blow into Mic and convert the results! (swift), but it is not working for me. My current code is this: import Foundation import UIKit import AVFoundation import CoreAudio class ViewController: UIViewController { var recorder: AVAudioRecorder! var levelTimer = NSTimer() var lowPassResults: Double = 0.0 override func viewDidLoad() { super.viewDidLoad() //make an AudioSession, set it to PlayAndRecord and make it active var audioSession
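
A sketch of the standard metering route, assuming AVAudioRecorder: enable isMeteringEnabled and poll peakPower(forChannel:) on a timer. Values are in dBFS, so 0 is full scale and quieter sounds are negative; recording to /dev/null is a placeholder trick for when you only need levels, not a file.

```swift
import AVFoundation

final class LevelMeter {
    private var recorder: AVAudioRecorder!
    private var timer: Timer?

    func start() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .default)
        try session.setActive(true)

        let url = URL(fileURLWithPath: "/dev/null")   // discard the audio itself
        let settings: [String: Any] = [
            AVFormatIDKey: Int(kAudioFormatAppleLossless),
            AVSampleRateKey: 44_100.0,
            AVNumberOfChannelsKey: 1
        ]
        recorder = try AVAudioRecorder(url: url, settings: settings)
        recorder.isMeteringEnabled = true
        recorder.record()

        timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] _ in
            guard let recorder = self?.recorder else { return }
            recorder.updateMeters()
            let peak = recorder.peakPower(forChannel: 0)   // dBFS, at most 0
            print("peak: \(peak) dB")
        }
    }
}
```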