avfoundation

Swift: Error loading /Library/Audio/Plug-Ins/HAL/NRDAudioClient: no suitable image found

若如初见. Submitted on 2019-12-25 03:32:53

Question: I am trying to play a sound as follows: import AVFoundation let sound = URL(fileURLWithPath: Bundle.main.path(forResource: "audiofile", ofType: "wav")!) var audioPlayer = AVAudioPlayer() @IBAction func audio1(_ sender: Any) { do { audioPlayer = try AVAudioPlayer(contentsOf: sound) audioPlayer.play() } catch { // error } } When running in the iOS simulator this results in the following: 2019-01-08 12:29:55.438490+0800 Test App[8096:118590] Error loading /Library/Audio/Plug-Ins/HAL/NRDAudioClient
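A minimal sketch of the usual fix, assuming a bundled "audiofile.wav" and that the player is kept in a stored property so it is not deallocated before playback finishes (the HAL plug-in message itself is typically harmless simulator noise):

import AVFoundation

final class SoundPlayer {
    // Keep a strong reference; a locally scoped AVAudioPlayer is deallocated before it can play.
    private var audioPlayer: AVAudioPlayer?

    func playBundledSound() {
        guard let url = Bundle.main.url(forResource: "audiofile", withExtension: "wav") else {
            print("audiofile.wav not found in bundle")
            return
        }
        do {
            let player = try AVAudioPlayer(contentsOf: url)
            player.prepareToPlay()
            player.play()
            audioPlayer = player
        } catch {
            print("Could not create player: \(error)")
        }
    }
}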

AVFoundation isn’t reading a specific type of barcode

喜你入骨 Submitted on 2019-12-25 03:27:21

Question: I'm building an app for paying parking tickets. The problem is that we have a specific type of barcode that AVFoundation can't read. I tried to find other frameworks to replace AVFoundation, but it turns out all of them use AVFoundation underneath. The only framework I found capable of reading it is ZBar, but we're not planning to use it since it hasn't been updated for 6 years. Do you have any idea why it isn't working? Image link: https://i.stack.imgur.com/AEU0H.jpg Thank you.
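For context, a sketch of the standard AVFoundation barcode pipeline; whether the ticket's symbology can be decoded at all depends on it being listed in metadataObjectTypes (the camera/session wiring here is abridged and not the poster's code):

import AVFoundation

final class BarcodeScanner: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    let session = AVCaptureSession()

    func configure() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureMetadataOutput()
        if session.canAddOutput(output) { session.addOutput(output) }
        output.setMetadataObjectsDelegate(self, queue: .main)
        // Only types listed here can ever be decoded; symbologies missing from
        // output.availableMetadataObjectTypes are simply not supported by AVFoundation.
        output.metadataObjectTypes = [.code128, .code39, .ean13, .interleaved2of5, .itf14, .pdf417]
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        for case let barcode as AVMetadataMachineReadableCodeObject in metadataObjects {
            print(barcode.type, barcode.stringValue ?? "")
        }
    }
}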

Record video from front facing camera during ARKit ARSession on iPhone X

半腔热情 Submitted on 2019-12-25 01:34:55

Question: I'm using an ARSession combined with an ARFaceTrackingConfiguration to track my face. At the same time, I would like to record a video from the front-facing camera of my iPhone X. To do so I'm using AVCaptureSession, but as soon as I start recording, the ARSession gets interrupted. These are two snippets of code: // Face tracking let configuration = ARFaceTrackingConfiguration() configuration.isLightEstimationEnabled = false let session = ARSession() session.run(configuration, options: [
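The front camera can only be owned by one session at a time, so a common workaround (sketched below; this is not the poster's code) is to drop the separate AVCaptureSession and record the frames ARKit already delivers through its delegate:

import ARKit

final class FaceRecorder: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARFaceTrackingConfiguration()
        configuration.isLightEstimationEnabled = false
        session.delegate = self
        session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // frame.capturedImage is the front-camera feed as a CVPixelBuffer; it can be
        // appended to an AVAssetWriterInputPixelBufferAdaptor instead of running a
        // second, competing AVCaptureSession.
        let pixelBuffer: CVPixelBuffer = frame.capturedImage
        _ = pixelBuffer
    }
}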

Saving and accessing audio recordings

情到浓时终转凉″ Submitted on 2019-12-25 01:14:36

Question: OK, so I'm kind of new to Objective-C. I set up an audio recorder using AVAudioRecorder. It works great, but... I don't know how to access the saved files; heck, I don't even know if they get saved. The program is saving them to the documents directory (NSDocumentDirectory) under the name "sounds.caf", at least temporarily. I want to know if these are being permanently saved, and if not, how I can make it so. And then how can I access them after they are saved? Thanks for the help. Also, as a side note, I'd like to know if
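Files written to the app's Documents directory persist across launches until the app is deleted. A short Swift sketch (the question itself is Objective-C) that rebuilds the URL and checks whether the recorder's "sounds.caf" is still there:

import Foundation

// Files in the Documents directory persist until the app is removed.
let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
let recordingURL = documents.appendingPathComponent("sounds.caf")

if FileManager.default.fileExists(atPath: recordingURL.path) {
    print("Recording is saved at \(recordingURL)")
} else {
    print("No recording found")
}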

AVAudioSession ducking stops when AVAssetWriter.startWriting is called

偶尔善良 Submitted on 2019-12-25 00:51:02

Question: When I call startWriting for a video capture session, the ducking for my audio session stops, and the external audio goes back to full volume. How can I prevent this from happening? Audio session setup: try! AVAudioSession.sharedInstance().setCategory(.playback, mode: .spokenAudio, options: [.duckOthers, .interruptSpokenAudioAndMixWithOthers]) try! AVAudioSession.sharedInstance().setActive(true) // ducking starts here, as expected Video session setup: let videoSettings: [String: Any] =
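One workaround worth sketching (a common mitigation, not a documented guarantee) is to re-assert the spoken-audio session after the writer has started, so the duckOthers option is applied again:

import AVFoundation

func reassertDucking() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback,
                                mode: .spokenAudio,
                                options: [.duckOthers, .interruptSpokenAudioAndMixWithOthers])
        // Re-activating after startWriting() asks the system to apply ducking again.
        try session.setActive(true)
    } catch {
        print("Audio session error: \(error)")
    }
}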

Audio output is not working after the screen recording using AVCapture session

杀马特。学长 韩版系。学妹 Submitted on 2019-12-24 22:09:56

Question: I am using AVCaptureSession to capture the macOS screen. When I connect my external headphones as the audio input for recording, the audio device is not released after the recording finishes. I stop the session completely after the recording; still, I can't play any audio on the system. When I start recording again, audio output starts playing. Please find the code below; I am using the same code as in the Aperture library. import AVFoundation enum ApertureError: Error { case
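A sketch of the usual teardown, assuming the device stays held because inputs are still attached to the stopped session; removing them explicitly hands the audio device back to the system:

import AVFoundation

func tearDown(session: AVCaptureSession) {
    session.stopRunning()
    // Removing inputs/outputs releases the underlying devices (e.g. the headset mic)
    // so system audio is no longer held by the capture session.
    session.beginConfiguration()
    session.inputs.forEach { session.removeInput($0) }
    session.outputs.forEach { session.removeOutput($0) }
    session.commitConfiguration()
}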

AVAssetExportSession AVFoundationErrorDomain Code -11800 "The operation could not be completed", NSOSStatusErrorDomain Code=-12780 "(null)" in Swift iOS

老子叫甜甜 Submitted on 2019-12-24 21:09:59

Question: I am developing a video-based application in Swift, in which I export a video clip with a watermark logo and a fade in/out effect. Here is my code: func watermark(video videoAsset:AVAsset, videoModal:VideoModel, watermarkText text : String!, imageName name : String!, saveToLibrary flag : Bool, watermarkPosition position : PDWatermarkPosition, withMode mode: SpeedoVideoMode, completion : ((_ status : AVAssetExportSessionStatus?, _ session: AVAssetExportSession?, _ outputURL : URL?) -> ())?) {
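Code -11800 is a generic "unknown error"; below is a minimal export sketch showing the checks that most often resolve it (a supported preset/file type and no stale file at the output URL). It is a generic example, not the poster's watermark pipeline:

import AVFoundation

func export(asset: AVAsset, to outputURL: URL, completion: @escaping (Error?) -> Void) {
    // A file already sitting at the output URL is a common cause of -11800 / -12780.
    try? FileManager.default.removeItem(at: outputURL)

    guard let exporter = AVAssetExportSession(asset: asset,
                                              presetName: AVAssetExportPresetHighestQuality) else {
        completion(nil)
        return
    }
    exporter.outputURL = outputURL
    exporter.outputFileType = .mp4
    exporter.exportAsynchronously {
        // exporter.error carries the underlying NSOSStatusErrorDomain code on failure.
        completion(exporter.error)
    }
}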

iPhone 5 front camera - tap to focus?

梦想与她 Submitted on 2019-12-24 16:32:00

Question: Using the AVFoundation framework, I have tap to focus using this code: - (void) autoFocusAtPoint:(CGPoint)point{ NSArray *devices = [AVCaptureDevice devices]; for (AVCaptureDevice *device in devices) { [device unlockForConfiguration]; if ([device hasMediaType:AVMediaTypeVideo]) { if([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) { if([device lockForConfiguration:nil]) { [device setFocusPointOfInterest:point]; [device setFocusMode
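A Swift sketch of the same idea (the question's code is Objective-C); the key point is that front cameras, including the iPhone 5's, typically report isFocusPointOfInterestSupported as false, so the configuration block is skipped for them:

import AVFoundation
import CoreGraphics

func focus(at point: CGPoint, on device: AVCaptureDevice) {
    // Front cameras usually return false here, so tap-to-focus silently does nothing for them.
    guard device.isFocusPointOfInterestSupported,
          device.isFocusModeSupported(.autoFocus) else { return }
    do {
        try device.lockForConfiguration()
        device.focusPointOfInterest = point
        device.focusMode = .autoFocus
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}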

Audio equivalent of SPS and PPS when muxing Annex B MPEG-TS? What is “DecoderInfo”?

亡梦爱人 Submitted on 2019-12-24 15:33:04

Question: I'm using the Bento4 library to mux an Annex B TS (MPEG-2 transport stream) file with my H.264 video and AAC audio streams, which are being generated from VideoToolbox and AVFoundation respectively, as source data for an HLS (HTTP Live Streaming) stream. This question is not necessarily Bento4-specific: I'm trying to understand the underlying concepts so that I can accomplish the task, preferably by using Apple libraries. So far, I've figured out how to create an AP4_AvcSampleDescription by getting
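For AAC, the rough counterpart of SPS/PPS is the codec configuration ("decoder specific info" / AudioSpecificConfig), which CoreMedia exposes as the format description's magic cookie. A sketch of pulling those bytes from a compressed sample buffer; how the cookie maps onto Bento4's DecoderInfo is an assumption left to the reader:

import CoreMedia
import Foundation

// Returns the codec configuration bytes for an AAC sample buffer, if the encoder attached one.
func decoderSpecificInfo(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let format = CMSampleBufferGetFormatDescription(sampleBuffer) else { return nil }
    var cookieSize = 0
    guard let cookie = CMAudioFormatDescriptionGetMagicCookie(format, sizeOut: &cookieSize),
          cookieSize > 0 else { return nil }
    return Data(bytes: cookie, count: cookieSize)
}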

Video Buffer Output with Swift

此生再无相见时 Submitted on 2019-12-24 13:12:50

Question: My goal is to take the video buffer and ultimately convert it to NSData, but I do not understand how to access the buffer properly. I have the captureOutput function, but I have not been successful in converting the buffer, and I'm not sure I am actually collecting anything in it. This is all Swift code; I have found some examples using Objective-C, but I am not able to understand the Obj-C code well enough to figure it out. var captureDevice : AVCaptureDevice? var
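A sketch of the delegate callback that actually receives each frame, assuming an AVCaptureVideoDataOutput feeding video sample buffers; the pixel buffer must be locked before its bytes are copied into Data:

import AVFoundation

final class BufferReceiver: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
        let length = CVPixelBufferGetDataSize(pixelBuffer)
        // Copies the raw pixel bytes; a multi-planar format would need each plane copied separately.
        let data = Data(bytes: base, count: length)
        print("Captured \(data.count) bytes")
    }
}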