avfoundation

iOS Background audio recording

Submitted by 别来无恙 on 2019-12-04 06:04:54
I know that if I start an audio recording session in the foreground, with Audio, AirPlay, and Picture in Picture enabled under Capabilities -> Background Modes, I can continue recording even in the background, but only if I start the recording session in the foreground and then move to the background. My problem is that I want to start the voice recording session from the background, which might seem shady and not what Apple wants, but the use case is this: I have a Bluetooth LE device with buttons and an iOS app. The two are paired (the Bluetooth LE device and the iPhone running the iOS app) and the
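A minimal sketch of what the BLE-triggered path could look like, assuming both the "audio" and "bluetooth-central" background modes are enabled and this hypothetical handler is called from a CoreBluetooth delegate callback (e.g. `peripheral(_:didUpdateValueFor:error:)`) when the button notification arrives:

```swift
import AVFoundation

// Hypothetical handler invoked from a CoreBluetooth callback while the
// app is in the background. Return the recorder so the caller can keep
// a strong reference; a local recorder would be deallocated immediately.
func startRecordingFromBackground() -> AVAudioRecorder? {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.record, mode: .default)
        try session.setActive(true)

        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("capture.m4a")
        let settings: [String: Any] = [
            AVFormatIDKey: kAudioFormatMPEG4AAC,
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 1
        ]
        let recorder = try AVAudioRecorder(url: url, settings: settings)
        recorder.record()
        return recorder
    } catch {
        print("Could not start background recording: \(error)")
        return nil
    }
}
```

Whether iOS actually allows activating the audio session from the background in this situation is exactly the open question here; this only shows the mechanics, not a guarantee that Apple permits it.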

Paste String Data from UIPastboard

Submitted by 牧云@^-^@ on 2019-12-04 05:54:58
Question: I made an iOS app that uses OCR to scan barcodes from a receipt and copies the data to UserDefaults.standard.value(forKey: "barcodeId") as! String. I'm trying to auto-populate a form with the scanned data. I've managed to pass the data to the element ID, but the data scanned using OCR doesn't populate the correct value. Below is the information from the console. NOTE: I've obscured the URL, since it's private. Optional(8.906010283900207e+17) is what I need to pass to the element ID on the web-based form
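A likely cause of the `Optional(8.906010283900207e+17)` output, as a hedged guess: the barcode was stored or read as a numeric type, so a long digit string gets rendered in scientific notation. Storing and retrieving it as a `String` avoids that, and `UIPasteboard` then round-trips it intact (the "barcodeId" key is taken from the question):

```swift
import UIKit

// Store the scanned barcode as a String, not a number, so long digit
// sequences are never collapsed into scientific notation.
UserDefaults.standard.set("890601028390020700", forKey: "barcodeId")

// Read it back with string(forKey:) instead of value(forKey:) as! String,
// then hand it to the pasteboard for the web form.
if let barcodeId = UserDefaults.standard.string(forKey: "barcodeId") {
    UIPasteboard.general.string = barcodeId
    // UIPasteboard.general.string now returns the exact digit string.
}
```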

CVPixelBufferLockBaseAddress why? Capture still image using AVFoundation

Submitted by 隐身守侯 on 2019-12-04 05:27:38
I'm writing an iPhone app that creates still images from the camera using AVFoundation. Reading the programming guide, I found code that does almost what I need, so I'm trying to "reverse engineer" and understand it. I'm having some difficulty understanding the part that converts a CMSampleBuffer into an image. Here is what I understood, followed by the code. The CMSampleBuffer represents a buffer in memory where the image is stored along with additional data. Then I call the function CMSampleBufferGetImageBuffer() to get back a CVImageBuffer with just the image data. Now there
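For reference, the pattern from Apple's guide looks roughly like this in Swift. `CVPixelBufferLockBaseAddress` is the answer to the "why?" in the title: it pins the pixel memory so the CPU can safely read it while a bitmap context is wrapped around it, and the matching unlock releases that guarantee. This sketch assumes the output is configured for `kCVPixelFormatType_32BGRA`:

```swift
import AVFoundation
import CoreVideo
import UIKit

// Convert a CMSampleBuffer from the camera into a UIImage.
func image(from sampleBuffer: CMSampleBuffer) -> UIImage? {
    // Step 1: extract the CVImageBuffer (pixel data only).
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return nil
    }
    // Step 2: lock the base address so reading the raw memory is safe.
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)

    // Step 3: wrap the raw BGRA bytes in a CGBitmapContext and snapshot it.
    let bitmapInfo = CGImageAlphaInfo.premultipliedFirst.rawValue
        | CGBitmapInfo.byteOrder32Little.rawValue
    guard let context = CGContext(data: base, width: width, height: height,
                                  bitsPerComponent: 8, bytesPerRow: bytesPerRow,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: bitmapInfo),
          let cgImage = context.makeImage() else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```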

CVPixelBuffer to CIImage always returning nil

Submitted by 我们两清 on 2019-12-04 05:15:23
I am trying to convert a pixel buffer extracted from AVPlayerItemVideoOutput to a CIImage, but I always get nil. The code: if([videoOutput_ hasNewPixelBufferForItemTime:player_.internalPlayer.currentItem.currentTime]) { CVPixelBufferRef pixelBuffer = [videoOutput_ copyPixelBufferForItemTime:player_.internalPlayer.currentItem.currentTime itemTimeForDisplay:nil]; CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer]; // Always image === nil CIFilter *filter = [FilterCollection filterSepiaForImage:image]; image = filter.outputImage; CIContext *context = [CIContext contextWithOptions:nil];
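A hedged guess at the failure mode: `copyPixelBufferForItemTime:` can return NULL even after `hasNewPixelBufferForItemTime:` succeeds (for example if the `AVPlayerItemVideoOutput` was never added to the player item, or the time was re-evaluated between the two calls), and wrapping NULL yields a nil CIImage. Guarding both steps makes the nil visible at its source. Also note `copy...` follows the Copy rule, so the buffer must be released (automatic in Swift, `CVBufferRelease` in Objective-C):

```swift
import AVFoundation
import CoreImage

// Fetch the current frame as a CIImage, or nil if no buffer is available.
func currentFrame(from output: AVPlayerItemVideoOutput,
                  at time: CMTime) -> CIImage? {
    guard output.hasNewPixelBuffer(forItemTime: time),
          let pixelBuffer = output.copyPixelBuffer(forItemTime: time,
                                                   itemTimeForDisplay: nil)
    else {
        // Either no new frame, or the output is not attached to the item
        // (playerItem.add(output) must have been called earlier).
        return nil
    }
    // The returned CVPixelBuffer is owned by us; Swift releases it
    // automatically when it goes out of scope.
    return CIImage(cvPixelBuffer: pixelBuffer)
}
```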

AVfoundation Reverse Video

Submitted by 我的梦境 on 2019-12-04 05:14:36
I tried to make a video play in reverse. While playing the asset in AVPlayer I set rate = -1 to make it play in reverse. But how do I export that video? I looked into the docs and read about AVAssetWriter, sample buffers, and compositions, but didn't find a way to do this. Below are the links I referred to: http://www.raywenderlich.com/13418/how-to-play-record-edit-videos-in-ios Reverse video demo - https://github.com/mikaelhellqvist/ReverseClip The example above no longer works in iOS 8, and it does not reverse the audio either. If anyone gives me a small hint, I can take it further. I have been stuck here for the last 5
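As a hint in the direction asked for: `rate = -1` only affects playback, not export. The usual approach (and roughly what ReverseClip does) is to decode every frame with AVAssetReader, then write the frames back out in reverse order with AVAssetWriter, pairing reversed frame *i* with the *original* timestamp *i* so timing stays monotonic. A hedged sketch of the reading half:

```swift
import AVFoundation

// Decode all video samples of an asset and return them in reverse order.
// Caution: this holds every decoded frame in memory, so it only suits
// short clips; long videos need chunked passes.
func reversedSamples(of asset: AVAsset) throws -> [CMSampleBuffer] {
    guard let track = asset.tracks(withMediaType: .video).first else { return [] }
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                         kCVPixelFormatType_32BGRA])
    reader.add(output)
    reader.startReading()

    var samples: [CMSampleBuffer] = []
    while let sample = output.copyNextSampleBuffer() {
        samples.append(sample)
    }
    // Feed these to an AVAssetWriterInputPixelBufferAdaptor, appending
    // the pixel buffer of reversed frame i at original timestamp i.
    return samples.reversed()
}
```

Audio cannot be reversed by timestamp shuffling alone; the PCM samples inside each buffer must themselves be reversed, which is why the demo linked above leaves audio untouched.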

Error Domain=AVFoundationErrorDomain Code=-11800 “The operation could not be completed” {Error Domain=NSOSStatusErrorDomain Code=-16976 “(null)”}

Submitted by 橙三吉。 on 2019-12-04 04:59:47
Question: I am working on a video application in Swift 3 for iOS. Basically I have to merge video assets and audio into one video with a fade effect and save it to the iPhone gallery. To achieve this, I am using the method below: private func doMerge(arrayVideos:[AVAsset], arrayAudios:[AVAsset], animation:Bool, completion:@escaping Completion) -> Void { var insertTime = kCMTimeZero var audioInsertTime = kCMTimeZero var arrayLayerInstructions:[AVMutableVideoCompositionLayerInstruction] = [] var outputSize = CGSize

CMBlockBufferCreate memory management

Submitted by [亡魂溺海] on 2019-12-04 04:59:24
I have some code that creates CMBlockBuffers, then creates a CMSampleBuffer and passes it to an AVAssetWriterInput. What is the memory-management story here? According to the Apple documentation, anything created with 'Create' in the name should be released with CFRelease. However, if I use CFRelease, my app aborts with 'malloc: *** error for object 0xblahblah: pointer being freed was not allocated'. CMBlockBufferRef tmp_bbuf = NULL; CMBlockBufferRef bbuf = NULL; CMSampleBufferRef sbuf = NULL; status = CMBlockBufferCreateWithMemoryBlock( kCFAllocatorDefault, samples, buflen,
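A hedged reading of that crash: "pointer being freed was not allocated" usually means ownership of the `samples` memory block is being released twice, or by the wrong allocator. The fourth parameter of `CMBlockBufferCreateWithMemoryBlock` (the block allocator) decides who owns `samples`: passing `kCFAllocatorDefault` hands ownership to the block buffer, so freeing `samples` yourself afterwards double-frees; passing `kCFAllocatorNull` keeps ownership with you. In Swift the returned CMBlockBuffer is released automatically; in Objective-C each Create'd object still gets exactly one CFRelease:

```swift
import CoreMedia

// Wrap an existing memory block in a CMBlockBuffer WITHOUT transferring
// ownership: kCFAllocatorNull as blockAllocator means Core Media will
// never try to free `samples`, so the caller remains responsible for it.
func makeBlockBuffer(wrapping samples: UnsafeMutableRawPointer,
                     length: Int) -> CMBlockBuffer? {
    var blockBuffer: CMBlockBuffer?
    let status = CMBlockBufferCreateWithMemoryBlock(
        allocator: kCFAllocatorDefault,
        memoryBlock: samples,
        blockLength: length,
        blockAllocator: kCFAllocatorNull,   // caller keeps ownership
        customBlockSource: nil,
        offsetToData: 0,
        dataLength: length,
        flags: 0,
        blockBufferOut: &blockBuffer)
    return status == kCMBlockBufferNoErr ? blockBuffer : nil
}
```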

Progressive Video Download on iOS

Submitted by 徘徊边缘 on 2019-12-04 04:45:36
I am trying to implement progressive downloading of a video in my iOS application so it can be played through AVPlayer. I have already implemented a downloader module that can download files to the iPad. However, I have discovered I cannot play a file that is still being written to. So, as far as I can tell, my only solution would be to download a list of file 'chunks' and then keep playing through each file as it becomes ready (i.e. downloaded), probably using HLS. Searching, I have come across this question, which implements progressive download through HLS, but other than that, I

AVPlayerLayer shows black screen but sound is working

Submitted by 混江龙づ霸主 on 2019-12-04 04:41:34
Question: I'm trying to display a locally recorded video in an AVPlayerLayer, which only works sometimes. I can hear the audio from the recorded video but can't see the video. Sometimes both video and audio work, sometimes only audio. I've tried both an AVPlayerLayer and an AVPlayerViewController, but the same issue occurs in both cases, so it's not because the frames are wrong. Example code for AVPlayerViewController: let player = AVPlayer(url: url) let playerController = AVPlayerViewController()
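Two common causes of intermittent black video with working audio, offered as a hedged sketch rather than a diagnosis: the layer's frame is zero when it first appears, or playback starts before the layer is ready to display. Observing `AVPlayerLayer.isReadyForDisplay` and setting the frame in `layoutSubviews` covers both:

```swift
import AVFoundation
import UIKit

// A minimal view that waits for the layer to be ready before playing,
// and keeps strong references so the player is not deallocated.
final class PlayerView: UIView {
    private let player: AVPlayer
    private let playerLayer: AVPlayerLayer
    private var readyObservation: NSKeyValueObservation?

    init(url: URL) {
        player = AVPlayer(url: url)
        playerLayer = AVPlayerLayer(player: player)
        super.init(frame: .zero)
        playerLayer.videoGravity = .resizeAspect
        layer.addSublayer(playerLayer)
        // Start playback only once the first frame can actually render.
        readyObservation = playerLayer.observe(\.isReadyForDisplay) {
            [weak self] layer, _ in
            if layer.isReadyForDisplay { self?.player.play() }
        }
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) not supported") }

    override func layoutSubviews() {
        super.layoutSubviews()
        playerLayer.frame = bounds   // a zero frame also renders black
    }
}
```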

AVCaptureSession for audio in simulator

Submitted by 孤人 on 2019-12-04 04:11:45
I'm trying to capture audio using the method in this question, with AVCaptureSession and AVCaptureAudioDataOutput. This works fine, with one inconvenience: it doesn't work in the simulator. Both AVAudioRecorder and the good old SpeakHere demo app work fine in the simulator, using the internal microphone on my MacBook Pro. The problem is that [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio] returns nil in the simulator, so the subsequent code fails with this message (when it tries to add nil as an input to the AVCaptureSession): *** Terminating app due to uncaught exception
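A defensive sketch of the setup, guarding the nil device so the simulator degrades gracefully (the caller could then fall back to AVAudioRecorder, which the question notes does work there):

```swift
import AVFoundation

// Build an audio AVCaptureSession, or return nil when no capture
// device exists (e.g. on the simulator) instead of crashing on addInput.
func makeAudioCaptureSession() -> AVCaptureSession? {
    guard let device = AVCaptureDevice.default(for: .audio),
          let input = try? AVCaptureDeviceInput(device: device) else {
        // Simulator or missing microphone permission: no device available.
        return nil
    }
    let session = AVCaptureSession()
    guard session.canAddInput(input) else { return nil }
    session.addInput(input)
    return session
}
```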