AVFoundation

AVFoundation iOS 5

萝らか妹 submitted on 2019-11-30 01:10:27
Question: My apps running on the App Store use mp3 and video files that no longer work since the iOS 5 update. I've installed Xcode 4.2 and... When I test in the iPhone 5 Simulator or on a device I get the following error (for audio or video files): Error loading System/Library/Extensions/AudioIPCDriver.kext/Contents/Resources/AudioIPCPlugIn.bundle/Contents/MacOS/AudioIPCPlugIn: dlopen(/System/Library/Extensions/AudioIPCDriver.kext/Contents/Resources/AudioIPCPlugIn.bundle/Contents/MacOS/AudioIPCPlugIn, 262):

Combining two .caf files on iPhone

余生颓废 submitted on 2019-11-30 01:01:57
I've looked and looked for an answer, but can't seem to find one. Lots have asked, but none have gotten answers. I have an app that records audio using AVAudioRecorder. Now I just want to merge two or more recordings into one file that can be sent out via email. Does anyone have any clue as to how this can be done? (This answer suggests using something called Audio Queue Services, but I don't know anything about that.) It's not quite as easy as you would think. I used the AVFoundation framework to do exactly what you're asking about to create iAmRingtones. It required creating AVAssets from
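A minimal sketch of the AVFoundation route hinted at above, assuming two recording URLs and an M4A export (the function name and error codes are placeholders, not from the original answer): build an AVMutableComposition, insert both recordings back to back on one audio track, and export with AVAssetExportSession.

import AVFoundation

// Sketch: append two recordings into one composition and export the result.
// `first`, `second` and `output` are assumed file URLs.
func mergeRecordings(first: URL, second: URL, output: URL,
                     completion: @escaping (Error?) -> Void) {
    let composition = AVMutableComposition()
    guard let track = composition.addMutableTrack(withMediaType: .audio,
                                                  preferredTrackID: kCMPersistentTrackID_Invalid) else {
        completion(NSError(domain: "merge", code: -1, userInfo: nil))
        return
    }

    var cursor = CMTime.zero
    for url in [first, second] {
        let asset = AVURLAsset(url: url)
        guard let sourceTrack = asset.tracks(withMediaType: .audio).first else { continue }
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        try? track.insertTimeRange(range, of: sourceTrack, at: cursor)
        cursor = cursor + asset.duration
    }

    guard let exporter = AVAssetExportSession(asset: composition,
                                              presetName: AVAssetExportPresetAppleM4A) else {
        completion(NSError(domain: "merge", code: -2, userInfo: nil))
        return
    }
    exporter.outputURL = output
    exporter.outputFileType = .m4a
    exporter.exportAsynchronously { completion(exporter.error) }
}

The merged file is exported as M4A here, which is convenient for emailing; a different preset and output file type would be needed for other formats.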

Detecting edges of a card with rounded corners

我们两清 submitted on 2019-11-30 00:40:51
Hi, currently I am working on an OCR reading app where I have successfully been able to capture the card image using the AVFoundation framework. As the next step, I need to find the edges of the card so that I can crop the card image out of the main captured image and later send it to the OCR engine for processing. The main problem now is finding the edges of the card, and I am using the code below (taken from another open-source project), which uses OpenCV for this purpose. It works fine if the card is a purely rectangular card or piece of paper, but when I use a card with rounded corners (e.g. a driving license), it fails
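As an alternative to the OpenCV code (a different technique from the one in the question), Apple's Vision framework (iOS 11+) has a rectangle detector that tolerates rounded corners reasonably well. A rough sketch, with the thresholds chosen purely for illustration:

import Vision
import UIKit

// Alternative to the OpenCV approach: Vision's rectangle detector (iOS 11+).
// The thresholds below are illustrative, not taken from the original project.
func detectCard(in image: UIImage, completion: @escaping (VNRectangleObservation?) -> Void) {
    guard let cgImage = image.cgImage else { completion(nil); return }
    let request = VNDetectRectanglesRequest { request, _ in
        completion(request.results?.first as? VNRectangleObservation)
    }
    request.minimumAspectRatio = 0.5   // ID cards are roughly 1.6:1
    request.minimumConfidence = 0.8
    request.maximumObservations = 1
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}

The returned observation's corner points can then be used to build the crop rectangle for the OCR step.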

Swift iOS Record Video and Audio with AVFoundation

泄露秘密 submitted on 2019-11-30 00:03:15
Question: I was able to successfully grab the recorded video by following this question here. Basically: inherit from the AVCaptureFileOutputRecordingDelegate protocol, loop through the available devices, create a session with the camera, start recording, stop recording, and get the recorded video by implementing the above protocol's method. But the file doesn't come with audio. According to this question, I have to record the audio separately and merge the video and audio using the mentioned classes, but I have no idea how to
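The usual fix is to add a microphone input to the same AVCaptureSession before recording with AVCaptureMovieFileOutput, rather than recording the audio separately and merging. A rough sketch, assuming camera and microphone permissions are already granted, with error handling omitted:

import AVFoundation

// Sketch: a capture session whose movie file output records audio as well as video.
func makeSession(delegate: AVCaptureFileOutputRecordingDelegate) -> AVCaptureSession? {
    let session = AVCaptureSession()
    session.sessionPreset = .high

    guard
        let camera = AVCaptureDevice.default(for: .video),
        let microphone = AVCaptureDevice.default(for: .audio),
        let videoInput = try? AVCaptureDeviceInput(device: camera),
        let audioInput = try? AVCaptureDeviceInput(device: microphone),
        session.canAddInput(videoInput), session.canAddInput(audioInput)
    else { return nil }

    session.addInput(videoInput)
    session.addInput(audioInput)   // without this input, the .mov comes out silent

    let movieOutput = AVCaptureMovieFileOutput()
    guard session.canAddOutput(movieOutput) else { return nil }
    session.addOutput(movieOutput)
    session.startRunning()

    let url = FileManager.default.temporaryDirectory.appendingPathComponent("clip.mov")
    movieOutput.startRecording(to: url, recordingDelegate: delegate)
    return session
}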

Overlay Two Videos with AVFoundation

[亡魂溺海] submitted on 2019-11-29 22:45:13
Question: I am trying to overlay two videos, with the foreground video being somewhat alpha transparent. I have been following the Apple docs as well as this tutorial. Whenever I put two copies of the same video through my code it doesn't crash; however, when I feed it two different videos I receive this error: VideoMaskingUtils.exportVideo Error: Optional(Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo={NSLocalizedDescription=Operation Stopped,
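For the composition itself, a minimal sketch (asset names, render size, and the 50% opacity are assumptions): two video tracks in one AVMutableComposition plus an AVMutableVideoComposition whose layer instruction lowers the foreground track's opacity. Error -11841 typically points at an invalid video composition, e.g. instructions that don't cover the full time range or a zero render size.

import AVFoundation

// Sketch: overlay `foreground` on `background` with 50% opacity.
func overlayComposition(background: AVAsset, foreground: AVAsset)
        -> (AVMutableComposition, AVMutableVideoComposition)? {
    let mix = AVMutableComposition()
    guard
        let bgSource = background.tracks(withMediaType: .video).first,
        let fgSource = foreground.tracks(withMediaType: .video).first,
        let bgTrack = mix.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid),
        let fgTrack = mix.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
    else { return nil }

    let duration = CMTimeMinimum(background.duration, foreground.duration)
    let range = CMTimeRange(start: .zero, duration: duration)
    try? bgTrack.insertTimeRange(range, of: bgSource, at: .zero)
    try? fgTrack.insertTimeRange(range, of: fgSource, at: .zero)

    let fgLayer = AVMutableVideoCompositionLayerInstruction(assetTrack: fgTrack)
    fgLayer.setOpacity(0.5, at: .zero)                  // semi-transparent foreground
    let bgLayer = AVMutableVideoCompositionLayerInstruction(assetTrack: bgTrack)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = range
    instruction.layerInstructions = [fgLayer, bgLayer]  // order matters: first is on top

    let videoComposition = AVMutableVideoComposition()
    videoComposition.instructions = [instruction]
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
    videoComposition.renderSize = bgSource.naturalSize
    return (mix, videoComposition)
}

The returned video composition would then be assigned to the AVAssetExportSession's videoComposition property before exporting.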

How to save a TIFF photo from AVFoundation's captureStillImageAsynchronouslyFromConnection to a file with EXIF metadata on an iPhone (iOS)?

对着背影说爱祢 submitted on 2019-11-29 22:43:43
Question: With this question I am only asking about the possibilities I have with Xcode and iOS, without external libraries. I am already exploring the possibility of using libtiff in another question. Problem: I have been sifting through Stack Overflow for weeks and found working solutions for each of my problems on its own. I have 4 things that need to work: I need the RGBA data as it comes from the camera, with no compression whatsoever; I need as much metadata as possible, especially EXIF; I need to save in TIFF
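For the TIFF-plus-metadata part, one route that stays within Apple's frameworks is ImageIO: collect the EXIF attachments from the sample buffer and hand them to a CGImageDestination created with kUTTypeTIFF. A rough sketch (function and parameter names are mine, not from the question):

import AVFoundation
import ImageIO
import MobileCoreServices

// Sketch: write a CGImage to TIFF with a metadata dictionary via ImageIO.
// `metadata` would typically come from CMCopyDictionaryOfAttachments(allocator:target:attachmentMode:)
// on the still-image sample buffer.
func writeTIFF(_ image: CGImage, metadata: [String: Any], to url: URL) -> Bool {
    guard let destination = CGImageDestinationCreateWithURL(url as CFURL,
                                                            kUTTypeTIFF, 1, nil) else { return false }
    // No compression keys are set, so the pixel data is stored uncompressed.
    CGImageDestinationAddImage(destination, image, metadata as CFDictionary)
    return CGImageDestinationFinalize(destination)
}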

Capturing zoomed preview view in AVFoundation

做~自己de王妃 submitted on 2019-11-29 22:32:08
I am working with zoom functionality in an AVFoundation camera; I have implemented zoom by scaling the view that holds the AVCaptureVideoPreviewLayer. Now I want to capture the zoomed image. Here is my code for adding the AVCaptureVideoPreviewLayer to the view: // create a uiview subclass for showing the camera feed UIView *previewView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 430)]; [[self view] addSubview:previewView]; CGRect layerRect = CGRectMake(0, 0, 320, 430); [[self avCaptureVideoPreviewLayer] setBounds:layerRect]; [[self previewLayer] setPosition:CGPointMake(CGRectGetMidX(layerRect),
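Scaling the view only zooms the on-screen preview; the still image is still captured unzoomed. One option (a different approach from the view scaling above, available since iOS 7) is to set videoZoomFactor on the AVCaptureDevice so the zoom is applied to the captured image as well. A Swift sketch:

import AVFoundation

// Alternative to scaling the preview view: zoom the capture device itself,
// so both the preview and the captured photo are zoomed.
func setZoom(_ factor: CGFloat, on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        device.videoZoomFactor = min(max(factor, 1.0),
                                     device.activeFormat.videoMaxZoomFactor)
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}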

AVAudioSession setCategory Swift 4.2 iOS 12 - Play Sound on Silent

流过昼夜 submitted on 2019-11-29 22:07:01
To play sound even in Silent mode I used to use the method below, but now it's not working. // Works on Swift 3 do { try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback) } catch { print(error) } How do I get it to work in 4.2 / iOS 12? In the newer version we need to set the mode and options as well. try AVAudioSession.sharedInstance().setCategory( <#T##category:AVAudioSession.Category##AVAudioSession.Category#>, mode: <#T##AVAudioSession.Mode#>, options: <#T##AVAudioSession.CategoryOptions#>) Her der Töne's comment shows you the new syntax, but you also need to activate the audio session
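Filling in the placeholders, a minimal Swift 4.2 version might look like this (the .default mode and empty options are assumptions; pick whatever fits the app):

import AVFoundation

// Swift 4.2 / iOS 12 form: set category, mode and options, then activate the session.
do {
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print(error)
}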

How to create a video from its frames on iPhone

风格不统一 submitted on 2019-11-29 21:56:53
I have done some R&D and succeeded in getting frames, as images, from a video file played in MPMoviePlayerController. I got all the frames with this code, and saved all the images into one array. for(int i= 1; i <= moviePlayerController.duration; i++) { UIImage *img = [moviePlayerController thumbnailImageAtTime:i timeOption:MPMovieTimeOptionNearestKeyFrame]; [arrImages addObject:img]; } Now the question is: after changing some of the images, like adding emoticons to them and also adding filters such as movie reel or black and white, how can we create a video again and store the same video in
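The usual tool for turning an image array back into a movie is AVAssetWriter with a pixel-buffer adaptor. A rough sketch, assuming one frame per second and an H.264 .mp4 output (the helper that converts a UIImage to a CVPixelBuffer is mine, not from the question):

import AVFoundation
import UIKit

// Sketch: write an array of UIImages to an H.264 .mp4 at one frame per second.
func writeVideo(from images: [UIImage], size: CGSize, to url: URL,
                completion: @escaping () -> Void) throws {
    let writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: size.width,
        AVVideoHeightKey: size.height
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    for (index, image) in images.enumerated() {
        while !input.isReadyForMoreMediaData { usleep(10_000) }   // crude back-pressure
        if let buffer = pixelBuffer(from: image, size: size) {
            _ = adaptor.append(buffer, withPresentationTime: CMTime(value: CMTimeValue(index), timescale: 1))
        }
    }
    input.markAsFinished()
    writer.finishWriting(completionHandler: completion)
}

// Draw a UIImage into a CVPixelBuffer so the adaptor can consume it.
private func pixelBuffer(from image: UIImage, size: CGSize) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, Int(size.width), Int(size.height),
                        kCVPixelFormatType_32ARGB, nil, &buffer)
    guard let pixelBuffer = buffer, let cgImage = image.cgImage else { return nil }
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }
    let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                            width: Int(size.width), height: Int(size.height),
                            bitsPerComponent: 8,
                            bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    context?.draw(cgImage, in: CGRect(origin: .zero, size: size))
    return pixelBuffer
}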

iOS AVFoundation tap to focus

烈酒焚心 submitted on 2019-11-29 21:30:23
I am trying to create a camera app which would act, more or less, like the default camera app. The thing that is not working for me at the moment is tap to focus. I want the camera to focus and do whatever it does at the point I touch, just like the real camera app does. Here's my viewDidLoad: - (void)viewDidLoad { [super viewDidLoad]; // Session _session = [[AVCaptureSession alloc] init]; _session.sessionPreset = AVCaptureSessionPresetPhoto; // Input _videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; _videoInput = [AVCaptureDeviceInput deviceInputWithDevice:
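The missing piece is usually the tap handler: convert the tap location through the preview layer into device coordinates and set the focus (and exposure) point of interest while the device is locked for configuration. A Swift sketch of that handler (the question's code is Objective-C; the idea is the same):

import AVFoundation
import UIKit

// Sketch of tap-to-focus: map the tap point into the device's coordinate space
// and point focus and exposure at it.
func focus(at tapPoint: CGPoint, previewLayer: AVCaptureVideoPreviewLayer,
           device: AVCaptureDevice) {
    let devicePoint = previewLayer.captureDevicePointConverted(fromLayerPoint: tapPoint)
    do {
        try device.lockForConfiguration()
        if device.isFocusPointOfInterestSupported {
            device.focusPointOfInterest = devicePoint
            device.focusMode = .autoFocus
        }
        if device.isExposurePointOfInterestSupported {
            device.exposurePointOfInterest = devicePoint
            device.exposureMode = .autoExpose
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device: \(error)")
    }
}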