AVFoundation

Haptic feedback not playing nice with AVFoundation? (UIImpactFeedbackGenerator, etc.)

 ̄綄美尐妖づ submitted on 2019-12-08 01:42:05
Question: I'm trying to have a video / camera view in the background while I also allow for haptic feedback in my app for various actions, but it seems that AVFoundation is not playing nice with any of the calls I am making that involve haptics:

    if #available(iOS 10.0, *) {
        let generator = UIImpactFeedbackGenerator(style: .light)
        generator.prepare()
        generator.impactOccurred()

        // More:
        let feedbackGenerator = UISelectionFeedbackGenerator()
        feedbackGenerator.selectionChanged()
    }
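
A likely explanation, not stated in the question: while a capture session has an audio input (or the audio session is in a record-capable category), the system suppresses haptics. On iOS 13+ you can opt back in; a minimal sketch, assuming the capture session is configured elsewhere:

    import AVFoundation
    import UIKit

    func fireHapticDuringCapture() {
        // An active record-capable audio session silences haptics by
        // default; iOS 13+ exposes an explicit opt-in.
        if #available(iOS 13.0, *) {
            try? AVAudioSession.sharedInstance()
                .setAllowHapticsAndSystemSoundsDuringRecording(true)
        }

        let generator = UIImpactFeedbackGenerator(style: .light)
        generator.prepare()
        generator.impactOccurred()
    }

On iOS 12 and earlier there is no supported opt-in; a common workaround is to leave the audio input out of the capture session while haptics matter.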

Scanning barcode with AVCaptureMetadataOutput and AVFoundation

喜夏-厌秋 submitted on 2019-12-08 01:32:48
Question: I am using AVFoundation and AVCaptureMetadataOutput to scan a QR barcode in iOS 7. I present a view controller which allows the user to scan a barcode. It is working fine, i.e. a barcode is scanned and I can output the barcode string to the console, but it keeps scanning over and over again (see screenshot). What I want is to scan the barcode just once and then dismiss the view controller. Here is my code for the delegate method:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
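
A Swift sketch of the usual fix (class and property names are mine, not from the question): guard with a flag so only the first callback counts, stop the session, then dismiss:

    import AVFoundation
    import UIKit

    final class ScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
        let session = AVCaptureSession()
        private var didScan = false  // ignore the repeated delegate callbacks

        func metadataOutput(_ output: AVCaptureMetadataOutput,
                            didOutput metadataObjects: [AVMetadataObject],
                            from connection: AVCaptureConnection) {
            guard !didScan,
                  let code = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
                  let value = code.stringValue else { return }
            didScan = true
            session.stopRunning()  // no further metadata is delivered
            print("Scanned: \(value)")
            dismiss(animated: true)
        }
    }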

Speech synthesis on iOS: weird errors on loading, and no concurrency

∥☆過路亽.° submitted on 2019-12-08 00:01:52
Question: I'm using the speech synth in AVFoundation, creating an instance of a voice like this:

    import AVFoundation

    class CanSpeak {
        let voices = AVSpeechSynthesisVoice.speechVoices()
        let voiceSynth = AVSpeechSynthesizer()
        var voiceToUse: AVSpeechSynthesisVoice?

        init() {
            for voice in voices {
                if voice.name == "Arthur" {
                    voiceToUse = voice
                }
            }
        }

        func sayThis(_ phrase: String) {
            let utterance = AVSpeechUtterance(string: phrase)
            utterance.voice = voiceToUse
            utterance.rate = 0.5
            voiceSynth.speak(utterance)
        }
    }
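
Two points worth separating here: AVSpeechSynthesizer queues utterances serially by design, so two phrases will never speak concurrently, and the synthesizer must stay strongly referenced or speech stops mid-utterance. A sketch using the delegate to learn when an utterance finishes (names are illustrative):

    import AVFoundation

    final class SpeechQueue: NSObject, AVSpeechSynthesizerDelegate {
        private let synth = AVSpeechSynthesizer()  // strong reference kept for the object's lifetime

        override init() {
            super.init()
            synth.delegate = self
        }

        func say(_ phrase: String) {
            let utterance = AVSpeechUtterance(string: phrase)
            utterance.rate = 0.5
            synth.speak(utterance)  // enqueued; plays after the current utterance ends
        }

        func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer,
                               didFinish utterance: AVSpeechUtterance) {
            print("Finished: \(utterance.speechString)")
        }
    }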

Routing audio input to receive from TOP microphone on iPhone

半城伤御伤魂 submitted on 2019-12-07 21:45:03
Question: I am writing a little app to record multiple tracks and play them back over one another. I am using the PlayAndRecord category and routing my output to the main speaker. The problem is that the bottom microphone is still being used for input, so when I record, the output from the other tracks comes through really loudly on the new track. Here is what I have so far:

    audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
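
On iPhones with multiple built-in mics, the top/front capsule is exposed as a data source on the built-in-mic port, and you can ask for it explicitly. A sketch, assuming the iOS 7+ data-source APIs (the exact orientation reported varies by model, and this reduces but may not eliminate bleed):

    import AVFoundation

    func preferTopMicrophone() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, options: .defaultToSpeaker)
        try session.setActive(true)

        guard let builtInMic = session.availableInputs?
            .first(where: { $0.portType == .builtInMic }) else { return }
        try session.setPreferredInput(builtInMic)

        // Data sources are named by physical placement; the screen-side /
        // top mic reports .front or .top depending on the model.
        if let top = builtInMic.dataSources?
            .first(where: { $0.orientation == .front || $0.orientation == .top }) {
            try builtInMic.setPreferredDataSource(top)
        }
    }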

iOS: Toggle AVFoundation Camera

荒凉一梦 submitted on 2019-12-07 21:10:00
Question: In my app I'm capturing images using AVFoundation. I made a button to switch between the front and back cameras, but it won't work. Here's the code I used:

    if (captureDevice.position == AVCaptureDevicePositionFront) {
        for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
            if (device.position == AVCaptureDevicePositionBack) {
                NSError *error;
                AVCaptureDeviceInput *newDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
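
A common cause: the old input is never removed, so the session rejects the new one. A Swift sketch of a working toggle inside a begin/commitConfiguration pair (the function shape is mine):

    import AVFoundation

    func toggleCamera(on session: AVCaptureSession,
                      replacing currentInput: AVCaptureDeviceInput) -> AVCaptureDeviceInput? {
        let newPosition: AVCaptureDevice.Position =
            currentInput.device.position == .front ? .back : .front
        guard let newDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                      for: .video,
                                                      position: newPosition),
              let newInput = try? AVCaptureDeviceInput(device: newDevice) else { return nil }

        session.beginConfiguration()
        session.removeInput(currentInput)   // remove the old input first
        if session.canAddInput(newInput) {
            session.addInput(newInput)
        } else {
            session.addInput(currentInput)  // roll back if the new input is rejected
        }
        session.commitConfiguration()
        return newInput
    }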

Accessing Individual Frames using AVPlayer

眉间皱痕 submitted on 2019-12-07 18:52:42
Question: In a recent project, I have to access all the frames of my video individually using AVFoundation, and if possible access them randomly (like an array). I tried to research the question but didn't find anything useful. Note: is there any useful documentation for getting familiar with AVFoundation?

Answer 1: You can enumerate the frames of your video serially using AVAssetReader, like this:

    let asset = AVAsset(URL: inputUrl)
    let reader = try! AVAssetReader(asset: asset)
    let videoTrack = asset
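
Completing the idea as a sketch (`inputUrl` and the pixel-format choice are assumptions): AVAssetReader hands you every frame in order as CMSampleBuffers:

    import AVFoundation

    func readFrames(from inputUrl: URL) throws {
        let asset = AVAsset(url: inputUrl)
        guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }

        let reader = try AVAssetReader(asset: asset)
        let output = AVAssetReaderTrackOutput(
            track: videoTrack,
            outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                             kCVPixelFormatType_32BGRA])
        reader.add(output)
        guard reader.startReading() else { return }

        while let sampleBuffer = output.copyNextSampleBuffer() {
            if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
                _ = pixelBuffer  // process the frame's pixels here
            }
        }
    }

For array-like random access, AVAssetImageGenerator's copyCGImage(at:actualTime:) returns the frame nearest a requested time; setting requestedTimeToleranceBefore/After to .zero lands on exact frames at some performance cost. Apple's AVFoundation Programming Guide is the standard starting documentation.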

AVCaptureVideoPreviewLayer Not Filling Bounds

隐身守侯 submitted on 2019-12-07 18:48:23
Question: I have an AVCaptureVideoPreviewLayer and I want it to fill the bounds of my UIView. It doesn't seem to be working correctly, though: it only covers around 3/4 of the screen.

    CGRect bounds = self.previewView.layer.bounds;
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    self.previewLayer.bounds = bounds;
    self.previewLayer.position = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
    [self.previewView.layer addSublayer:self.previewLayer];

Image: http://cl.ly/image
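
The usual cause is reading the view's bounds before Auto Layout has finished, so the layer is sized against pre-layout geometry. A sketch that re-syncs the layer on every layout pass (controller and property names are mine):

    import AVFoundation
    import UIKit

    final class PreviewViewController: UIViewController {
        var previewLayer: AVCaptureVideoPreviewLayer!  // assumed created elsewhere

        override func viewDidLayoutSubviews() {
            super.viewDidLayoutSubviews()
            previewLayer.videoGravity = .resizeAspectFill
            previewLayer.frame = view.bounds  // final, post-layout bounds
        }
    }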

iOS: can I use AVAudioPlayer in the appDelegate?

流过昼夜 submitted on 2019-12-07 17:32:07
Question: I have a TabBarController with two tabs and I want to play music on both tabs. Right now I have my code in the main appDelegate:

    NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"My Song" ofType:@"m4a"]]; // My Song.m4a
    NSError *error;
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    if (error) {
        NSLog(@"Error in audioPlayer: %@", [error localizedDescription]);
    } else {
        //audioPlayer.delegate = self;
        [audioPlayer prepareToPlay];
    }
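
Yes, this works as long as something keeps a strong reference to the player. A Swift sketch of an alternative that avoids coupling playback to the app delegate (the type is mine; the resource name comes from the question):

    import AVFoundation

    final class MusicPlayer {
        static let shared = MusicPlayer()
        private var player: AVAudioPlayer?  // strong reference; playback stops if this is released

        func play(resource: String, withExtension ext: String = "m4a") {
            guard let url = Bundle.main.url(forResource: resource, withExtension: ext) else { return }
            do {
                player = try AVAudioPlayer(contentsOf: url)
                player?.prepareToPlay()
                player?.play()
            } catch {
                print("Error in audioPlayer: \(error.localizedDescription)")
            }
        }
    }

    // Usage from either tab's view controller:
    // MusicPlayer.shared.play(resource: "My Song")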

How do I make AVCaptureSession and AVPlayer respect AVAudioSessionCategoryAmbient?

陌路散爱 submitted on 2019-12-07 17:13:45
Question: I'm making an app that records (AVCaptureSession) and plays (AVPlayerLayer) video. I'd like to be able to do this without pausing background audio from other apps, and I'd like the playback to respect the mute switch. In the AppDelegate I have set AVAudioSessionCategoryAmbient; according to the docs: "The category for an app in which sound playback is nonprimary—that is, your app can be used successfully with the sound turned off. This category is also appropriate for “play
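
One detail the category alone does not cover: by default AVCaptureSession reconfigures the app's audio session for recording, which replaces the ambient category. A sketch of opting out (assuming audio capture is not needed):

    import AVFoundation

    func configureQuietCapture(_ captureSession: AVCaptureSession) {
        // Stop the capture session from installing its own record-style
        // audio session over the ambient one set in the AppDelegate.
        captureSession.automaticallyConfiguresApplicationAudioSession = false
        try? AVAudioSession.sharedInstance()
            .setCategory(.ambient, options: [.mixWithOthers])
    }

With .ambient active, AVPlayer output follows the ring/silent switch and mixes with audio from other apps.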

How to add multiple CALayers to a video file using AVMutableComposition on iOS

隐身守侯 submitted on 2019-12-07 16:44:16
Question: I want to add multiple CALayers one after another in time sequence. I can add one layer to a video file using this link Here. Now my question is: how can I add multiple CALayers to a video file? Thanks in advance.

Answer 1: The most straightforward way is to bundle several layers into a single parent layer. You will have to add instructions to show each layer at some point and remove it when it is no longer needed. Something like this:

    CABasicAnimation *fadeAnimation = [CABasicAnimation animationWithKeyPath:@"opacity"];
    fadeAnimation
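
A Swift sketch of the same idea (overlay contents, slot lengths, and names are assumptions): give every sublayer a timed opacity animation anchored at AVCoreAnimationBeginTimeAtZero, then hand the parent layer to the export's animation tool:

    import AVFoundation
    import QuartzCore

    func makeOverlayParentLayer(videoSize: CGSize, overlays: [CALayer]) -> CALayer {
        let parentLayer = CALayer()
        parentLayer.frame = CGRect(origin: .zero, size: videoSize)

        for (index, overlay) in overlays.enumerated() {
            overlay.opacity = 0  // hidden except during its slot
            let show = CABasicAnimation(keyPath: "opacity")
            show.fromValue = 1
            show.toValue = 1
            // AVCoreAnimationBeginTimeAtZero marks the start of the video
            // timeline; each overlay gets a 2-second slot (assumed).
            show.beginTime = AVCoreAnimationBeginTimeAtZero + Double(index) * 2.0
            show.duration = 2.0
            show.isRemovedOnCompletion = false  // keep the animation attached for export
            overlay.add(show, forKey: "visibility")
            parentLayer.addSublayer(overlay)
        }
        return parentLayer
    }

    // Wiring into the export (videoComposition is an AVMutableVideoComposition):
    // let videoLayer = CALayer(); videoLayer.frame = parentLayer.frame
    // parentLayer.insertSublayer(videoLayer, at: 0)
    // videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
    //     postProcessingAsVideoLayer: videoLayer, in: parentLayer)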