AVFoundation

AVFoundation audio processing using AVPlayer's MTAudioProcessingTap with remote URLs

Submitted by 故事扮演 on 2019-12-03 03:35:24
There is precious little documentation on AVAudioMix and MTAudioProcessingTap, which allow processing to be applied to the audio tracks (PCM access) of media assets in AVFoundation (on iOS). This article and a brief mention in a WWDC 2012 session are all I have found. I have the setup described here working for local media files, but it doesn't seem to work with remote files (namely HLS streaming URLs). The only indication that this is expected is the note at the end of this Technical Q&A: AVAudioMix only supports file-based assets. Does anyone know more about this? Is there really no way
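For reference, a minimal sketch of the tap setup that works for file-based assets (assuming an AVURLAsset named asset and an AVPlayerItem named playerItem; the callback name tapProcess is illustrative, and MediaToolbox must be linked). For an HLS asset, tracksWithMediaType: typically returns no audio tracks, which is consistent with the "file-based assets only" note:

    #import <AVFoundation/AVFoundation.h>
    #import <MediaToolbox/MediaToolbox.h>

    // Process callback: pull the source PCM into bufferListInOut; the
    // samples may be modified in place before they are rendered.
    static void tapProcess(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                           MTAudioProcessingTapFlags flags,
                           AudioBufferList *bufferListInOut,
                           CMItemCount *numberFramesOut,
                           MTAudioProcessingTapFlags *flagsOut) {
        MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                           flagsOut, NULL, numberFramesOut);
    }

    // After the asset's audio tracks have loaded:
    MTAudioProcessingTapCallbacks callbacks = {
        .version = kMTAudioProcessingTapCallbacksVersion_0,
        .clientInfo = NULL,
        .init = NULL, .prepare = NULL, .unprepare = NULL, .finalize = NULL,
        .process = tapProcess,
    };
    MTAudioProcessingTapRef tap = NULL;
    OSStatus status = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
        kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
    if (status == noErr) {
        // For HLS this array is usually empty, so there is no track to tap.
        AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
        AVMutableAudioMixInputParameters *params =
            [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
        params.audioTapProcessor = tap;
        AVMutableAudioMix *mix = [AVMutableAudioMix audioMix];
        mix.inputParameters = @[params];
        playerItem.audioMix = mix;
        CFRelease(tap); // the audio mix retains the tap
    }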

iOS 7 AVCaptureMetadataOutput delegate (QRCode scanner)

Submitted by 我的梦境 on 2019-12-03 03:25:35
Question: I'm trying to implement a QRCode scanner with the new iOS 7 features, but my code isn't calling the main AVCaptureMetadataOutputObjectsDelegate method. I've used the AVFoundation camera before, and with my current implementation I've got the preview layer running without a problem. Even switching my output back to AVCaptureVideoDataOutput validates my session setup. I'm using this NSHipster post as a guideline and here's my code so far: Interface: @import AVFoundation; @interface
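A common cause of the delegate never firing is setting metadataObjectTypes before the output has been added to the session: availableMetadataObjectTypes is empty until then, so the QR type is silently rejected. A minimal sketch of the setup order that works (assuming self conforms to AVCaptureMetadataOutputObjectsDelegate; the method name setupScanner is illustrative):

    - (void)setupScanner {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        AVCaptureDevice *device =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        NSError *error = nil;
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
        if (input) { [session addInput:input]; }

        AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
        // Add the output to the session BEFORE restricting the metadata
        // types; until then availableMetadataObjectTypes is empty.
        [session addOutput:output];
        [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
        output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode];

        [session startRunning];
    }

    // Delegate callback, in the same class:
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputMetadataObjects:(NSArray *)metadataObjects
           fromConnection:(AVCaptureConnection *)connection {
        for (AVMetadataMachineReadableCodeObject *code in metadataObjects) {
            if ([code.type isEqualToString:AVMetadataObjectTypeQRCode]) {
                NSLog(@"QR payload: %@", code.stringValue);
            }
        }
    }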

YouTube Live on iOS?

Submitted by 时光毁灭记忆、已成空白 on 2019-12-03 03:17:57
Question: The docs are a little hard to parse here. I was wondering whether there is any way to (1) stream YouTube Live into an iOS app, without significant (or any) YouTube branding, or (2) stream from an iOS device as a broadcast stream for YouTube Live. My initial Googling turned up mixed responses. I was hoping to see an example of this if it's possible, or to save myself some time if it's not. Suppose I have a person on AT&T next to a person on Verizon streaming content, and I want to make both appear as a single

How to output a CIFilter to a camera view?

Submitted by 浪子不回头ぞ on 2019-12-03 03:08:30
I'm just starting out in Objective-C and I'm trying to create a simple app that shows the camera view with a blur effect applied. I got the camera output working with the AVFoundation framework. Now I'm trying to hook up the Core Image framework, but I don't know how: Apple's documentation is confusing to me, and searching for guides and tutorials online turns up no results. Thanks in advance for the help. #import "ViewController.h" #import <AVFoundation/AVFoundation.h> @interface ViewController () @property (strong, nonatomic) CIContext *context; @end @implementation ViewController
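One common approach (a sketch, not the only way): capture frames with AVCaptureVideoDataOutput, wrap each CVPixelBuffer in a CIImage, run it through CIGaussianBlur, and render with the CIContext the question already declares. The imageView outlet here is a hypothetical name for wherever the result is displayed:

    // AVCaptureVideoDataOutputSampleBufferDelegate callback
    - (void)captureOutput:(AVCaptureOutput *)output
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *inputImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

        CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
        [blur setValue:inputImage forKey:kCIInputImageKey];
        [blur setValue:@8.0 forKey:kCIInputRadiusKey];
        CIImage *outputImage = blur.outputImage;

        // Reuse self.context; creating a CIContext per frame is very slow.
        CGImageRef cgImage = [self.context createCGImage:outputImage
                                                fromRect:inputImage.extent];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView.image = [UIImage imageWithCGImage:cgImage];
            CGImageRelease(cgImage);
        });
    }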

AVAssetWriter: how to write down-sampled/compressed m4a/mp3 files

Submitted by 為{幸葍}努か on 2019-12-03 03:04:57
I'm trying to take a local m4a or mp3 file and compress/down-sample it (for the purposes of making a smaller file). Originally, I was using AVAssetExportSession to export an AVAsset to a temp directory, but I didn't have any control over compression/down-sampling (you can only use presets, and of those, only the .wav file format supports quality degradation). Then, following several examples here on SO, I tried using AVAssetReader/AVAssetWriter to perform this 'export'. I create my reader/writer as such: NSString *exportPath = [NSHomeDirectory() stringByAppendingPathComponent:@"out.m4a
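The knob the export-session presets don't expose is the writer input's outputSettings, where sample rate and bit rate can be set explicitly. A sketch of the two settings dictionaries (assuming an AVAssetTrack named audioTrack; the 22050 Hz / 64 kbps values are illustrative):

    // Reader side: decode the source to LPCM.
    NSDictionary *readerSettings = @{ AVFormatIDKey : @(kAudioFormatLinearPCM) };
    AVAssetReaderTrackOutput *readerOutput =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack
                                                   outputSettings:readerSettings];

    // Writer side: re-encode to AAC at a lower sample rate / bit rate.
    AudioChannelLayout layout = {0};
    layout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
    NSDictionary *writerSettings = @{
        AVFormatIDKey         : @(kAudioFormatMPEG4AAC),
        AVSampleRateKey       : @22050.0,
        AVNumberOfChannelsKey : @2,
        AVEncoderBitRateKey   : @64000,
        AVChannelLayoutKey    : [NSData dataWithBytes:&layout length:sizeof(layout)],
    };
    AVAssetWriterInput *writerInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                           outputSettings:writerSettings];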

AVCaptureVideoPreviewLayer landscape orientation

Submitted by 陌路散爱 on 2019-12-03 02:19:08
How can I get 'AVCaptureVideoPreviewLayer' to display properly in landscape orientation? It works fine in portrait, but it doesn't rotate and shows a rotated camera capture when the parent view controller is in landscape orientation. First, the answer: - (void)viewWillLayoutSubviews { _captureVideoPreviewLayer.frame = self.view.bounds; if (_captureVideoPreviewLayer.connection.supportsVideoOrientation) { _captureVideoPreviewLayer.connection.videoOrientation = [self interfaceOrientationToVideoOrientation:[UIApplication sharedApplication].statusBarOrientation]; } } - (AVCaptureVideoOrientation
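The excerpt cuts off mid-declaration of the interfaceOrientationToVideoOrientation: helper. The mapping it performs is standard, so a plausible completion looks like this (a sketch, not necessarily the original poster's exact body):

    - (AVCaptureVideoOrientation)interfaceOrientationToVideoOrientation:(UIInterfaceOrientation)orientation {
        switch (orientation) {
            case UIInterfaceOrientationPortrait:
                return AVCaptureVideoOrientationPortrait;
            case UIInterfaceOrientationPortraitUpsideDown:
                return AVCaptureVideoOrientationPortraitUpsideDown;
            case UIInterfaceOrientationLandscapeLeft:
                return AVCaptureVideoOrientationLandscapeLeft;
            case UIInterfaceOrientationLandscapeRight:
                return AVCaptureVideoOrientationLandscapeRight;
            default: // UIInterfaceOrientationUnknown
                return AVCaptureVideoOrientationPortrait;
        }
    }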

How can I do fast image processing from the iPhone camera?

Submitted by 好久不见. on 2019-12-03 02:11:01
Question: I am trying to write an iPhone application that will do some real-time camera image processing. I used the example presented in the AVFoundation docs as a starting point: setting up a capture session, making a UIImage from the sample buffer data, then drawing the image at a point via -setNeedsDisplay, which I call on the main thread. This works, but it is fairly slow (50 ms per frame, measured between -drawRect: calls, for a 192 x 144 preset), and I've seen applications on the App Store which
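Much of that 50 ms is typically spent constructing a UIImage per frame and drawing it; operating on the CVPixelBuffer bytes directly is far cheaper. A sketch of that approach (requesting BGRA so the bytes need no colour conversion; the per-pixel work itself is left as a placeholder):

    // Ask the capture output for BGRA frames.
    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoOutput.videoSettings =
        @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

    // In the sample buffer delegate, touch the pixels in place:
    CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(buffer, 0);
    uint8_t *base = CVPixelBufferGetBaseAddress(buffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(buffer);
    size_t width = CVPixelBufferGetWidth(buffer);
    size_t height = CVPixelBufferGetHeight(buffer);
    // ... per-pixel work on base[row * bytesPerRow + col * 4] ...
    CVPixelBufferUnlockBaseAddress(buffer, 0);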

AVCaptureStillImageOutput vs AVCapturePhotoOutput in Swift 3

Submitted by 浪尽此生 on 2019-12-03 02:08:40
I am trying to simply put a camera view in my view controller. I imported AVFoundation at the top, as well as the UIImagePickerControllerDelegate and UINavigationControllerDelegate protocols. However, whenever I try to use AVCaptureStillImageOutput, Xcode tells me that it was deprecated in iOS 10 and that I should use AVCapturePhotoOutput. That is completely fine; however, as soon as I want to call stillImageOutput.outputSettings, .outputSettings itself is not available. Thus, I have to use AVCaptureStillImageOutput for it to work, but then I get multiple warnings because this class was deprecated in
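The missing property is by design: AVCapturePhotoOutput has no outputSettings; each capture carries its own AVCapturePhotoSettings instead. A sketch of the iOS 10 shape, in Objective-C to match the rest of this page (the question is in Swift 3, but the API structure is identical; assumes self adopts AVCapturePhotoCaptureDelegate):

    // Setup (e.g. in viewDidLoad), with an existing AVCaptureSession:
    AVCapturePhotoOutput *photoOutput = [[AVCapturePhotoOutput alloc] init];
    [session addOutput:photoOutput];

    // Settings now travel with each capture request, not with the output:
    AVCapturePhotoSettings *settings = [AVCapturePhotoSettings
        photoSettingsWithFormat:@{ AVVideoCodecKey : AVVideoCodecJPEG }];
    [photoOutput capturePhotoWithSettings:settings delegate:self];

    // iOS 10 delegate callback delivering the JPEG:
    - (void)captureOutput:(AVCapturePhotoOutput *)output
    didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer
    previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer
            resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
             bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings
                       error:(NSError *)error {
        NSData *jpeg = [AVCapturePhotoOutput
            JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer
                                  previewPhotoSampleBuffer:previewPhotoSampleBuffer];
        // ... use jpeg ...
    }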

AVSpeechSynthesizer error AudioSession

Submitted by £可爱£侵袭症+ on 2019-12-03 01:38:02
I'm playing around with AVSpeechSynthesizer and always getting these errors: ERROR: >aqsrv> 65: Exception caught in (null) - error -66634 ERROR: AVAudioSessionUtilities.h:88: GetProperty_DefaultToZero: AudioSessionGetProperty ('disa') failed with error: '?ytp' My code is: AVSpeechSynthesizer *synthesizer = [[AVSpeechSynthesizer alloc] init]; [synthesizer setDelegate:self]; speechSpeed = AVSpeechUtteranceMinimumSpeechRate; AVSpeechUtterance *synUtt = [[AVSpeechUtterance alloc] initWithString:[[self text] text]]; [synUtt setRate:speechSpeed]; [synUtt setVoice:[AVSpeechSynthesisVoice
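Those two log lines are widely reported with AVSpeechSynthesizer and appear to be internal audio-session probing rather than a failure caused by the quoted code; speech generally plays anyway. For comparison, a minimal complete version of the same setup (the utterance text and voice language are placeholders):

    AVSpeechSynthesizer *synthesizer = [[AVSpeechSynthesizer alloc] init];
    synthesizer.delegate = self;

    AVSpeechUtterance *utterance =
        [[AVSpeechUtterance alloc] initWithString:@"Hello, world"];
    utterance.rate = AVSpeechUtteranceMinimumSpeechRate; // slowest rate
    utterance.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-US"];

    [synthesizer speakUtterance:utterance];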

Core Animation, AVFoundation, and the ability to export video

Submitted by 邮差的信 on 2019-12-03 01:32:34
Question: I'm looking for the correct way to export my picture sequence into a QuickTime video. I know that AVFoundation has the ability to merge or recombine videos and also to add an audio track, building a single video asset. Now... my goal is a little bit different. I want to create a video from scratch. I have a set of UIImage objects and I need to render all of them into a single video. I read all the Apple documentation about AVFoundation and I found the AVVideoCompositionCoreAnimationTool class that
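AVVideoCompositionCoreAnimationTool composites layers over an existing video track; for building a movie purely from still images, AVAssetWriter with a pixel buffer adaptor is the usual route. A sketch (outputURL, the 640x480 size, and the 30 fps timing are assumptions, and converting each UIImage to a CVPixelBufferRef is left as a placeholder):

    NSError *error = nil;
    AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:outputURL
                                                      fileType:AVFileTypeQuickTimeMovie
                                                         error:&error];
    NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                AVVideoWidthKey  : @640,
                                AVVideoHeightKey : @480 };
    AVAssetWriterInput *input =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:settings];
    AVAssetWriterInputPixelBufferAdaptor *adaptor =
        [AVAssetWriterInputPixelBufferAdaptor
            assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                       sourcePixelBufferAttributes:nil];
    [writer addInput:input];
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];

    // For each UIImage: render its CGImage into a CVPixelBufferRef, then
    // append it at that frame's presentation time, e.g. for frame i at 30 fps:
    // [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(i, 30)];

    [input markAsFinished];
    [writer finishWritingWithCompletionHandler:^{ /* export complete */ }];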