avfoundation

How to output a CIFilter to a Camera view?

Submitted by 若如初见. on 2019-12-20 14:44:49
Question: I'm just starting out in Objective-C, and I'm trying to create a simple app that shows the camera view with a blur effect applied. I got the camera output working with the AVFoundation framework. Now I'm trying to hook up the Core Image framework, but I have no idea how to; Apple's documentation is confusing to me, and searching for guides and tutorials online turns up nothing. Thanks in advance for the help. #import "ViewController.h" #import <AVFoundation/AVFoundation.h> @interface
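The usual approach is to route camera frames through an AVCaptureVideoDataOutput delegate, wrap each frame in a CIImage, apply the filter, and render the result yourself instead of using a plain preview layer. A minimal Objective-C sketch, assuming the capture session is already configured and that `self.ciContext` and `self.imageView` are properties you have set up (those names are illustrative, not from the question):

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// Delegate callback for AVCaptureVideoDataOutput: filter each frame and display it.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *cameraImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // Apply the blur filter to the live frame.
    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setValue:cameraImage forKey:kCIInputImageKey];
    [blur setValue:@8.0 forKey:kCIInputRadiusKey];
    CIImage *result = [blur valueForKey:kCIOutputImageKey];

    // Render and display on the main thread. A CIContext backed by
    // EAGL/Metal is considerably faster than a CPU context for live video.
    CGImageRef cgImage = [self.ciContext createCGImage:result
                                              fromRect:cameraImage.extent];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
    });
}
```

Cropping the output to the input's extent matters because CIGaussianBlur expands the image's extent by the blur radius.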

AVFoundation audio processing using AVPlayer's MTAudioProcessingTap with remote URLs

Submitted by 被刻印的时光 ゝ on 2019-12-20 14:18:06
Question: There is precious little documentation on AVAudioMix and MTAudioProcessingTap, which allow processing (PCM access) to be applied to the audio tracks of media assets in AVFoundation (on iOS). This article and a brief mention in a WWDC 2012 session are all I have found. I have the setup described there working for local media files, but it doesn't seem to work with remote files (namely HLS streaming URLs). The only indication that this is expected is the note at the end of this Technical Q&A:
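For reference, the local-file setup the question alludes to looks roughly like the sketch below: create a tap with a process callback, attach it to AVMutableAudioMixInputParameters for the asset's audio track, and assign the mix to the player item. This is a sketch of the commonly cited pattern, not a fix for the HLS limitation the question describes (taps are simply not invoked for streaming items):

```objc
#import <AVFoundation/AVFoundation.h>
#import <MediaToolbox/MediaToolbox.h>

// Process callback: pull the source PCM, then modify bufferListInOut in place.
static void tapProcess(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                       MTAudioProcessingTapFlags flags,
                       AudioBufferList *bufferListInOut,
                       CMItemCount *numberFramesOut,
                       MTAudioProcessingTapFlags *flagsOut) {
    MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                       flagsOut, NULL, numberFramesOut);
    // ...apply DSP to bufferListInOut here...
}

- (void)installTapOnItem:(AVPlayerItem *)item track:(AVAssetTrack *)audioTrack {
    MTAudioProcessingTapCallbacks callbacks = {
        .version = kMTAudioProcessingTapCallbacksVersion_0,
        .process = tapProcess,
    };
    MTAudioProcessingTapRef tap = NULL;
    MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                               kMTAudioProcessingTapCreationFlag_PostEffects,
                               &tap);

    // Wire the tap into an audio mix and hand it to the player item.
    AVMutableAudioMixInputParameters *params =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
    params.audioTapProcessor = tap;

    AVMutableAudioMix *mix = [AVMutableAudioMix audioMix];
    mix.inputParameters = @[params];
    item.audioMix = mix;
    CFRelease(tap);
}
```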

Play AVMutableComposition with AVPlayer?

Submitted by 限于喜欢 on 2019-12-20 12:49:10
Question: I'm trying to get two videos to play sequentially. I've tried AVQueuePlayer, but there's a huge "burp" between the two clips, and I need them to play without interruption. So I'm trying to use AVMutableComposition with an AVPlayer, but I can't get it right. Here's my code (ignore the memory leaks, I'm just testing in an empty project..): composition = [[AVMutableComposition alloc] init]; NSString * path = [[NSBundle mainBundle] pathForResource:@"test" ofType:@"mp4"]; NSURL * url = [NSURL fileURLWithPath
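A sketch of the gapless pattern being attempted: insert both clips end-to-end into a single composition track, then play the composition as one item. File names `clip1`/`clip2` are placeholders; error handling is omitted for brevity:

```objc
#import <AVFoundation/AVFoundation.h>

// Splice two local clips into one composition track for seamless playback.
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];

CMTime cursor = kCMTimeZero;
for (NSString *name in @[@"clip1", @"clip2"]) {
    NSURL *url = [[NSBundle mainBundle] URLForResource:name withExtension:@"mp4"];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVAssetTrack *source =
        [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                        ofTrack:source
                         atTime:cursor
                          error:nil];
    cursor = CMTimeAdd(cursor, asset.duration);
}

// A single player item backed by the composition plays both clips without a gap.
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];
```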

AVAssetWriter How to write down-sampled/compressed m4a/mp3 files

Submitted by 筅森魡賤 on 2019-12-20 12:38:58
Question: I'm trying to take a local m4a or mp3 file and compress/down-sample it (for the purpose of making a smaller file). Originally I was using AVAssetExportSession to export an AVAsset to a temp directory, but I didn't have any control over compression/down-sampling (you can only use presets, and of those, only the .wav file format supports quality degradation). Then, following several examples here on SO, I tried using AVAssetReader/AVAssetWriter to perform this 'export'. I create my
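The control the question is after lives in the writer input's output settings, where sample rate, channel count, and bitrate can be set explicitly. A hedged sketch of that piece of the reader/writer pipeline; the specific rates here are illustrative, and `outputURL` is assumed to exist:

```objc
#import <AVFoundation/AVFoundation.h>

// Configure an AVAssetWriter input for down-sampled mono AAC output.
AudioChannelLayout layout = {0};
layout.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;

NSDictionary *settings = @{
    AVFormatIDKey:         @(kAudioFormatMPEG4AAC),
    AVSampleRateKey:       @22050.0,   // down-sample from e.g. 44.1 kHz
    AVNumberOfChannelsKey: @1,
    AVEncoderBitRateKey:   @32000,     // target bitrate, in bits per second
    AVChannelLayoutKey:    [NSData dataWithBytes:&layout length:sizeof(layout)],
};

AVAssetWriter *writer =
    [AVAssetWriter assetWriterWithURL:outputURL
                             fileType:AVFileTypeAppleM4A
                                error:nil];
AVAssetWriterInput *input =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:settings];
[writer addInput:input];
// Feed sample buffers from an AVAssetReaderTrackOutput configured with
// uncompressed (kAudioFormatLinearPCM) settings, then mark the input finished.
```

The reader side must decode to linear PCM; handing compressed buffers to an input configured with AAC settings is a common cause of failed writes.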

AVAssetExportSession giving me a green border on right and bottom of output video

Submitted by 血红的双手。 on 2019-12-20 12:15:24
Question: Here's the code: AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality]; exporter.outputURL = outputUrl; exporter.outputFileType = AVFileTypeQuickTimeMovie; exporter.videoComposition = mainComposition; exporter.shouldOptimizeForNetworkUse = YES; [exporter exportAsynchronouslyWithCompletionHandler:^{ //completion }]; I've tried different quality settings. I always get a 1-2 pixel border running down the right
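A frequently reported cause of this kind of green or black edge is a video composition `renderSize` with odd pixel dimensions, which forces the encoder to pad the frame. A hedged sketch of the usual mitigation, rounding the render size to even values (a multiple of 16 is safest); the `natural` size here is illustrative:

```objc
#import <AVFoundation/AVFoundation.h>

// Round the render size down to a multiple of 16 so the encoder
// does not pad the right/bottom edge with uninitialized (green) pixels.
CGSize natural = CGSizeMake(641.0, 361.0); // e.g. an odd-sized source track
CGSize rounded = CGSizeMake(floor(natural.width  / 16.0) * 16.0,
                            floor(natural.height / 16.0) * 16.0);
mainComposition.renderSize = rounded;
```

Scaling the layer instructions to fill the rounded size (rather than leaving a sliver uncovered) is also worth checking, since any region no instruction draws into shows up as the same colored border.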

Why won't AVFoundation link with my XCode 3.2.3 iPhone 4.0.1 project?

Submitted by 与世无争的帅哥 on 2019-12-20 11:11:13
Question: I'm following the reference at http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html to capture video from the iPhone camera. It's a fresh project aside from the code from that page, and I added the AVFoundation framework to the project as well. Here are the linker errors I get: Build my project of project my project with configuration Debug CompileC "build/my project.build/Debug-iphoneos/my project.build/Objects-normal/armv6/MainViewController.o" /Users/mwilliamson/Projects/my_project

How to capture image without displaying preview in iOS

Submitted by 和自甴很熟 on 2019-12-20 10:59:10
Question: I want to capture images at specific instants, for example when a button is pushed, but I don't want to show any video preview screen. I guess captureStillImageAsynchronouslyFromConnection is what I need to use for this scenario. Currently, I can capture an image if I show a video preview. However, if I remove the code that shows the preview, the app crashes with the following output: 2012-04-07 11:25:54.898 imCapWOPreview[748:707] *** Terminating app due to uncaught exception
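A preview layer is not actually required for still capture; what matters is obtaining the AVCaptureConnection from the still-image output itself rather than from preview-related code (passing a nil or invalid connection is a common cause of this kind of exception). A minimal sketch without any preview, with error handling omitted:

```objc
#import <AVFoundation/AVFoundation.h>

// Build a capture session with a still-image output and no preview layer.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *camera =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input =
    [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
[session addInput:input];

AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
[session addOutput:stillOutput];
[session startRunning];

// Look the connection up from the output — no preview needed.
AVCaptureConnection *connection =
    [stillOutput connectionWithMediaType:AVMediaTypeVideo];
[stillOutput captureStillImageAsynchronouslyFromConnection:connection
    completionHandler:^(CMSampleBufferRef buf, NSError *error) {
        if (buf) {
            NSData *jpeg =
                [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buf];
            // Save or process `jpeg` here.
        }
    }];
```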

Disable taking images in any orientation other than Portrait AVFoundation

Submitted by 给你一囗甜甜゛ on 2019-12-20 10:37:45
Question: I am using AVFoundation to show the camera. I would like to prevent the camera itself from rotating, so the viewer sees the camera only in portrait and images are taken only in portrait mode. I defined Supported Interface Orientations to support portrait only, and the view itself is displayed only in portrait mode, but the camera is not; it rotates with the device orientation. How can I force the AVFoundation camera to be displayed and to capture images only in portrait, like the
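Interface-orientation settings do not govern AVFoundation connections; the preview and capture connections carry their own `videoOrientation`. A sketch of pinning both to portrait, assuming a `previewLayer` (AVCaptureVideoPreviewLayer) and a `stillOutput` already exist — those names are placeholders:

```objc
#import <AVFoundation/AVFoundation.h>

// Pin the on-screen preview to portrait.
AVCaptureConnection *previewConnection = previewLayer.connection;
if (previewConnection.isVideoOrientationSupported) {
    previewConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
}

// Pin the capture connection too, so saved images are portrait as well.
AVCaptureConnection *captureConnection =
    [stillOutput connectionWithMediaType:AVMediaTypeVideo];
if (captureConnection.isVideoOrientationSupported) {
    captureConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
}
```

Setting the preview connection alone is a common pitfall: the preview then looks right while captured files still rotate with the device.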

How To Use AVCaptureStillImageOutput To Take Picture

Submitted by 痴心易碎 on 2019-12-20 10:34:38
Question: I have a preview layer that is pulling from the camera and working as it should. I would like to be able to take a picture when I press a button. I have initialized the AVCaptureStillImageOutput like this: AVCaptureStillImageOutput *avCaptureImg = [[AVCaptureStillImageOutput alloc] init]; Then I am trying to take a picture using this object: [avCaptureImg captureStillImageAsynchronouslyFromConnection:(AVCaptureConnection *) completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError
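The snippet in the question passes a bare cast where a real connection object is required. A sketch of the completed call, assuming the output has been added to a running `session` (a placeholder name here):

```objc
#import <AVFoundation/AVFoundation.h>

// The output must belong to the session before it has any connections.
[session addOutput:avCaptureImg];
AVCaptureConnection *videoConnection =
    [avCaptureImg connectionWithMediaType:AVMediaTypeVideo];

[avCaptureImg captureStillImageAsynchronouslyFromConnection:videoConnection
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer) {
            NSData *data = [AVCaptureStillImageOutput
                jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *photo = [UIImage imageWithData:data];
            // Use `photo`, e.g. UIImageWriteToSavedPhotosAlbum(photo, nil, nil, nil);
        }
    }];
```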