avfoundation

File Format for Saving Video with alpha channel in iOS

孤者浪人 submitted on 2019-12-24 12:42:09
Question: I am using AVFoundation to create a video and have added an effect that clips the video so there is a transparent background. What file format should I save this as to preserve the transparency in my iOS app?

Answer 1: AVAnimator is a library with which you can display video with an alpha channel on iOS; it is, however, not free to use in commercial products. I don't think it's natively possible.

Source: https://stackoverflow.com/questions/28258575/file-format-for-saving-video-with-alpha-channel-in-ios
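The answer predates iOS 13, which introduced native support for HEVC with an alpha channel. A minimal AVAssetWriter sketch, assuming an iOS 13+ deployment target; the dimensions and output URL are placeholder values, not the original poster's code:

```swift
import AVFoundation

// A minimal sketch, assuming iOS 13+ where AVVideoCodecType.hevcWithAlpha
// is available. Width, height, and the output URL are placeholder values.
func makeAlphaVideoWriter(outputURL: URL) throws -> (AVAssetWriter, AVAssetWriterInput) {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.hevcWithAlpha, // keeps the alpha channel
        AVVideoWidthKey: 1280,
        AVVideoHeightKey: 720
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    input.expectsMediaDataInRealTime = false
    writer.add(input)
    // The pixel buffers appended to this input must themselves carry
    // alpha (e.g. a BGRA pixel format) for transparency to survive.
    return (writer, input)
}
```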

Resizing CMSampleBufferRef provided by captureStillImageBracketAsynchronouslyFromConnection:withSettingsArray:completionHandler:

China☆狼群 submitted on 2019-12-24 11:35:46
Question: In the app I'm working on, we're capturing photos which need to have a 4:3 aspect ratio in order to maximize the field of view we capture. Up until now we were using the AVCaptureSessionPreset640x480 preset, but now we need a larger resolution. As far as I've figured out, the only other two 4:3 formats are 2592x1936 and 3264x2448. Since these are too large for our use case, I need a way to downsize them. I looked into a bunch of options but did not find a way (preferably without copying the …
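One common approach (a hedged sketch, not the accepted answer) is to downscale the sample buffer's pixel buffer with Core Image's Lanczos filter; the scale factor and the shared CIContext below are illustrative choices:

```swift
import AVFoundation
import CoreImage

// A hedged sketch of downscaling a captured sample buffer's pixel buffer
// with CILanczosScaleTransform. Reuse one CIContext; creating it per frame
// is expensive.
let ciContext = CIContext()

func downscaledImage(from sampleBuffer: CMSampleBuffer, scale: CGFloat) -> CGImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    let source = CIImage(cvPixelBuffer: pixelBuffer)
    let filter = CIFilter(name: "CILanczosScaleTransform")!
    filter.setValue(source, forKey: kCIInputImageKey)
    filter.setValue(scale, forKey: kCIInputScaleKey)        // e.g. 0.5 halves each dimension
    filter.setValue(1.0, forKey: kCIInputAspectRatioKey)    // keep the 4:3 ratio
    guard let output = filter.outputImage else { return nil }
    return ciContext.createCGImage(output, from: output.extent)
}
```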

AVFoundation Session issue

纵饮孤独 submitted on 2019-12-24 11:06:24
Question: I am working on an application that requires recording video, and I found a helper class written by appcoda; here is a link to the GitHub repo: https://github.com/appcoda/FullScreenCamera. The problem I am having with it is that whenever I run it, I get an error in the console saying noCamerasAvailable followed by captureSessionIsMissing. I am also committed to improving the code and I just cannot figure out why. Here is the helper class: class CameraHelper: NSObject { var …
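For reference, a minimal sketch of how such a helper typically discovers cameras (assuming iOS 10+); an empty result here is what usually drives an error like noCamerasAvailable:

```swift
import AVFoundation

// A minimal discovery sketch. If this returns an empty array, a helper
// like the repo's CameraHelper has no device to build its session from.
func findCameras() -> [AVCaptureDevice] {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInWideAngleCamera],
        mediaType: .video,
        position: .unspecified)
    return discovery.devices
}
```

Running on the Simulator, or a missing NSCameraUsageDescription entry in Info.plist, are common reasons the device list comes back empty.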

crashing on video capture with AVFoundation

旧街凉风 submitted on 2019-12-24 11:02:08
Question: I am trying to implement video capture in my app using AVFoundation. I have the following code under viewDidLoad: session = [[AVCaptureSession alloc] init]; movieFileOutput = [[AVCaptureMovieFileOutput alloc] init]; videoInputDevice = [[AVCaptureDeviceInput alloc] init]; AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable]; if (videoDevice) { NSError *error; videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error]; if (!error) { if ([session …
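For comparison, a hedged Swift sketch of the standard setup; note that an AVCaptureDeviceInput should come from deviceInputWithDevice:error: (or the throwing Swift initializer), never a bare alloc/init as in the question's third line:

```swift
import AVFoundation

// A hedged sketch of the standard capture setup. The front-camera lookup
// stands in for the question's frontFacingCameraIfAvailable helper.
func makeSession() -> AVCaptureSession? {
    let session = AVCaptureSession()
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .front),
          let input = try? AVCaptureDeviceInput(device: device), // never plain alloc/init
          session.canAddInput(input) else { return nil }
    session.addInput(input)
    let movieOutput = AVCaptureMovieFileOutput()
    if session.canAddOutput(movieOutput) {
        session.addOutput(movieOutput)
    }
    return session
}
```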

How to keep low latency during the preview of video coming from AVFoundation?

雨燕双飞 submitted on 2019-12-24 09:39:02
Question: Apple has a sample project called RosyWriter that shows how to capture video and apply effects to it. In this section of the code, in the outputPreviewPixelBuffer part, Apple supposedly shows how they keep preview latency low by dropping stale frames. - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection { CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription( …
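A hedged sketch of the two output settings most relevant to preview latency; RosyWriter's actual stale-frame tracking is more involved than this:

```swift
import AVFoundation

// A hedged sketch, not RosyWriter's code: the settings that most affect
// preview latency on an AVCaptureVideoDataOutput.
func addPreviewOutput(to session: AVCaptureSession,
                      delegate: AVCaptureVideoDataOutputSampleBufferDelegate) {
    let output = AVCaptureVideoDataOutput()
    // Drop frames that arrive while the delegate is still busy, instead of
    // queueing them and letting the preview fall behind the camera.
    output.alwaysDiscardsLateVideoFrames = true
    // A serial queue keeps frames in order without blocking the main thread.
    let queue = DispatchQueue(label: "video-data-output")
    output.setSampleBufferDelegate(delegate, queue: queue)
    if session.canAddOutput(output) {
        session.addOutput(output)
    }
}
```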

Swift AVAudioEngine: Changing the Audio Input Device for MacOS

拟墨画扇 submitted on 2019-12-24 08:08:15
Question: I'm trying to change the input device used to listen to incoming audio. I've tried a number of solutions, but most end up with the following error when preparing and starting the audio engine: AVAEInternal.h:82:_AVAE_CheckAndReturnErr: required condition is false: [AVAudioEngineGraph.mm:1295:Initialize: (IsFormatSampleRateAndChannelCountValid(outputHWFormat))] Current (simplified) code: var engine = AVAudioEngine() var inputDeviceID: AudioDeviceID = 41 // another audio input device let …
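One approach known to work on macOS (a hedged sketch, not the asker's solution) is to set the CoreAudio device on the input node's underlying audio unit before the engine starts; the IsFormatSampleRateAndChannelCountValid failure often means the graph was wired up while the hardware format was still invalid:

```swift
import AVFoundation
import CoreAudio
import AudioToolbox

// A hedged sketch: point the input node's underlying audio unit at a
// specific device before starting the engine. The deviceID is a placeholder;
// real code would enumerate devices via the CoreAudio HAL.
func setInput(deviceID: AudioDeviceID, on engine: AVAudioEngine) -> OSStatus {
    guard let audioUnit = engine.inputNode.audioUnit else {
        return kAudioUnitErr_Uninitialized
    }
    var device = deviceID
    return AudioUnitSetProperty(audioUnit,
                                kAudioOutputUnitProperty_CurrentDevice,
                                kAudioUnitScope_Global,
                                0, // element 0 for a global-scope property
                                &device,
                                UInt32(MemoryLayout<AudioDeviceID>.size))
}
```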

Playing a live TuneIn Radio URL iOS Swift

喜欢而已 submitted on 2019-12-24 08:07:23
Question: I am working on an app which is intended to play live radio by using TuneIn URLs. In TuneIn there is an HTTP GET request in the API which provides a JSON document with all the URLs and their bitrates. So http://opml.radiotime.com/Tune.ashx?id=s150147&formats=aac,mp3&render=json will return { "head": { "status": "200"}, "body": [ { "element" : "audio", "url": "http://player.absoluteradio.co.uk/tunein.php?i=a664.aac", "reliability": 95, "bitrate": 64, "media_type": "aac", "position": 0, "player_width": …
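A hedged sketch of one way to wire this up: decode just the stream URL from the Tune.ashx response and hand it to AVPlayer. The Codable shape below covers only the fields visible in the question's JSON excerpt:

```swift
import AVFoundation
import Foundation

// Matches only the fields shown in the excerpted JSON response.
struct TuneInBody: Decodable {
    let url: URL
    let bitrate: Int
    let media_type: String
}
struct TuneInResponse: Decodable {
    let body: [TuneInBody]
}

var player: AVPlayer? // kept alive outside the closure so playback continues

func playStation(from endpoint: URL) {
    URLSession.shared.dataTask(with: endpoint) { data, _, _ in
        guard let data = data,
              let response = try? JSONDecoder().decode(TuneInResponse.self, from: data),
              let stream = response.body.first else { return }
        DispatchQueue.main.async {
            player = AVPlayer(url: stream.url)
            player?.play()
        }
    }.resume()
}
```

Since the excerpted URLs are plain http, an App Transport Security exception would also be needed before playback can start.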

How do I find a scale between two different audio samples?

旧城冷巷雨未停 submitted on 2019-12-24 07:03:13
Question: I'm planning to make a universal application that analyses audio samples. By "universal" I mean that any technology (JavaScript, C, Java, etc.) can use it. Basically I made an application on iOS, using Apple's AVFoundation, that receives the microphone samples in real time with a buffer length of 512 (bufferSize = 512). In Python I did the same thing using PyAudio, but unfortunately I received very different values... Look at the samples: Samples with bufferSize = 512 on iOS: [0.0166742969, 0 …
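A hedged guess at the mismatch rather than a confirmed diagnosis: AVFoundation taps commonly deliver Float32 samples normalized to [-1.0, 1.0], while PyAudio's default paInt16 format yields raw 16-bit integers, so the two streams differ by a constant scale. A sketch of putting Int16 samples on the same scale:

```swift
// Convert raw 16-bit PCM samples to the [-1.0, 1.0] float range
// that AVFoundation taps typically report.
func normalize(int16Samples: [Int16]) -> [Float] {
    // 32768 = 2^15, the magnitude of the most negative Int16 value
    return int16Samples.map { Float($0) / 32768.0 }
}
```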

Toggle CAMERA flash on the push of a button; AVFoundation/Swift

扶醉桌前 submitted on 2019-12-24 06:47:09
Question: I'm using AVFoundation to create a custom camera. It needs to have a flash button that, on press, toggles the flash on and off. I am not asking about torch; there are plenty of answers for torch. Because much of the older AVFoundation capture API has been deprecated, most answers online don't work anymore. Also, I need to guard against the possibility that the user doesn't have a front flash. This is the code I have tried, with no luck: private func flashOn(device:AVCaptureDevice) { do{ if (device.hasFlash) { try device …
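A hedged sketch of the modern approach: with AVCapturePhotoOutput, flash is requested per capture through AVCapturePhotoSettings rather than by locking the device, and supportedFlashModes guards devices without a flash unit. The flashEnabled flag and function names below are illustrative:

```swift
import AVFoundation

// Illustrative state the flash button would toggle.
var flashEnabled = false

func capturePhoto(with output: AVCapturePhotoOutput,
                  delegate: AVCapturePhotoCaptureDelegate) {
    let settings = AVCapturePhotoSettings()
    let wanted: AVCaptureDevice.FlashMode = flashEnabled ? .on : .off
    // Fall back to .off when the current camera has no flash,
    // e.g. many front cameras.
    settings.flashMode = output.supportedFlashModes.contains(wanted) ? wanted : .off
    output.capturePhoto(with: settings, delegate: delegate)
}
```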