avfoundation

What Techniques Are Best To Live Stream iPhone Video Camera Data To a Computer?

有些话、适合烂在心里 submitted on 2019-12-03 00:12:24

Question: I would like to stream video from an iPhone camera to an app running on a Mac. Think sorta like video chat, but one-way only, from the device to a receiver app (it's not video chat). My basic understanding so far: you can use AVFoundation to get 'live' video camera data without saving to a file, but it's uncompressed data, so I'd have to handle compression on my own. There's no built-in AVCaptureOutput support for sending to a network location, so I'd have to work that bit out on my own.
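The two pieces the poster identifies (grabbing raw frames, then compressing and shipping them yourself) can be sketched like this; the compression and network transport are left as comments because AVFoundation really does provide neither, and the queue name and delegate wiring are assumptions for illustration:

```objc
#import <AVFoundation/AVFoundation.h>

- (void)startCapture {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [session addInput:input];

    // Delivers uncompressed frames to the delegate callback below.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("capture.queue", NULL)];
    [session addOutput:output];
    [session startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Each sampleBuffer holds one raw frame. From here you would feed it
    // to a hardware encoder (e.g. a VideoToolbox compression session) and
    // write the resulting compressed data to a socket the Mac app reads.
}
```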

AVAudioSession AVAudioSessionCategoryPlayAndRecord glitch

佐手、 submitted on 2019-12-02 23:52:35

I would like to record videos with audio using AVCaptureSession. For this I need the audio session category AVAudioSessionCategoryPlayAndRecord, since my app also plays back video with sound. I want audio to come out of the default speaker and to mix with other audio, so I need the options AVAudioSessionCategoryOptionDefaultToSpeaker | AVAudioSessionCategoryOptionMixWithOthers. If I do the following while other audio is playing, there is a clearly audible glitch in the other app's audio: [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord …
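A minimal sketch of the setup described above, using the single-call variant that sets category, mode and options together (available since iOS 10), which avoids some of the transient reconfigurations that can cause audible glitches when category and options are set in separate calls:

```objc
#import <AVFoundation/AVFoundation.h>

NSError *error = nil;
[[AVAudioSession sharedInstance]
    setCategory:AVAudioSessionCategoryPlayAndRecord
           mode:AVAudioSessionModeDefault
        options:AVAudioSessionCategoryOptionDefaultToSpeaker |
                AVAudioSessionCategoryOptionMixWithOthers
          error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];
// Check 'error' after each call; a failed category change is a common
// source of silent misbehavior later in the capture pipeline.
```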

How To Use AVCaptureStillImageOutput To Take Picture

爱⌒轻易说出口 submitted on 2019-12-02 23:22:17

I have a preview layer that is pulling from the camera and working as it should. I would like to be able to take a picture when I press a button. I have initialized the AVCaptureStillImageOutput like this: AVCaptureStillImageOutput *avCaptureImg = [[AVCaptureStillImageOutput alloc] init]; Then I am trying to take a picture using this object: [avCaptureImg captureStillImageAsynchronouslyFromConnection:(AVCaptureConnection *) completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) { }]; I need help on how to take a picture and save it in a variable. Thanks. vfxdrummer: You need to …
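The asker's call is missing the actual connection argument and the body of the completion handler. A completed sketch (assuming `avCaptureImg` has already been added to a running session, as in the question):

```objc
AVCaptureConnection *connection =
    [avCaptureImg connectionWithMediaType:AVMediaTypeVideo];
[avCaptureImg captureStillImageAsynchronouslyFromConnection:connection
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer,
                        NSError *error) {
        if (error || imageDataSampleBuffer == NULL) {
            return; // capture failed
        }
        // Convert the still-image sample buffer to JPEG data, then UIImage.
        NSData *jpegData = [AVCaptureStillImageOutput
            jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        // 'image' now holds the captured picture; assign it to a property
        // or write it to the photo library here.
    }];
```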

How to generate an UIImage from AVCapturePhoto with correct orientation?

回眸只為那壹抹淺笑 submitted on 2019-12-02 23:03:07

I am calling AVFoundation's delegate method to handle a photo capture, but I am having difficulty converting the AVCapturePhoto it generates into a UIImage with the correct orientation. Although the routine below succeeds, I always get a right-oriented UIImage (UIImage.imageOrientation = 3). I have no way of providing an orientation when using UIImage(data: image), and attempting to first use photo.cgImageRepresentation()?.takeRetainedValue() also doesn't help. Please assist. Image orientation is critical here, as the resulting image is being fed to a Vision framework workflow. func …
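One common fix, sketched here in Objective-C: read the EXIF orientation that AVCapturePhoto stores in its metadata and map it to the matching UIImageOrientation when wrapping the CGImage, instead of relying on the default. The switch below covers only the four non-mirrored cases; the mirrored ones follow the same pattern:

```objc
- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhoto:(AVCapturePhoto *)photo
                error:(NSError *)error {
    CGImageRef cgImage = [photo CGImageRepresentation];
    NSNumber *exif = photo.metadata[(NSString *)kCGImagePropertyOrientation];

    UIImageOrientation orientation;
    switch ((CGImagePropertyOrientation)exif.intValue) {
        case kCGImagePropertyOrientationUp:    orientation = UIImageOrientationUp;    break;
        case kCGImagePropertyOrientationDown:  orientation = UIImageOrientationDown;  break;
        case kCGImagePropertyOrientationLeft:  orientation = UIImageOrientationLeft;  break;
        case kCGImagePropertyOrientationRight: orientation = UIImageOrientationRight; break;
        default:                               orientation = UIImageOrientationUp;    break;
    }

    UIImage *image = [UIImage imageWithCGImage:cgImage
                                         scale:1.0
                                   orientation:orientation];
    // 'image' now carries the correct orientation for the Vision workflow.
}
```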

Recording, modifying and playing audio on iOS

故事扮演 submitted on 2019-12-02 21:41:05

Question: EDIT: In the end I did exactly what I describe below: AVRecorder for recording the speech, and OpenAL for the pitch shift and playback. It worked out quite well. I have a question about recording, modifying and playing back audio. I asked a similar question before (Record, modify pitch and play back audio in real time on iOS), but I now have more information and could use some further advice. Firstly, this is what I am trying to do (on a separate thread from the main thread): …
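The recording half of the poster's final approach can be sketched with AVAudioRecorder writing uncompressed PCM to a temporary file, which the OpenAL pitch-shift stage would then read back. The filename and settings here are illustrative assumptions:

```objc
#import <AVFoundation/AVFoundation.h>

NSURL *url = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"speech.caf"]];

// Linear PCM keeps the samples easy to hand to OpenAL afterwards.
NSDictionary *settings = @{
    AVFormatIDKey:         @(kAudioFormatLinearPCM),
    AVSampleRateKey:       @44100.0,
    AVNumberOfChannelsKey: @1
};

NSError *error = nil;
AVAudioRecorder *recorder =
    [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
[recorder prepareToRecord];
[recorder record];
// ... later: [recorder stop]; then load 'url' into an OpenAL buffer and
// adjust AL_PITCH on the source to shift the pitch during playback.
```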

How do I convert the live video feed from the iPhone camera to grayscale?

萝らか妹 submitted on 2019-12-02 21:29:54

How would I take the live frames from the iPhone camera, convert them to grayscale, and then display them on the screen in my application? To expand on what Tommy said, you'll want to use AVFoundation in iOS 4.0 to capture the live camera frames. However, I'd recommend using OpenGL directly for the image processing, because you won't be able to achieve realtime results on current hardware otherwise. For OpenGL ES 1.1 devices, I'd look at using Apple's GLImageProcessing sample application as a base (it has an OpenGL greyscale filter in it) and running your live video frames through that.
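One shortcut worth noting alongside the OpenGL route: if you ask AVCaptureVideoDataOutput for a biplanar YUV pixel format, plane 0 of each frame already is the luminance (grayscale) image, with no per-pixel math needed. A sketch of reading it in the sample-buffer delegate (assuming `sampleBuffer` is the callback parameter):

```objc
// When configuring the output:
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey:
    @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) };

// In the delegate callback:
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

uint8_t *yPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
size_t width    = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
size_t height   = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
// yPlane now points at a width x height 8-bit grayscale image that can be
// uploaded as a luminance texture and drawn with OpenGL.

CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
```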

Exporting videos on iOS: understanding and setting frame duration property?

杀马特。学长 韩版系。学妹 submitted on 2019-12-02 21:26:22

Question: In this tutorial on merging videos, the author sets the frame duration for the exported video to 30 FPS. 1) Instead of fixing the frame duration at 30 FPS, shouldn't it be tied to the frame duration of the videos being merged? 2) When exporting, what are the pros and cons of using a frame rate for the exported video that differs from the source video(s)? Is this one way of speeding up export time at the expense of video quality? For instance, what if the source videos …
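The first question suggests its own answer: derive `frameDuration` from the source track rather than hard-coding 30 FPS. A sketch, where `asset` stands in for one of the assets being merged and the 30 FPS fallback covers tracks that report no frame rate:

```objc
AVAssetTrack *videoTrack =
    [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];

AVMutableVideoComposition *composition =
    [AVMutableVideoComposition videoComposition];

// nominalFrameRate is the track's own frame rate; fall back if unknown.
float fps = videoTrack.nominalFrameRate > 0 ? videoTrack.nominalFrameRate
                                            : 30.0f;
composition.frameDuration = CMTimeMake(1, (int32_t)roundf(fps));
composition.renderSize = videoTrack.naturalSize;
```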

AVFoundation: add text to the CMSampleBufferRef video frame

本小妞迷上赌 submitted on 2019-12-02 21:17:55

I'm building an app using AVFoundation. Just before I call [assetWriterInput appendSampleBuffer:sampleBuffer] in the - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection method, I manipulate the pixels in the sample buffer (using a pixel buffer to apply an effect). The client also wants me to draw text (a timestamp and frame counter) onto the frames, but I haven't found a way to do this yet. I tried converting the sample buffer to an image, drawing text on the image, and converting the image back …
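One way to avoid the round trip through UIImage entirely (a sketch, assuming BGRA frames and a `frameCount` counter the app maintains): lock the frame's CVPixelBuffer, wrap its memory in a CGBitmapContext, and draw the string straight into the pixels before appending the same buffer to the writer input:

```objc
CVPixelBufferRef pb = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pb, 0);

CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(
    CVPixelBufferGetBaseAddress(pb),
    CVPixelBufferGetWidth(pb), CVPixelBufferGetHeight(pb),
    8, CVPixelBufferGetBytesPerRow(pb), cs,
    kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

// Draw the overlay text directly into the frame's pixels. Note that
// UIKit and CoreGraphics use flipped coordinate systems, so you may need
// a CGContextTranslateCTM/CGContextScaleCTM flip before drawing.
NSString *label = [NSString stringWithFormat:@"frame %ld", (long)frameCount];
UIGraphicsPushContext(ctx);
[label drawAtPoint:CGPointMake(20, 20)
    withAttributes:@{ NSForegroundColorAttributeName: [UIColor whiteColor],
                      NSFontAttributeName: [UIFont systemFontOfSize:24] }];
UIGraphicsPopContext();

CGContextRelease(ctx);
CGColorSpaceRelease(cs);
CVPixelBufferUnlockBaseAddress(pb, 0);
// sampleBuffer still wraps the same (now modified) pixels, so appending
// it to the asset writer input picks up the text with no extra copies.
```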

AVCaptureVideoPreviewLayer smooth orientation rotation

强颜欢笑 submitted on 2019-12-02 20:50:45

I'm trying to disable any discernible orientation rotation of an AVCaptureVideoPreviewLayer while still maintaining rotation for any subviews. AVCaptureVideoPreviewLayer does have an orientation property, and changing it does allow the layer to display properly in any orientation. However, the change involves a funky rotation of the AVCaptureVideoPreviewLayer, rather than staying smooth as it does in the Camera app. This is how I've gotten orientation to work properly, minus the hitch in the rotation: - (void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation …
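A sketch of one approach to the hitch: update the preview connection's orientation inside a CATransaction with implicit actions disabled, so the layer snaps to the new orientation instead of animating through the funky rotation. `self.previewLayer` is an assumed property, and the raw-value cast from UIInterfaceOrientation to AVCaptureVideoOrientation relies on their matching portrait/landscape values:

```objc
- (void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromOrientation {
    [CATransaction begin];
    [CATransaction setDisableActions:YES]; // suppress the implicit animation

    self.previewLayer.connection.videoOrientation = (AVCaptureVideoOrientation)
        [[UIApplication sharedApplication] statusBarOrientation];
    self.previewLayer.frame = self.view.bounds;

    [CATransaction commit];
}
```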

Consecutive calls to startRecordingToOutputFileURL:

爷，独闯天下 submitted on 2019-12-02 20:33:57

The Apple docs seem to indicate that while recording video to a file, the app can change the URL on the fly with no problem. But I'm seeing a problem. When I try this, the recording delegate gets called with an error: The operation couldn’t be completed. (OSStatus error -12780.) Info dictionary is: { AVErrorRecordingSuccessfullyFinishedKey = 0; } (the funky single quote in "couldn't" comes from logging [error localizedDescription]). Here's the code, which is basically a tweak of the WWDC10 AVCam sample: 1) Start recording, and start a timer to change the output URL every few seconds. - (void) …
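One workaround sketch for this kind of -12780 failure: rather than switching URLs mid-recording, stop the current file and start the next segment from the file-output delegate callback, so segments never overlap. `movieOutput`, `nextURL` and `newSegmentURL` are hypothetical properties/helpers for illustration:

```objc
- (void)switchSegment {
    self.nextURL = [self newSegmentURL];  // hypothetical helper
    [self.movieOutput stopRecording];     // triggers the delegate below
}

- (void)captureOutput:(AVCaptureFileOutput *)output
didFinishRecordingToOutputFileAtURL:(NSURL *)fileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error {
    // The previous segment is fully closed here; only now is it safe
    // to begin recording to the next file.
    if (self.nextURL) {
        [self.movieOutput startRecordingToOutputFileURL:self.nextURL
                                      recordingDelegate:self];
        self.nextURL = nil;
    }
}
```

The gap between segments is small but nonzero; truly seamless segmenting generally requires writing frames yourself with AVAssetWriter instead of AVCaptureMovieFileOutput.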