avfoundation

AVFoundation - Retiming CMSampleBufferRef Video Output

笑着哭i submitted on 2019-12-02 14:58:34
First time asking a question here. I hope the post is clear and the sample code is formatted correctly. I'm experimenting with AVFoundation and time-lapse photography. My intent is to grab every Nth frame from the video camera of an iOS device (my 4th-generation iPod touch) and write each of those frames out to a file to create a time lapse. I'm using AVCaptureVideoDataOutput, AVAssetWriter and AVAssetWriterInput. The problem is, if I use the CMSampleBufferRef passed to captureOutput:didOutputSampleBuffer:fromConnection:, the playback of each frame is the length of time between original input
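One common approach to this is to copy each kept buffer with new timing before appending it to the AVAssetWriterInput. A minimal Swift sketch, where `frameIndex` and `timelapseFPS` are illustrative names and not from the post:

```swift
import AVFoundation
import CoreMedia

// Re-stamp every kept frame with a fixed duration so the written
// time lapse plays back at a steady rate, regardless of the real
// interval between captured frames.
func retimed(_ sampleBuffer: CMSampleBuffer,
             frameIndex: Int64,
             timelapseFPS: Int32 = 30) -> CMSampleBuffer? {
    var timing = CMSampleTimingInfo(
        duration: CMTime(value: 1, timescale: timelapseFPS),
        presentationTimeStamp: CMTime(value: frameIndex, timescale: timelapseFPS),
        decodeTimeStamp: .invalid)
    var copy: CMSampleBuffer?
    let status = CMSampleBufferCreateCopyWithNewTiming(
        allocator: kCFAllocatorDefault,
        sampleBuffer: sampleBuffer,
        sampleTimingEntryCount: 1,
        sampleTimingArray: &timing,
        sampleBufferOut: &copy)
    return status == noErr ? copy : nil
}
```

Appending the retimed copy rather than the original buffer makes each frame occupy 1/timelapseFPS of a second in the output instead of the real capture interval.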

How to record a video with AVFoundation in Swift?

落爺英雄遲暮 submitted on 2019-12-02 14:55:58
I am trying to figure out how to record a video using AVFoundation in Swift. I have got as far as creating a custom camera, but I only figured out how to take still pictures with it, and I can't figure out how to record video. From what I understand, you have to use AVCaptureVideoDataOutput to get the data from the recording, but I can't figure out how to start the recording and implement the delegate methods. The whole AVFoundation Programming Guide / Still and Video Media Capture section is in Objective-C and I can't seem to decipher it. Here's my attempt to accomplish this task: First I set up the
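For straightforward recording, AVCaptureMovieFileOutput is usually simpler than AVCaptureVideoDataOutput, since it handles encoding and file writing itself. A minimal Swift sketch under that assumption; class and property names are illustrative:

```swift
import AVFoundation

// A minimal movie-recording sketch using AVCaptureMovieFileOutput,
// which encodes and writes the file for you.
final class Recorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    let session = AVCaptureSession()
    let movieOutput = AVCaptureMovieFileOutput()

    func configure() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(movieOutput) { session.addOutput(movieOutput) }
        session.startRunning()
    }

    func start(to url: URL) {
        movieOutput.startRecording(to: url, recordingDelegate: self)
    }

    func stop() { movieOutput.stopRecording() }

    // Called once the movie file has been finalized on disk.
    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        print("Saved to \(outputFileURL), error: \(String(describing: error))")
    }
}
```

AVCaptureVideoDataOutput is only needed when you want to touch the raw frames (filters, custom encoding); for plain recording, the file output avoids implementing the sample-buffer delegate entirely.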

AVFoundation tap to focus feedback rectangle

六月ゝ 毕业季﹏ submitted on 2019-12-02 14:32:10
I am developing an iPhone application where I directly use AVFoundation to capture video via the camera. I've implemented a feature to enable the tap-to-focus function for a user: - (void)focus:(CGPoint)aPoint { #if HAS_AVFF Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice"); if (captureDeviceClass != nil) { AVCaptureDevice *device = [captureDeviceClass defaultDeviceWithMediaType:AVMediaTypeVideo]; if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) { CGRect screenRect = [[UIScreen mainScreen] bounds]; double screenWidth =
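For comparison, a hedged Swift sketch of the same idea plus the feedback rectangle; `previewLayer` is assumed to be the app's AVCaptureVideoPreviewLayer:

```swift
import AVFoundation
import UIKit

// A sketch of tap-to-focus with a brief feedback rectangle.
func focus(at tapPoint: CGPoint, in view: UIView,
           previewLayer: AVCaptureVideoPreviewLayer,
           device: AVCaptureDevice) {
    // Convert the tap from view coordinates into the device's 0...1 space.
    let devicePoint = previewLayer.captureDevicePointConverted(fromLayerPoint: tapPoint)
    do {
        try device.lockForConfiguration()
        if device.isFocusPointOfInterestSupported,
           device.isFocusModeSupported(.autoFocus) {
            device.focusPointOfInterest = devicePoint
            device.focusMode = .autoFocus
        }
        device.unlockForConfiguration()
    } catch { return }

    // Show a short-lived rectangle where the user tapped.
    let box = UIView(frame: CGRect(x: 0, y: 0, width: 80, height: 80))
    box.center = tapPoint
    box.layer.borderColor = UIColor.yellow.cgColor
    box.layer.borderWidth = 1
    view.addSubview(box)
    UIView.animate(withDuration: 0.8, animations: { box.alpha = 0 }) { _ in
        box.removeFromSuperview()
    }
}
```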

iOS endless video recording

自作多情 submitted on 2019-12-02 14:15:35
I'm trying to develop an iPhone app that will use the camera to record only the last few minutes/seconds. For example, you record a movie for 5 minutes, tap "save", and only the last 30 s are saved. I don't want to actually record five minutes and then trim off everything but the last 30 s (that won't work for me). This idea is called "loop recording". It results in an endless video recording, but you keep only the last part. The Precorder app does what I want to do (I want to use this feature in another context). I think this should be easy to simulate with a circular buffer. I started a project with AVFoundation. It
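The circular-buffer idea can be sketched generically; the Swift snippet below only shows the "keep the last N items" mechanics, with illustrative names:

```swift
// A minimal generic ring buffer: appending past capacity silently
// overwrites the oldest entry, so only the last N items survive.
struct RingBuffer<Element> {
    private var storage: [Element?]
    private var head = 0
    private(set) var count = 0

    init(capacity: Int) { storage = Array(repeating: nil, count: capacity) }

    mutating func append(_ element: Element) {
        storage[head] = element
        head = (head + 1) % storage.count
        count = min(count + 1, storage.count)
    }

    // Oldest-to-newest snapshot of what is currently retained.
    var elements: [Element] {
        let start = count == storage.count ? head : 0
        return (0..<count).compactMap { storage[(start + $0) % storage.count] }
    }
}
```

One caveat: the CMSampleBuffers delivered by AVCaptureVideoDataOutput come from a small reusable pool, so holding many of them stalls the capture pipeline. In practice you either deep-copy the pixel data into the ring or record short rolling file segments and keep only the last few.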

How can I do fast image processing from the iPhone camera?

谁都会走 submitted on 2019-12-02 14:12:54
I am trying to write an iPhone application which will do some real-time camera image processing. I used the example presented in the AVFoundation docs as a starting point: setting up a capture session, making a UIImage from the sample buffer data, then drawing the image at a point via -setNeedsDisplay, which I call on the main thread. This works, but it is fairly slow (50 ms per frame, measured between -drawRect: calls, for a 192 x 144 preset) and I've seen applications on the App Store which work faster than this. About half of my time is spent in -setNeedsDisplay. How can I speed up this image
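Much of that 50 ms is typically the CVPixelBuffer → UIImage → Core Graphics round-trip. A hedged sketch of the usual alternative, keeping frames as CIImage so the filter can run on the GPU ("CISepiaTone" is just a stand-in filter):

```swift
import AVFoundation
import CoreImage

// A sketch of the usual speed-up: skip UIImage entirely and keep each
// frame as a CVPixelBuffer/CIImage so the work can stay on the GPU.
final class FrameProcessor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Wrap the camera buffer without copying and apply a GPU-backed filter.
        let filtered = CIImage(cvPixelBuffer: pixelBuffer)
            .applyingFilter("CISepiaTone", parameters: [kCIInputIntensityKey: 0.8])
        // Display `filtered` via a Metal-backed view (or render it with a
        // shared CIContext) instead of building a UIImage and calling
        // setNeedsDisplay on the main thread, which is where the time goes.
        _ = filtered
    }
}
```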

How to get bytes from a CMSampleBufferRef to send over the network

家住魔仙堡 submitted on 2019-12-02 14:10:44
I am capturing video using the AVFoundation framework, with the help of the Apple documentation: http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/03_MediaCapture.html%23//apple_ref/doc/uid/TP40010188-CH5-SW2 I did the following things: 1. Created a videoCaptureDevice. 2. Created an AVCaptureDeviceInput and set the videoCaptureDevice. 3. Created an AVCaptureVideoDataOutput and implemented its delegate. 4. Created an AVCaptureSession, set the input to the AVCaptureDeviceInput and the output to the AVCaptureVideoDataOutput. 5. In the AVCaptureVideoDataOutput delegate method -(void)captureOutput:
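Inside that delegate method, the raw bytes can be copied out of the buffer before being handed to a socket. A minimal Swift sketch; note that bi-planar YCbCr formats need per-plane handling, so this assumes a packed format such as BGRA:

```swift
import CoreMedia
import CoreVideo

// Copy the raw pixel bytes out of a video sample buffer so they can be
// sent over the network; compression is a separate concern.
func bytes(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }
    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
    let length = CVPixelBufferGetBytesPerRow(pixelBuffer)
               * CVPixelBufferGetHeight(pixelBuffer)
    return Data(bytes: base, count: length)
}
```

Sending uncompressed frames like this is bandwidth-heavy; most real apps encode first (see the VideoToolbox sketch further below).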

What Techniques Are Best To Live Stream iPhone Video Camera Data To a Computer?

耗尽温柔 submitted on 2019-12-02 13:57:07
I would like to stream video from an iPhone camera to an app running on a Mac. Think of it sort of like video chat, but one way only, from the device to a receiver app (and it's not video chat). My basic understanding so far: you can use AVFoundation to get 'live' video camera data without saving to a file, but it is uncompressed data, so I'd have to handle compression on my own. There's no built-in AVCaptureOutput support for sending to a network location; I'd have to work this bit out on my own. Am I right about the above, or am I already off track? Apple Tech Q&A 1702 provides some info on saving
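That understanding matches how most such apps are built: the compression half is usually handled with VideoToolbox, and the transport is your own. A hedged sketch of creating an H.264 encoder session; you then feed frames with VTCompressionSessionEncodeFrame and packetize the output yourself:

```swift
import VideoToolbox
import CoreMedia

// A sketch of an H.264 encoder session; packetizing the compressed output
// and the network transport are still entirely up to you.
func makeEncoder(width: Int32, height: Int32) -> VTCompressionSession? {
    var session: VTCompressionSession?
    let status = VTCompressionSessionCreate(
        allocator: nil,
        width: width,
        height: height,
        codecType: kCMVideoCodecType_H264,
        encoderSpecification: nil,
        imageBufferAttributes: nil,
        compressedDataAllocator: nil,
        outputCallback: { _, _, status, _, sampleBuffer in
            // Compressed NAL units arrive here; extract and send them.
            guard status == noErr, let sb = sampleBuffer else { return }
            _ = sb
        },
        refcon: nil,
        compressionSessionOut: &session)
    return status == noErr ? session : nil
}
```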

AVPlayer and MPMoviePlayerController differences [closed]

痞子三分冷 submitted on 2019-12-02 13:52:33
I am developing an iPhone application that needs to play videos. So far, I have learned that there are at least two APIs for achieving this: AVPlayer and MPMoviePlayerController. What are the main differences? Note: as of iOS 9, Apple has deprecated MPMoviePlayerController: "The MPMoviePlayerController class is formally deprecated in iOS 9. (The MPMoviePlayerViewController class is also formally deprecated.) To play video content in iOS 9 and later, instead use the AVPictureInPictureController or AVPlayerViewController class from the AVKit framework, or the WKWebView class from WebKit."
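A minimal Swift sketch of the recommended replacement, AVPlayerViewController from AVKit:

```swift
import AVKit
import AVFoundation
import UIKit

// Present a video full-screen with the system player UI, the modern
// replacement for MPMoviePlayerController.
func playVideo(at url: URL, from presenter: UIViewController) {
    let playerVC = AVPlayerViewController()
    playerVC.player = AVPlayer(url: url)
    presenter.present(playerVC, animated: true) {
        playerVC.player?.play()
    }
}
```

AVPlayer on its own gives you a bare playback engine (you supply the UI via AVPlayerLayer), which is the main practical difference from the old all-in-one MPMoviePlayerController.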

AVAudioPlayer plays audio through the phone-call earpiece speaker

怎甘沉沦 submitted on 2019-12-02 12:53:40
I have simple code using AVAudioPlayer to play a .caf audio file: AppDelegate.h: AVAudioPlayer *_audioPlayer; AppDelegate.m: - (void)playAudio { NSArray *dirPaths; NSString *docsDir; dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); docsDir = dirPaths[0]; NSString *soundFilePath = [docsDir stringByAppendingPathComponent:audiosrc]; NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath]; _audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:soundFileURL error:nil]; _audioPlayer.volume = 1.0; [_audioPlayer play]; } The audio plays very well, but it
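The excerpt cuts off before stating the problem, but the title suggests the sound comes out of the quiet earpiece (receiver) used for phone calls rather than the main speaker. If so, the usual fix is to configure the shared AVAudioSession before playing; a hedged Swift sketch:

```swift
import AVFoundation

// Route playback to the main speaker by setting the session category
// before the player starts; .playback avoids the earpiece routing that
// call-oriented categories can cause.
func configurePlaybackRoute() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .default)
        try session.setActive(true)
    } catch {
        print("Audio session error: \(error)")
    }
}
```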

Setting Slider Value to Set SeekToTime in AVPlayer

最后都变了- submitted on 2019-12-02 11:58:29
I am using the Player library, which is built on AVPlayer & AVFoundation and is quite convenient for my case. I successfully managed to play the video and add a slider. I set the slider's min to 0 and its max to the duration of the video. At this point, in order to connect the slider to the current play time, I used this answer on Stack Overflow: I set up a protocol and used addPeriodicTimeObserverForInterval, so the slider is linked to currentTime and moves along successfully as the video plays. Now, here is my
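For the seeking half, the usual pattern is the mirror image of the periodic observer. A hedged Swift sketch of the two-way binding; the class name and observer interval are illustrative:

```swift
import AVFoundation
import UIKit

// Two-way slider/player binding: a periodic observer drives the slider,
// and slider changes drive seek(to:).
final class PlayerScrubber: NSObject {
    let player: AVPlayer
    let slider: UISlider
    private var timeObserver: Any?

    init(player: AVPlayer, slider: UISlider) {
        self.player = player
        self.slider = slider
        super.init()
        slider.minimumValue = 0
        if let duration = player.currentItem?.duration, duration.isNumeric {
            slider.maximumValue = Float(duration.seconds)
        }
        // Move the slider as playback advances (every 0.5 s on the main queue).
        timeObserver = player.addPeriodicTimeObserver(
            forInterval: CMTime(seconds: 0.5, preferredTimescale: 600),
            queue: .main) { [weak self] time in
                self?.slider.value = Float(time.seconds)
        }
        slider.addTarget(self, action: #selector(sliderMoved), for: .valueChanged)
    }

    @objc private func sliderMoved() {
        let target = CMTime(seconds: Double(slider.value), preferredTimescale: 600)
        // Zero tolerance makes scrubbing land exactly where the slider is.
        player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero)
    }

    deinit {
        if let observer = timeObserver { player.removeTimeObserver(observer) }
    }
}
```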