AVFoundation

How fast can an iPhone be programmed to take 2 pictures at once? [closed]

只愿长相守 submitted on 2019-12-04 20:47:52
It's difficult to tell what is being asked here. This question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form. For help clarifying this question so that it can be reopened, visit the help center. Closed 7 years ago.

I wish to write an iPhone app that lets you take 2 consecutive pictures in a very short time, and I wonder if that is achievable. Many apps on the market seem to only take low-resolution still frames out of the video stream, so I wonder whether fast capture of full-resolution photos is feasible.

Felix: It depends on what…
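
A minimal sketch of back-to-back full-resolution capture with the AVCaptureStillImageOutput API of that era, assuming a session is already configured with a stillImageOutput attached; the second shot is triggered from the first one's completion handler, so the gap is roughly the output's processing latency:

AVCaptureConnection *connection =
    [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
    completionHandler:^(CMSampleBufferRef buffer1, NSError *error1) {
        NSData *first = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buffer1];
        // Fire the second capture as soon as the first returns.
        [stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
            completionHandler:^(CMSampleBufferRef buffer2, NSError *error2) {
                NSData *second = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buffer2];
                // Handle both full-resolution JPEGs here.
            }];
    }];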

How to get a video frame from AVPlayer?

微笑、不失礼 submitted on 2019-12-04 19:26:51
Question: I have a PlayerView class for displaying AVPlayer's playback. Code from the documentation:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface PlayerView : UIView
@property (nonatomic) AVPlayer *player;
@end

@implementation PlayerView
+ (Class)layerClass {
    return [AVPlayerLayer class];
}
- (AVPlayer *)player {
    return [(AVPlayerLayer *)[self layer] player];
}
- (void)setPlayer:(AVPlayer *)player {
    [(AVPlayerLayer *)[self layer] setPlayer:player];
}
@end

I set up my AVPlayer…
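
A common way to grab the current frame, sketched below with AVPlayerItemVideoOutput (available since iOS 6): the output is attached to the player item and vends the frame as a CVPixelBufferRef. The name playerItem is a placeholder for the item already driving the player:

AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc]
    initWithPixelBufferAttributes:@{
        (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
    }];
[playerItem addOutput:output];

CMTime now = playerItem.currentTime;
if ([output hasNewPixelBufferForItemTime:now]) {
    CVPixelBufferRef pixelBuffer =
        [output copyPixelBufferForItemTime:now itemTimeForDisplay:NULL];
    // Convert the pixel buffer to a CIImage/UIImage as needed, then release it.
    CVBufferRelease(pixelBuffer);
}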

How to extract motion vectors from H.264 AVC CMBlockBufferRef after VTCompressionSessionEncodeFrame

隐身守侯 submitted on 2019-12-04 19:20:31
I'm trying to read and understand the CMBlockBufferRef representation of an H.264 AVC 1/30s frame. The buffer and the encapsulating CMSampleBufferRef are created using a VTCompressionSessionRef: https://gist.github.com/petershine/de5e3d8487f4cfca0a1d The H.264 data is represented as an AVC memory buffer, a CMBlockBufferRef, from the compressed sample. Without fully decompressing again, I'm trying to extract motion vectors or predictions from this CMBlockBufferRef. I believe that for the fastest performance, byte-by-byte reading from the data buffer using CMBlockBufferGetDataPointer() should be necessary. However…
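
For the byte-level reading step, a minimal sketch of walking the AVC payload with CMBlockBufferGetDataPointer(), assuming blockBuffer was obtained from the compressed CMSampleBufferRef; note that the NAL units inside are entropy-coded, so motion vectors are not directly readable without at least partial decoding:

size_t lengthAtOffset = 0, totalLength = 0;
char *dataPointer = NULL;
OSStatus status = CMBlockBufferGetDataPointer(blockBuffer, 0,
                                              &lengthAtOffset,
                                              &totalLength,
                                              &dataPointer);
if (status == kCMBlockBufferNoErr) {
    // AVC sample data is length-prefixed NAL units, not Annex B start codes.
    size_t offset = 0;
    while (offset + 4 <= totalLength) {
        uint32_t nalLength = CFSwapInt32BigToHost(*(uint32_t *)(dataPointer + offset));
        uint8_t nalType = dataPointer[offset + 4] & 0x1F;
        NSLog(@"NAL unit type %u, length %u", nalType, (unsigned)nalLength);
        offset += 4 + nalLength;
    }
}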

How to detect earpiece (speaker) availability on iOS devices?

穿精又带淫゛_ submitted on 2019-12-04 19:15:22
I want to detect, especially on iPads, whether an earpiece is available or not. For example, I can detect whether an iOS device has a torch (hasTorch) using AVFoundation, so is there a similar way to detect earpiece availability? 1) If you want to check whether an earpiece (receiver speaker) is available on the device, you can identify this simply by checking whether the device is an iPhone: UIDevice.current.userInterfaceIdiom == .phone. In iOS, the port type AVAudioSessionPortBuiltInReceiver stands for the built-in receiver speaker, and according to Apple's documentation it is available only on iPhone devices. So there is no need to…
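
Sketched in Objective-C below, under the assumption stated in the answer above: the built-in receiver exists only on iPhone-idiom devices, and AVAudioSessionPortBuiltInReceiver will only appear in the current route when it is the active output:

// Idiom check: the receiver speaker ships only on iPhones.
BOOL isPhone = ([UIDevice currentDevice].userInterfaceIdiom == UIUserInterfaceIdiomPhone);

// Route check: true only while the receiver is the active output.
BOOL receiverActive = NO;
AVAudioSessionRouteDescription *route = [[AVAudioSession sharedInstance] currentRoute];
for (AVAudioSessionPortDescription *port in route.outputs) {
    if ([port.portType isEqualToString:AVAudioSessionPortBuiltInReceiver]) {
        receiverActive = YES;
    }
}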

BGRA on iPhone glTexImage2D and glReadPixels

半腔热情 submitted on 2019-12-04 19:02:51
Looking at the docs, I should be able to use BGRA for the internal format of a texture. I am supplying the texture with BGRA data (using GL_RGBA8_OES for glRenderbufferStorage, as it seems BGRA is not allowed there). However, the following does not work:

glTexImage2D(GL_TEXTURE_2D, 0, GL_BGRA, w, h, 0, GL_BGRA, GL_UNSIGNED_BYTE, buffer);
...
glReadPixels(0, 0, w, h, GL_BGRA, GL_UNSIGNED_BYTE, buffer);

while this gives me a black frame:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_BGRA, GL_UNSIGNED_BYTE, buffer);
...
glReadPixels(0, 0, w, h, GL_BGRA, GL_UNSIGNED_BYTE, buffer);
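
For reference, a sketch of the combination the APPLE_texture_format_BGRA8888 extension actually permits on iOS: BGRA is valid only as the format argument (spelled GL_BGRA_EXT), never as the internal format, and glReadPixels on ES 2.0 is generally limited to GL_RGBA with GL_UNSIGNED_BYTE:

// Internal format stays GL_RGBA; only the source data format is BGRA.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
             GL_BGRA_EXT, GL_UNSIGNED_BYTE, buffer);
...
// Read back in the always-supported RGBA layout.
glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, buffer);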

iOS: adding a UIProgressView to AVFoundation AVCaptureMovieFileOutput

元气小坏坏 submitted on 2019-12-04 18:53:22
I am using AVCaptureMovieFileOutput to record videos, and I want to add a UIProgressView to represent how much time is left before the video stops recording. I set a max duration of 15 seconds:

CMTime maxDuration = CMTimeMakeWithSeconds(15, 50);
[[self movieFileOutput] setMaxRecordedDuration:maxDuration];

I can't seem to find whether AVCaptureMovieFileOutput has a callback for while the video is recording, or for when recording begins. My question is: how can I get updates on the progress of the recording? Or, if that isn't available, how can I tell when recording begins in…
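
As far as that API goes there is no per-frame progress callback, but AVCaptureFileOutputRecordingDelegate reports when recording starts, and recordedDuration can be polled with a timer. A minimal sketch, assuming hypothetical progressTimer and progressView properties on the delegate:

- (void)captureOutput:(AVCaptureFileOutput *)output
didStartRecordingToOutputFileAtURL:(NSURL *)fileURL
      fromConnections:(NSArray *)connections
{
    // Recording has actually begun; start driving the progress bar.
    self.progressTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                          target:self
                                                        selector:@selector(updateProgress)
                                                        userInfo:nil
                                                         repeats:YES];
}

- (void)updateProgress
{
    Float64 elapsed = CMTimeGetSeconds([[self movieFileOutput] recordedDuration]);
    self.progressView.progress = (float)(elapsed / 15.0); // 15 s max duration
}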

Preloading video to play without delay

不想你离开。 submitted on 2019-12-04 18:32:08
There are tons of topics on SO about video preloading, but it still isn't crystal clear to me.

Objectives:
- Load a video from the network; the URL is given
- Wait for the video to load completely
- Play the video without delay (as I said, it's already 100% buffered)
- Ideally, calculate the download speed and predict, for example, that once 60% of the video is buffered we can start playing, and the remaining 40% will buffer during playback without delay

What I tried:

NSURL *url = [NSURL URLWithString:@"video url address here"];
AVURLAsset *avasset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset…
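
One widely used approach, sketched under the assumption that the speed prediction is approximated by a buffered-fraction threshold: observe the item's loadedTimeRanges via KVO and start playback once, say, 60% is buffered:

// After creating the item:
[item addObserver:self forKeyPath:@"loadedTimeRanges"
          options:NSKeyValueObservingOptionNew context:NULL];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    if ([keyPath isEqualToString:@"loadedTimeRanges"]) {
        AVPlayerItem *playerItem = (AVPlayerItem *)object;
        NSArray *ranges = playerItem.loadedTimeRanges;
        if (ranges.count > 0) {
            CMTimeRange range = [[ranges firstObject] CMTimeRangeValue];
            Float64 buffered = CMTimeGetSeconds(CMTimeRangeGetEnd(range));
            Float64 total = CMTimeGetSeconds(playerItem.duration);
            if (total > 0 && buffered / total >= 0.6) {
                [self.player play]; // 60% threshold, tune as needed
            }
        }
    }
}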

AV Foundation camera preview layer gets zoomed in, how to zoom out?

你。 submitted on 2019-12-04 18:30:31
Question: The application I am currently working on has, as its main functionality, continuous QR/bar-code scanning using the ZXing library (http://code.google.com/p/zxing/). For continuous frame capture I initialize the AVCaptureSession, AVCaptureVideoDataOutput, and AVCaptureVideoPreviewLayer as described in Apple's Q&A http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html. My problem is that when I run the camera preview, the image I see through the video device is much larger (1.5x) than…
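
A first thing worth checking, as a sketch: the preview layer's videoGravity. AVLayerVideoGravityResizeAspectFill crops and effectively zooms the picture to fill the layer, while AVLayerVideoGravityResizeAspect letterboxes it instead. Here session stands for the already-configured AVCaptureSession:

AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.frame = self.view.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspect; // no cropping/zoom
[self.view.layer addSublayer:previewLayer];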

AVQueuePlayer playing several audio tracks in the background on iOS 5

回眸只為那壹抹淺笑 submitted on 2019-12-04 18:27:55
I used AVQueuePlayer to play several items in the background. The code worked perfectly on iOS 4, but in iOS 5 AVQueuePlayer changed its behavior, so the player stops playing after the first item ends. Matt Gallagher wrote a hint in this post: "As of iOS 5, it appears that AVQueuePlayer no longer pre-buffers. It did pre-buffer the next track in iOS 4." So my question is how to play several items in the background using AVPlayer or AVQueuePlayer on iOS 5. Matt Gallagher's answer on his blog: "You must observe the current item in the AVQueuePlayer. When it changes, you must use UIApplication to start a…
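
A sketch of the approach quoted above, with an assumed bgTask property for the task identifier: observe the queue's currentItem, and open a background task around each transition so iOS 5 keeps the app alive while the next item buffers:

[queuePlayer addObserver:self forKeyPath:@"currentItem"
                 options:NSKeyValueObservingOptionNew context:NULL];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    if ([keyPath isEqualToString:@"currentItem"]) {
        // Keep running in the background while the new item spins up.
        self.bgTask = [[UIApplication sharedApplication]
            beginBackgroundTaskWithExpirationHandler:^{
                [[UIApplication sharedApplication] endBackgroundTask:self.bgTask];
            }];
        // Call endBackgroundTask: once the new current item reaches
        // AVPlayerItemStatusReadyToPlay.
    }
}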

m3u8 file AVAssetImageGenerator error

时光总嘲笑我的痴心妄想 submitted on 2019-12-04 17:38:28
Question: I am using AVPlayer to play an .m3u8 file, and AVAssetImageGenerator to extract an image out of it using the following code:

AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:mp.contentURL options:nil];
AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generate1.appliesPreferredTrackTransform = YES;
NSError *err = NULL;
CMTime time = CMTimeMake(1, 2);
CGImageRef oneRef = [generate1 copyCGImageAtTime:time actualTime:NULL error:&err];
img = [[UIImage alloc]…
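
Worth noting, and likely the cause of the error: AVAssetImageGenerator cannot pull frames from an HTTP Live Streaming (.m3u8) asset, because such an asset exposes no locally readable video tracks. A quick guard, assuming asset1 from the code above:

NSArray *videoTracks = [asset1 tracksWithMediaType:AVMediaTypeVideo];
if ([videoTracks count] == 0) {
    // HLS stream: copyCGImageAtTime: will fail here.
    // Grab frames via AVPlayerItemVideoOutput instead.
}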