avfoundation

Rotate CGImage taken from video frame

和自甴很熟 · submitted on 2019-11-28 19:32:31
This is Apple's code (from Technical Q&A QA1702) for getting a UIImage from a video buffer. Unfortunately, the image returned is rotated 90 degrees. How do I edit this so that the image returned is correctly oriented?

```objc
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height // … (excerpt truncated)
```
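A common fix for this pattern (a sketch, not Apple's code): instead of rotating the pixel data, tag the UIImage with an explicit orientation so UIKit draws it upright. This assumes the buffer came from the back camera while the device was held in portrait; other camera/device combinations need a different `UIImageOrientation` value.

```objc
// Sketch: wrap the CGImage produced at the end of imageFromSampleBuffer:
// with an orientation instead of rotating pixels.
// Assumption: back camera, device held in portrait.
CGImageRef quartzImage = CGBitmapContextCreateImage(context);
UIImage *image = [UIImage imageWithCGImage:quartzImage
                                     scale:1.0
                               orientation:UIImageOrientationRight];
CGImageRelease(quartzImage);
```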

How do I convert a CGImage to CMSampleBufferRef?

天大地大妈咪最大 · submitted on 2019-11-28 19:18:16
Question: I'd like to convert a CGImage to a CMSampleBufferRef and append it to an AVAssetWriterInput using the appendSampleBuffer: method. I've managed to build the CMSampleBufferRef using the following code, but appendSampleBuffer: simply returns NO when I supply the resulting CMSampleBufferRef. What am I doing wrong?

```objc
- (void)appendCGImage:(CGImageRef)frame {
    const int width = CGImageGetWidth(frame);
    const int height = CGImageGetHeight(frame);
    // Create a dummy pixel buffer to try the encoding
    // … (excerpt truncated)
```
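A hedged alternative worth trying: skip building a CMSampleBufferRef by hand and append a CVPixelBuffer through an AVAssetWriterInputPixelBufferAdaptor instead, which is the route AVAssetWriter is designed for. In this sketch, `adaptor` and `presentationTime` are assumed to be set up elsewhere.

```objc
// Sketch: draw the CGImage into a CVPixelBuffer and append it via an
// AVAssetWriterInputPixelBufferAdaptor (assumed configured elsewhere).
CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                    kCVPixelFormatType_32ARGB, NULL, &pixelBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(
    CVPixelBufferGetBaseAddress(pixelBuffer), width, height, 8,
    CVPixelBufferGetBytesPerRow(pixelBuffer), colorSpace,
    kCGImageAlphaNoneSkipFirst);
CGContextDrawImage(context, CGRectMake(0, 0, width, height), frame);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
[adaptor appendPixelBuffer:pixelBuffer
      withPresentationTime:presentationTime];
CVPixelBufferRelease(pixelBuffer);
```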

How to properly export CALayer on top of AVMutableComposition with AVAssetExportSession

时光总嘲笑我的痴心妄想 · submitted on 2019-11-28 18:58:16
I know this question has been asked before (for example here and here), but I just can't figure out what I'm doing wrong. I have an AVMutableComposition that I use to combine some video clips with some CALayers that animate on top of them. Everything works fine when I take my AVMutableComposition and combine it with an AVSynchronizedLayer for playback inside an AVPlayerLayer: the video comes out correctly and everything is positioned where it should be. My problem is that when I tried to export this thing, I tried using AVVideoCompositionCoreAnimationTool instead of AVSynchronizedLayer (that's what the … (excerpt truncated)
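For reference, the usual wiring for export looks roughly like this sketch (names such as `animLayer` and `renderSize` are placeholders for whatever the composition already uses; layer frames must be expressed in video coordinates, not screen points):

```objc
// Sketch: container layer hierarchy for AVVideoCompositionCoreAnimationTool.
// Assumes animLayer holds the animations and renderSize matches the video.
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, renderSize.width, renderSize.height);
videoLayer.frame = parentLayer.frame;
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:animLayer];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
// frameDuration, renderSize, and instructions configured as usual …
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool
    videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                            inLayer:parentLayer];
```

A detail that often trips this up: the same CALayer instance cannot be attached to both the on-screen AVSynchronizedLayer and the export tool; the export needs its own copy of the layer tree.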

420YpCbCr8BiPlanarVideoRange To YUV420 ?/How to copy Y and Cbcr plane to Single plane?

不羁岁月 · submitted on 2019-11-28 18:58:12
I have captured video using AVFoundation. I have set the video settings and get the output sample buffer in kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange format, but I need YUV420 format for further processing. My questions are: 1. What is the difference among 420YpCbCr8BiPlanarVideoRange, 420YpCbCr8BiPlanarFullRange, 420YpCbCr8PlanarFullRange, 420YpCbCr8Planar, and YUV420? 2. How can I convert 420YpCbCr8BiPlanarVideoRange to YUV420? 3. How do I convert YUV420 to 32BGRA? 4. Or is there some other way to do this, i.e. any open-source library or Apple framework? I have gone through the Accelerate framework; it has image … (excerpt truncated)
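On question 2, the bi-planar buffer stores luma in plane 0 and interleaved CbCr in plane 1, so producing a single contiguous I420 (planar YUV420) buffer is a copy plus a deinterleave. A sketch, assuming `pixelBuffer` is already locked and is the BiPlanarVideoRange format (note this does not remap video-range to full-range values):

```objc
// Sketch: bi-planar NV12-style buffer -> contiguous I420 (Y, then U, then V).
size_t width  = CVPixelBufferGetWidth(pixelBuffer);
size_t height = CVPixelBufferGetHeight(pixelBuffer);
uint8_t *dst  = malloc(width * height * 3 / 2);
uint8_t *dstY = dst;
uint8_t *dstU = dst + width * height;
uint8_t *dstV = dstU + (width / 2) * (height / 2);

// Plane 0: luma, copied row by row to drop any row padding.
uint8_t *srcY = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
size_t strideY = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
for (size_t r = 0; r < height; r++)
    memcpy(dstY + r * width, srcY + r * strideY, width);

// Plane 1: interleaved CbCr, deinterleaved into separate U and V planes.
uint8_t *srcUV = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
size_t strideUV = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);
for (size_t r = 0; r < height / 2; r++) {
    for (size_t c = 0; c < width / 2; c++) {
        dstU[r * (width / 2) + c] = srcUV[r * strideUV + 2 * c];
        dstV[r * (width / 2) + c] = srcUV[r * strideUV + 2 * c + 1];
    }
}
```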

Is it possible to render AVCaptureVideoPreviewLayer in a graphics context?

南楼画角 · submitted on 2019-11-28 18:47:55
This seems like a simple task, yet it is driving me nuts. Is it possible to convert a UIView containing an AVCaptureVideoPreviewLayer as a sublayer into an image to be saved? I want to create an augmented-reality overlay and have a button save the picture to the camera roll. Holding the power button + home key captures the screenshot to the camera roll, meaning that all of my capture logic is working and the task is possible. But I cannot seem to make it work programmatically. I'm capturing a live preview of the camera's image using AVCaptureVideoPreviewLayer. All of my attempts to … (excerpt truncated)
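One hedged workaround (the preview layer is GPU-backed, so `renderInContext:` cannot capture it): grab frames from an AVCaptureVideoDataOutput on the same session and composite the overlay onto the captured image when saving. The property name `latestFrame` below is an illustrative assumption.

```objc
// Sketch: keep the most recent camera frame as a UIImage; when the save
// button is tapped, draw the overlay view on top of latestFrame instead
// of trying to snapshot the preview layer.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    self.latestFrame = [self imageFromSampleBuffer:sampleBuffer]; // e.g. QA1702-style conversion
}
```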

iOS AVFoundation Export Session is missing audio

房东的猫 · submitted on 2019-11-28 18:41:01
I'm using the iOS AVFoundation framework and I am able to successfully merge video tracks, with image overlays and text overlays. However, my output file doesn't keep the audio intact from my original source video. How can I make sure that the audio source from one of my videos stays with the new video I create? EDIT: Use this code for a good example of how to accomplish this when creating a video (with the original audio). It was not obvious to me that I needed to include the audio track separately when processing a video with AVFoundation. Hope this helps somebody else.

```objc
AVAssetTrack *videoTrack // … (excerpt truncated)
```
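The key step the excerpt alludes to, sketched under the assumption that `mixComposition` and the source `asset` already exist: the audio track has to be added to the composition explicitly alongside the video track.

```objc
// Sketch: copy the source asset's audio into the composition so the
// exported file keeps its soundtrack.
AVMutableCompositionTrack *audioTrack =
    [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *sourceAudio =
    [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
NSError *error = nil;
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                    ofTrack:sourceAudio
                     atTime:kCMTimeZero
                      error:&error];
```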

How to stream a video with AVURLAsset and save to disk the cached data

人走茶凉 · submitted on 2019-11-28 18:29:42
Some days ago I was asked to check how difficult it is to play a video while downloading it from the Internet. I knew it was an easy task because someone had told me so a while ago. So I checked, and it was super easy. The problem was that I wanted to save the video to disk so the user isn't forced to download it again and again. The hard part was accessing the buffer and storing it to disk. Many answers on Stack Overflow say this is not possible, especially with videos. My original code to play the video:

```swift
import AVFoundation
// …

// MARK: - Accessors
lazy var player: AVPlayer = {
    var player: AVPlayer = AVPlayer // … (excerpt truncated)
```
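The approach usually taken here, sketched with hedges: give the asset a custom URL scheme so AVFoundation routes loading through an AVAssetResourceLoaderDelegate, fetch the bytes yourself, answer the loading requests, and write the same data to a cache file. `cacheLoader` and `customSchemeURL` below are illustrative assumptions, not real API names.

```swift
// Sketch: a custom scheme forces AVFoundation to ask our delegate for
// the data, which lets us both feed the player and cache to disk.
let asset = AVURLAsset(url: customSchemeURL) // e.g. "cached-http://…"
asset.resourceLoader.setDelegate(cacheLoader, queue: DispatchQueue.main)
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
// cacheLoader implements AVAssetResourceLoaderDelegate: it downloads the
// real URL, fulfills each AVAssetResourceLoadingRequest, and appends the
// received bytes to a file so the next playback can skip the network.
```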

AVAssetImageGenerator provides images rotated

放肆的年华 · submitted on 2019-11-28 18:07:54
When obtaining a UIImage of a video via AVAssetImageGenerator, I'm getting back images rotated (well, technically they're not) when the video was shot in portrait orientation. How can I tell which orientation the video was shot in, and then rotate the image properly?

```objc
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVAssetImageGenerator *generate = [[AVAssetImageGenerator alloc] initWithAsset:asset];
NSError *err = NULL;
CMTime time = CMTimeMake(0, 60);
CGImageRef imgRef = [generate copyCGImageAtTime:time actualTime:NULL error:&err];
[generate release];
UIImage *currentImg = [ // … (excerpt truncated)
```
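The generator can apply the video track's preferred transform for you, which normally yields an upright image for portrait-shot video. A minimal sketch against the excerpt's variables:

```objc
// Sketch: one property flips the behavior — the generator applies the
// track's preferredTransform (which encodes the capture orientation).
AVAssetImageGenerator *generate =
    [[AVAssetImageGenerator alloc] initWithAsset:asset];
generate.appliesPreferredTrackTransform = YES;
```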

Square cropping and fixing the video orientation in iOS

China☆狼群 · submitted on 2019-11-28 18:02:32
I am capturing video using UIImagePickerController. I can crop the video using the following code:

```objc
AVAsset *asset = [AVAsset assetWithURL:url];
// create an AVAssetTrack with our asset
AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
// create a video composition and preset some settings
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 30);
// here we are setting its render size to its height x height (square)
videoComposition.renderSize = CGSizeMake // … (excerpt truncated)
```
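For the orientation half of the problem, the usual approach is a layer-instruction transform that rotates portrait footage upright and shifts it so the centered region falls inside the square render rect. A sketch assuming `clipVideoTrack` from the excerpt and a square render size of height x height:

```objc
// Sketch: rotate 90° and translate so the square crop is centered.
// Assumes the source track was recorded in portrait.
AVMutableVideoCompositionLayerInstruction *layerInst =
    [AVMutableVideoCompositionLayerInstruction
        videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
CGFloat h = clipVideoTrack.naturalSize.height;
CGAffineTransform t = CGAffineTransformMakeTranslation(
    h, -(clipVideoTrack.naturalSize.width - h) / 2);
t = CGAffineTransformRotate(t, M_PI_2);
[layerInst setTransform:t atTime:kCMTimeZero];
```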

Swift Merge audio and video files into one video

妖精的绣舞 · submitted on 2019-11-28 18:02:17
I wrote a program in Swift. I want to merge a video with an audio file, but got this error:

"failed Error Domain=AVFoundationErrorDomain Code=-11838 "Operation Stopped" UserInfo=0x17da4230 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The operation is not supported for this media.}"

Code:

```swift
func mergeAudio(audioURL: NSURL, moviePathUrl: NSURL, savePathUrl: NSURL) {
    var composition = AVMutableComposition()
    let trackVideo: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())
    let trackAudio // … (excerpt truncated)
```
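Two frequent causes of this particular -11838 failure, offered as hedged guesses rather than a certain diagnosis: passing `CMPersistentTrackID()` (an arbitrary value) instead of `kCMPersistentTrackID_Invalid` when adding tracks, and an output URL that already exists or whose extension doesn't match the exporter's `outputFileType`. Sketched against the excerpt's era of Swift:

```swift
// Sketch: let AVFoundation assign the track ID itself.
let trackVideo = composition.addMutableTrackWithMediaType(
    AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)

// Before exporting: remove any stale file and match type to extension.
NSFileManager.defaultManager().removeItemAtURL(savePathUrl, error: nil)
// exporter.outputFileType = AVFileTypeQuickTimeMovie  // for a ".mov" path
```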