avfoundation

Inserting an HTTP stream into an AVMutableComposition

我们两清 · Submitted on 2019-11-30 06:48:04
I am trying to insert an AVURLAsset of an AVPlayerItem that reports AVPlayerItemStatusReadyToPlay into an AVMutableComposition, like this: composition_ = [[AVMutableComposition alloc] init]; insertionPoint_ = kCMTimeZero; item_ = [[AVPlayerItem playerItemWithURL:[NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"]] retain]; [item_ addObserver:self forKeyPath:@"status" options:0 context:nil]; player_ = [[AVPlayer playerWithPlayerItem:item_] retain]; [player_ addObserver:self forKeyPath:@"currentItem.duration" options:0 context:nil]; /** * append a player-item to
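A sketch (not from the question) of what the status observer might do once the item is ready: check whether the asset actually vends tracks before inserting. HTTP Live Streaming assets typically expose no tracks, which is why -insertTimeRange:ofAsset:atTime:error: fails for them; the instance variable names below follow the snippet above.

```objc
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"status"] &&
        item_.status == AVPlayerItemStatusReadyToPlay) {
        AVAsset *asset = item_.asset;
        // HLS (.m3u8) assets generally report zero tracks, so the
        // composition insert below can never succeed for them.
        if ([asset tracksWithMediaType:AVMediaTypeVideo].count == 0) {
            NSLog(@"Asset has no video tracks; streamed (HLS) media "
                  @"cannot be inserted into an AVMutableComposition.");
            return;
        }
        NSError *error = nil;
        CMTimeRange range = CMTimeRangeMake(kCMTimeZero, asset.duration);
        [composition_ insertTimeRange:range
                              ofAsset:asset
                               atTime:insertionPoint_
                                error:&error];
        if (error == nil) {
            insertionPoint_ = CMTimeAdd(insertionPoint_, asset.duration);
        }
    }
}
```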

Square cropping and fixing the video orientation in iOS

不羁岁月 · Submitted on 2019-11-30 06:23:36
Question: I am capturing the video using UIImagePickerController; I can crop the video using the following code: AVAsset *asset = [AVAsset assetWithURL:url]; //create an avassetrack with our asset AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; //create a video composition and preset some settings AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition]; videoComposition.frameDuration = CMTimeMake(1, 30); //here we are
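A hedged sketch of how the snippet above is usually completed: make the render size square and rotate the track into portrait via a layer instruction. The 90° rotation and the centring offset are assumptions based on typical portrait camera footage, where naturalSize is landscape and preferredTransform carries the rotation.

```objc
// Square edge: the short side of the (landscape) natural size.
CGFloat side = clipVideoTrack.naturalSize.height;
videoComposition.renderSize = CGSizeMake(side, side);

AVMutableVideoCompositionInstruction *instruction =
    [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);

AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction
        videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];

// Rotate 90° clockwise, then shift so the centred square region is
// what lands inside renderSize. Exact offsets depend on the camera.
CGAffineTransform rotate = CGAffineTransformMakeRotation(M_PI_2);
CGAffineTransform shift  = CGAffineTransformMakeTranslation(
    side, -(clipVideoTrack.naturalSize.width - side) / 2.0);
[layerInstruction setTransform:CGAffineTransformConcat(rotate, shift)
                        atTime:kCMTimeZero];

instruction.layerInstructions = @[ layerInstruction ];
videoComposition.instructions = @[ instruction ];
```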

UIImage created from CMSampleBufferRef not displayed in UIImageView?

冷暖自知 · Submitted on 2019-11-30 06:20:27
Question: I'm trying to display a UIImage in real time coming from the camera, and it seems that my UIImageView is not displaying the image properly. This is the method which an AVCaptureVideoDataOutputSampleBufferDelegate has to implement - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection { // Create a UIImage from the sample buffer data UIImage *theImage = [self imageFromSampleBuffer:sampleBuffer
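The most common cause of this symptom (a sketch, not confirmed by the question): the sample-buffer delegate runs on the capture queue, and UIKit views must only be touched on the main thread. The `imageView` property name is assumed.

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Convert on the capture queue (cheap pointer work stays off main).
    UIImage *theImage = [self imageFromSampleBuffer:sampleBuffer];

    // Assign on the main thread; setting it from the capture queue
    // often shows nothing at all, matching the reported symptom.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = theImage;
    });
}
```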

Cropping AVAsset video with AVFoundation

不问归期 · Submitted on 2019-11-30 06:16:13
Question: I am using AVCaptureMovieFileOutput to record some video. I have the preview layer displayed using AVLayerVideoGravityResizeAspectFill, which zooms in slightly. The problem I have is that the final video is larger, containing extra image that didn't fit on the screen during preview. This is the preview and resulting video. Is there a way I can specify a CGRect that I want to cut from the video using AVAssetExportSession? EDIT ---- When I apply a CGAffineTransformScale to the AVAssetTrack it
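A sketch, assuming `asset`, `exportSession`, and a `cropRect` (the region to keep, in the video's pixel space) already exist: AVAssetExportSession has no crop-rect property of its own, but it honours a videoComposition whose renderSize and layer transform perform the crop.

```objc
AVAssetTrack *track =
    [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

AVMutableVideoComposition *cropComposition =
    [AVMutableVideoComposition videoComposition];
// The output frame is exactly the size of the region we keep.
cropComposition.renderSize = cropRect.size;
cropComposition.frameDuration = CMTimeMake(1, 30);

AVMutableVideoCompositionInstruction *instruction =
    [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);

AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction
        videoCompositionLayerInstructionWithAssetTrack:track];
// Shift the frame so cropRect's origin lands at (0, 0); everything
// outside renderSize is discarded on export.
[layerInstruction setTransform:CGAffineTransformMakeTranslation(
                                   -cropRect.origin.x, -cropRect.origin.y)
                        atTime:kCMTimeZero];

instruction.layerInstructions = @[ layerInstruction ];
cropComposition.instructions = @[ instruction ];
exportSession.videoComposition = cropComposition;
```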

Can I use AVFoundation to stream downloaded video frames into an OpenGL ES texture?

情到浓时终转凉″ · Submitted on 2019-11-30 06:14:43
Question: I've been able to use AVFoundation's AVAssetReader class to upload video frames into an OpenGL ES texture. It has a caveat, however, in that it fails when used with an AVURLAsset that points to remote media. This failure isn't well documented, and I'm wondering if there's any way around the shortcoming. Answer 1: There's some API that was released with iOS 6 that I've been able to use to make the process a breeze. It doesn't use AVAssetReader at all, and instead relies on a class called
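The iOS 6 class the answer is cut off before naming is, in all likelihood, AVPlayerItemVideoOutput, which vends CVPixelBuffers from a playing (possibly remote) item. A minimal sketch, with the texture upload elided and `playerItem` assumed:

```objc
NSDictionary *attrs = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
};
AVPlayerItemVideoOutput *output =
    [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];
[playerItem addOutput:output];

// Later, on each CADisplayLink tick, pull the frame for "now":
CMTime itemTime = [output itemTimeForHostTime:CACurrentMediaTime()];
if ([output hasNewPixelBufferForItemTime:itemTime]) {
    CVPixelBufferRef pixelBuffer =
        [output copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
    // ... upload to an OpenGL ES texture, e.g. via CVOpenGLESTextureCache ...
    CVBufferRelease(pixelBuffer);
}
```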

Animate AVPlayerLayer videoGravity property

痴心易碎 · Submitted on 2019-11-30 05:49:19
I'm trying to replicate Apple's video-playback behavior that allows the user to stretch the video image to fill the bounds. @interface FHVideoPlayerView : UIView @end @implementation FHVideoPlayerView + (Class)layerClass { return [AVPlayerLayer class]; } - (void)setAspectMode:(FHVideoPlayerAspectMode)aspectMode animated:(BOOL)animated { FHVideoPlayerAspectMode current = [self aspectMode]; FHVideoPlayerAspectMode final = aspectMode; NSString *fromValue; NSString *toValue; AVPlayerLayer *layer = (AVPlayerLayer *)[self layer]; switch (current) { case FHVideoPlayerAspectFill: fromValue =
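A hedged completion of the method above: drive the change with a CABasicAnimation keyed on "videoGravity". Whether AVPlayerLayer actually interpolates this property varies by iOS version; if it does not, the usual fallback is to keep videoGravity fixed and animate the layer's bounds instead. The FHVideoPlayerAspectFit enum case is assumed to mirror the Fill case in the question.

```objc
    switch (final) {
        case FHVideoPlayerAspectFill:
            toValue = AVLayerVideoGravityResizeAspectFill; break;
        case FHVideoPlayerAspectFit:   // assumed enum case
            toValue = AVLayerVideoGravityResizeAspect;     break;
    }
    if (animated) {
        CABasicAnimation *anim =
            [CABasicAnimation animationWithKeyPath:@"videoGravity"];
        anim.fromValue = fromValue;
        anim.toValue = toValue;
        anim.duration = 0.25;
        [layer addAnimation:anim forKey:@"videoGravity"];
    }
    // Set the model value so the layer stays in the final state.
    layer.videoGravity = toValue;
```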

Unable to trim a video using AVAssetExportSession

我的未来我决定 · Submitted on 2019-11-30 05:44:40
I want to trim a video: -(void)trimVideo:(NSURL*)outputURL { //[[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil]; AVURLAsset *asset = [AVURLAsset URLAssetWithURL:outputURL options:nil]; AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetLowQuality]; NSString * outputFilePath = NSHomeDirectory(); outputFilePath = [outputFilePath stringByAppendingPathComponent:@"Library"]; outputFilePath = [outputFilePath stringByAppendingPathComponent:@"temp.mov"]; NSURL * outputFileUrl = [NSURL fileURLWithPath:outputFilePath]
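A sketch of the piece the snippet stops short of: AVAssetExportSession trims via its timeRange property, and without it the whole asset is exported unchanged. The trim points below are placeholders, and the pre-existing file must be removed first or the export fails.

```objc
    // Export fails if a file already exists at the output URL.
    [[NSFileManager defaultManager] removeItemAtURL:outputFileUrl error:nil];
    exportSession.outputURL = outputFileUrl;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;

    // Placeholder trim points: keep 5 seconds starting at t = 1 s.
    CMTime start    = CMTimeMakeWithSeconds(1.0, 600);
    CMTime duration = CMTimeMakeWithSeconds(5.0, 600);
    exportSession.timeRange = CMTimeRangeMake(start, duration);

    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"trimmed movie written to %@", outputFileUrl);
        } else {
            NSLog(@"export failed: %@", exportSession.error);
        }
    }];
```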

AVURLAsset cannot load with remote file

隐身守侯 · Submitted on 2019-11-30 05:38:37
Question: I have a problem using AVURLAsset. NSString * const kContentURL = @"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"; ... NSURL *contentURL = [NSURL URLWithString:kContentURL]; AVURLAsset *asset = [AVURLAsset URLAssetWithURL:contentURL options:nil]; [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:^{ ... NSError *error = nil; AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error]; ... } In the completion block,
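A sketch of how the completion block might inspect the failure rather than assume success. For an HLS (.m3u8) URL the "tracks" key commonly comes back as AVKeyValueStatusFailed, because streamed assets do not vend tracks; the usual approach for HLS is to create an AVPlayerItem and observe its status instead of loading tracks.

```objc
AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey
                                               error:&error];
switch (status) {
    case AVKeyValueStatusLoaded:
        // Tracks are available (typical for file-based assets).
        break;
    case AVKeyValueStatusFailed:
        // Expected outcome for HLS streams: no track-level access.
        NSLog(@"tracks failed to load: %@", error);
        break;
    default:
        break;
}
```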

How to watermark your video with different images and different CMTimes using AVFoundation

寵の児 · Submitted on 2019-11-30 05:30:53
I'm using AVFoundation to put a watermark in my movies. This works well with the sample code that has been circulating on the internet and in Apple's documentation. But I don't want to show the watermark for the entire duration, and I want to show different watermarks in the same movie. I have an AVAsset: NSString *path = [[NSBundle mainBundle] pathForResource:@"test" ofType:@"MOV"]; NSURL *url = [[NSURL alloc] initFileURLWithPath: path]; avasset_camera = [AVAsset assetWithURL:url]; An AVMutableComposition: AVMutableComposition *mix = [AVMutableComposition composition]; The UIImage converted to a CALayer and then added to
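A hedged sketch of one way to time-limit a watermark: add it as a CALayer whose opacity is animated on the video timeline via the composition's animation tool; repeating this with several layers and different beginTime/duration pairs gives different watermarks at different CMTimes. The image, times, and `renderSize` are placeholders; note that a beginTime of exactly 0 must be written as AVCoreAnimationBeginTimeAtZero, since Core Animation treats 0 as "now".

```objc
CALayer *watermark = [CALayer layer];
watermark.contents = (id)watermarkImage.CGImage;   // placeholder image
watermark.frame = CGRectMake(20, 20, 100, 40);
watermark.opacity = 0.0;                           // hidden by default

// Visible from t = 2 s for 3 s on the video timeline (placeholders).
CABasicAnimation *show =
    [CABasicAnimation animationWithKeyPath:@"opacity"];
show.fromValue = @1.0;
show.toValue = @1.0;
show.beginTime = 2.0;
show.duration = 3.0;
show.removedOnCompletion = NO;
[watermark addAnimation:show forKey:@"visible"];

// Standard parent/video layer pair handed to the video composition.
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = videoLayer.frame =
    CGRectMake(0, 0, renderSize.width, renderSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:watermark];

videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool
    videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                            inLayer:parentLayer];
```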

AVFoundation Image orientation off by 90 degrees in the preview but fine in Camera roll

柔情痞子 · Submitted on 2019-11-30 05:04:10
Something really strange is happening: I am trying to capture an image using AVFoundation. The Camera roll image seems just fine, but the image preview has the image rotated by 90 degrees. This is the code I am using to capture an image AVCaptureConnection *videoConnection = nil; for (AVCaptureConnection *connection in stillImageOutput.connections) { for (AVCaptureInputPort *port in [connection inputPorts]) { if ([[port mediaType] isEqual:AVMediaTypeVideo] ) { videoConnection = connection; break; } } if (videoConnection) { break; } } //NSLog(@"about to request a capture from: %@",
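A sketch of the usual fix, following on from the connection search above: the Camera roll honours the EXIF orientation tag, while a hand-built preview often ignores it. Either set the connection's orientation before capturing, or tag the preview UIImage with an explicit orientation. The `previewImageView` property and the Portrait/Right choices are assumptions.

```objc
// Option 1: bake the orientation into the captured frames.
videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;

[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
    completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        NSData *data = [AVCaptureStillImageOutput
            jpegStillImageNSDataRepresentation:imageSampleBuffer];

        // Option 2: keep the pixels as-is and tag the orientation so
        // UIImageView rotates the preview the same way Camera roll does.
        UIImage *raw = [UIImage imageWithData:data];
        UIImage *preview = [UIImage imageWithCGImage:raw.CGImage
                                               scale:1.0
                                         orientation:UIImageOrientationRight];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.previewImageView.image = preview;
        });
    }];
```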