avfoundation

Apply Core Image Filter to Video on OS X using Swift

Submitted by 早过忘川 on 2019-12-08 05:57:42
Question: I am planning to build an NSOpenGLView for an OS X app using Swift that can be used to apply Core Image filters and effects to a video. So far I have written the video controller code for playback, but I am not sure how to apply the filter to the video:

```swift
class VideoMediaViewController: NSViewController {
    weak var mainView: DTMainViewController?
    @IBOutlet weak var aVPlayerView: AVPlayerView!
    var url: NSURL? {
        didSet {
            // this is the setter
        }
    }
    var observer: AnyObject?
    var player…
```
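One way to do this without writing any OpenGL is to attach the filter to the AVPlayerItem itself. Below is a minimal sketch, assuming the item is already loaded into the player; the sepia filter and its intensity are placeholder choices:

```swift
import AVFoundation
import CoreImage

// Render every frame through Core Image before display (macOS 10.11+).
func applySepia(to item: AVPlayerItem) {
    guard let filter = CIFilter(name: "CISepiaTone") else { return }
    item.videoComposition = AVVideoComposition(asset: item.asset) { request in
        filter.setValue(request.sourceImage.clampedToExtent(), forKey: kCIInputImageKey)
        filter.setValue(0.8, forKey: kCIInputIntensityKey)
        let output = filter.outputImage?.cropped(to: request.sourceImage.extent)
        // Fall back to the unfiltered frame if the filter produced nothing.
        request.finish(with: output ?? request.sourceImage, context: nil)
    }
}
```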

AVFoundation - combining videos: only the first is displayed

Submitted by 风格不统一 on 2019-12-08 05:44:10
Question: I am trying a different approach to combining videos: I create a new track for each transformation. The problem with this code is that the first video is shown and all the others are black. The audio overlay is correct for the entire segment. It looks as if the video is not brought into the composition, because the output file is 5 MB when it should be about 25 MB; the 5 MB correlates to the size of the first clip plus the audio track. All of the AVAssets appear to be valid. The…
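A common cause of "first clip only" output is stacking clips on separate tracks without layer instructions that hide each finished clip. A minimal sketch of the simpler single-track approach, assuming `assets` holds the clips in playback order:

```swift
import AVFoundation

// Concatenate clips end-to-end on one video and one audio track, so no
// per-track layer instructions are needed to show/hide each clip.
func makeComposition(from assets: [AVAsset]) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    guard
        let videoTrack = composition.addMutableTrack(withMediaType: .video,
                                                     preferredTrackID: kCMPersistentTrackID_Invalid),
        let audioTrack = composition.addMutableTrack(withMediaType: .audio,
                                                     preferredTrackID: kCMPersistentTrackID_Invalid)
    else { return composition }

    var cursor = CMTime.zero
    for asset in assets {
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        if let video = asset.tracks(withMediaType: .video).first {
            try videoTrack.insertTimeRange(range, of: video, at: cursor)
        }
        if let audio = asset.tracks(withMediaType: .audio).first {
            try audioTrack.insertTimeRange(range, of: audio, at: cursor)
        }
        cursor = CMTimeAdd(cursor, asset.duration)
    }
    return composition
}
```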

AVMutableComposition: lost orientation when adding audio to a video made with frames

Submitted by 怎甘沉沦 on 2019-12-08 04:44:58
Question: I've been working on a video-processing project. So far I have succeeded in filtering live camera feeds, capturing still images, recording video from frames, recording audio, and, most recently, adding audio to the video capture. But the video seems to have lost its orientation: it should be rotated by 90 degrees clockwise. I tried to use AVMutableVideoComposition, but whatever I do, I keep getting the following error: [__NSArrayM objectAtIndex:]: index 0 beyond bounds for empty…
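Orientation is usually lost because a composition track does not inherit the source track's preferredTransform. A minimal sketch of carrying it over (or setting the 90° rotation explicitly), assuming the track variables come from a typical composition setup:

```swift
import AVFoundation
import CoreGraphics

// Carry the source rotation into the composition so players honor it.
func fixOrientation(of compositionTrack: AVMutableCompositionTrack,
                    from sourceTrack: AVAssetTrack) {
    // Propagate the original transform…
    compositionTrack.preferredTransform = sourceTrack.preferredTransform
    // …or, for frames written without one, rotate 90° clockwise explicitly:
    // compositionTrack.preferredTransform = CGAffineTransform(rotationAngle: .pi / 2)
}
```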

Can't get pixel data via CGDataProviderCopyData using AVCaptureVideoDataOutput in Swift 2

Submitted by 天涯浪子 on 2019-12-08 04:41:10
Question: I'm updating this for Swift 2.0, and I currently get fatal error: unexpectedly found nil while unwrapping an Optional value on the line:

```swift
let data = CGDataProviderCopyData(CGImageGetDataProvider(imageRef)) as! NSData
```

```swift
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    print("Capture output running")
    let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)…
```
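The force-unwrap crashes whenever CGImage creation fails upstream, for example when the output's pixel format is not BGRA. A minimal sketch, written in current Swift, that reads the bytes straight from the pixel buffer instead of round-tripping through a CGImage; it assumes the output is configured for kCVPixelFormatType_32BGRA:

```swift
import AVFoundation
import CoreVideo

// Copy raw pixel bytes out of the sample buffer delivered to
// captureOutput(_:didOutput:from:).
func pixelData(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }
    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
    let length = CVPixelBufferGetBytesPerRow(pixelBuffer) * CVPixelBufferGetHeight(pixelBuffer)
    return Data(bytes: base, count: length)
}
```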

Can't Play AVPlayer Local Video from…

Submitted by 不问归期 on 2019-12-08 04:37:56
Question: I am building a custom video player using AVFoundation (AVPlayer and AVPlayerLayer). Currently, all I want the player to do is play a video from the asset library via a hardcoded URL to that video. I would like this to be contained in a subclass of UIView so I can use it all around my app. Here is my code so far, CUPlayer.h:

```objc
@interface CUPlayer : UIView {
    AVPlayer *player;
    AVPlayerLayer *playerLayer;
    AVPlayerItem *item;
    NSURL *url;
}
@property(nonatomic) UIViewAutoresizing…
```
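A minimal Swift sketch of the same idea, using layerClass so the AVPlayerLayer resizes with the view. Note that an assets-library video normally has to be resolved through the Photos framework rather than a hardcoded URL:

```swift
import AVFoundation
import UIKit

final class PlayerView: UIView {
    // Back this view with AVPlayerLayer so it tracks the view's bounds.
    override class var layerClass: AnyClass { AVPlayerLayer.self }
    private var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }

    func play(url: URL) {
        let player = AVPlayer(playerItem: AVPlayerItem(url: url))
        playerLayer.player = player
        playerLayer.videoGravity = .resizeAspect
        player.play()
    }
}
```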

Getting unexpected result when cropping video using AVFoundation

Submitted by 老子叫甜甜 on 2019-12-08 04:36:05
Question: I have a 1920×1080 (16:9) video and I want to crop the middle 1080×1080 (1:1) square out of it. To that end I wrote the code below:

```objc
- (void)cropVideo {
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    AVMutableCompositionTrack *track = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *track1 = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID…
```
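A centered crop like this is usually expressed as an AVMutableVideoComposition with a 1080×1080 render size and a transform that shifts the frame left by (1920 − 1080) / 2 = 420 pixels. A minimal Swift sketch, assuming `asset` is the source video; the result would be assigned to an AVAssetExportSession's videoComposition:

```swift
import AVFoundation

func squareCropComposition(for asset: AVAsset) -> AVMutableVideoComposition? {
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }

    let composition = AVMutableVideoComposition()
    composition.renderSize = CGSize(width: 1080, height: 1080)
    composition.frameDuration = CMTime(value: 1, timescale: 30)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)

    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    // Shift left by (1920 - 1080) / 2 = 420 px so the middle square is rendered.
    layerInstruction.setTransform(CGAffineTransform(translationX: -420, y: 0), at: .zero)

    instruction.layerInstructions = [layerInstruction]
    composition.instructions = [instruction]
    return composition
}
```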

How to observe torchLevel in Swift?

Submitted by 本秂侑毒 on 2019-12-08 03:23:53
Question: How do I observe torchLevel? I have tried the following, adapted from Objective-C solutions, with no success:

```swift
private var torchLevel = 0
let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
device.addObserver(self, forKeyPath: "torchLevel", options: .New, context: &torchLevel)

override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if context == &torchLevel {
        let device = object as!…
```
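On Swift 4 and later, block-based KVO avoids the context-pointer bookkeeping entirely. A minimal sketch, assuming `device` is the video capture device and the observation token is kept alive for as long as updates are needed:

```swift
import AVFoundation

final class TorchObserver {
    private var observation: NSKeyValueObservation?

    func start(on device: AVCaptureDevice) {
        // torchLevel is documented as key-value observable on AVCaptureDevice.
        observation = device.observe(\.torchLevel, options: [.new]) { device, change in
            print("Torch level is now \(change.newValue ?? device.torchLevel)")
        }
    }
}
```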

AVFoundation crop captured still image according to the preview aspect ratio

Submitted by 帅比萌擦擦* on 2019-12-08 02:45:25
Question: My question is mostly similar to this one: Cropping image captured by AVCaptureSession. I have an application that uses AVFoundation for capturing still images. My AVCaptureVideoPreviewLayer has AVLayerVideoGravityResizeAspectFill video gravity, so the preview picture shown to the user is cropped at the top and bottom. When the user presses the "Capture" button, the image actually captured differs from the preview picture shown to the user. My question is how to…
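The usual fix is to map the preview layer's visible rectangle back into image coordinates and crop to it. A minimal sketch, assuming `previewLayer` uses resize-aspect-fill and `image` is the full captured photo; the rect may still need adjusting for image orientation:

```swift
import AVFoundation
import UIKit

func cropToPreview(_ image: UIImage, previewLayer: AVCaptureVideoPreviewLayer) -> UIImage? {
    // Normalized (0...1) rect of what the layer actually displays.
    let outputRect = previewLayer.metadataOutputRectConverted(fromLayerRect: previewLayer.bounds)
    guard let cgImage = image.cgImage else { return nil }

    let width = CGFloat(cgImage.width)
    let height = CGFloat(cgImage.height)
    let cropRect = CGRect(x: outputRect.origin.x * width,
                          y: outputRect.origin.y * height,
                          width: outputRect.size.width * width,
                          height: outputRect.size.height * height)

    guard let cropped = cgImage.cropping(to: cropRect) else { return nil }
    return UIImage(cgImage: cropped, scale: image.scale, orientation: image.imageOrientation)
}
```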

Method captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection only called a few times

Submitted by 大兔子大兔子 on 2019-12-08 02:39:31
Question: I'm capturing audio from an external Bluetooth microphone, but I can't record anything. This method is only called once, at the start of the current AVCaptureSession:

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
```

After that, the method is never called again to process the audio. To set up the capture session I do this:

```objc
self.captureSession.usesApplicationAudioSession = true;…
```
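One frequent cause is holding on to delivered CMSampleBuffers: the capture output stops delivering new buffers until old ones are released. A minimal Swift sketch of an audio-data pipeline that processes and returns promptly, assuming the session already has the Bluetooth microphone attached as an input:

```swift
import AVFoundation

final class AudioCapture: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "audio.capture.queue")

    func start() {
        let output = AVCaptureAudioDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Process and return promptly; do not retain `sampleBuffer`.
        print("got \(CMSampleBufferGetNumSamples(sampleBuffer)) samples")
    }
}
```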

AVPlayer Skips the Beginning of a Video

Submitted by 瘦欲@ on 2019-12-08 01:48:23
Question: I'm having an issue with AVPlayer skipping the first 0.5 seconds of a video when I attempt to play it immediately after pre-rolling. First, I create the player:

```objc
self.videoPlayer = [[AVPlayer alloc] init];
```

Then, I attach the player item and add an observer to see when the item is ready:

```objc
[self.videoPlayer addObserver:self forKeyPath:@"currentItem" options:0 context:kBLCurrentPlayerItemStatusContext];
[self.videoPlayer replaceCurrentItemWithPlayerItem:self…
```
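A minimal Swift sketch of one way to avoid the skip: once the item is ready, seek to zero with zero tolerance, pre-roll, and only then call play(). This assumes `player` already has its item attached; on recent SDKs, preroll also requires automaticallyWaitsToMinimizeStalling to be false:

```swift
import AVFoundation

func startFromFirstFrame(_ player: AVPlayer) {
    // Zero tolerance pins the playhead to the exact first frame.
    player.seek(to: .zero, toleranceBefore: .zero, toleranceAfter: .zero) { _ in
        player.preroll(atRate: 1.0) { finished in
            if finished { player.play() }
        }
    }
}
```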