AVFoundation

Best path from AVPlayerItemVideoOutput to OpenGL texture

感情迁移 submitted on 2019-12-03 08:29:47
I've been pulling my hair out trying to figure out the current best path from an AVFoundation video to an OpenGL texture. Most of what I find relates to iOS, and I can't seem to make it work well on OS X. First of all, this is how I set up the video output:

    NSDictionary *pbOptions = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kCVPixelFormatType_422YpCbCr8], (id)kCVPixelBufferPixelFormatTypeKey,
        [NSDictionary dictionary], (id)kCVPixelBufferIOSurfacePropertiesKey,
        nil];
    self.playeroutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pbOptions];
    self.playeroutput…
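On OS X, one workable route from here is CVOpenGLTextureCache, which can wrap the IOSurface-backed pixel buffers this output produces in GL textures without a CPU copy. A minimal Swift sketch under that assumption follows; the cache is presumed to have been created once with CVOpenGLTextureCacheCreate from your NSOpenGLContext's CGL context and pixel format, and the function name is illustrative, not from the question.

    import AVFoundation
    import CoreVideo

    // Sketch: fetch the frame for the current host time and wrap it in a GL texture.
    // `cache` is assumed to come from CVOpenGLTextureCacheCreate(...).
    func currentTexture(from videoOutput: AVPlayerItemVideoOutput,
                        cache: CVOpenGLTextureCache,
                        hostTime: CFTimeInterval) -> CVOpenGLTexture? {
        let itemTime = videoOutput.itemTime(forHostTime: hostTime)
        guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
              let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                            itemTimeForDisplay: nil)
        else { return nil }

        var texture: CVOpenGLTexture?
        let result = CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault, cache,
                                                                pixelBuffer, nil, &texture)
        return result == kCVReturnSuccess ? texture : nil
    }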

Record video while another video is playing

本秂侑毒 submitted on 2019-12-03 08:26:05
I am using UIImagePickerController to record a video and AVPlayer to play a video, adding an AVPlayerLayer to UIImagePickerController's cameraOverlayView so that I can see the video while recording. My requirements are:

1. I need to watch a video while recording with UIImagePickerController.
2. Using a headset, I need to listen to the audio from the playing video.
3. I need to record my voice onto the video being recorded.
4. Only my voice should be recorded, not the playing video's audio.

Everything works except 4: the audio from the playing video also mixes with my voice. How do I handle this case? My final goal is output for the…
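A common way to keep the playing video's audio out of the recording is to make sure it is routed to the headset, so the built-in microphone never hears it. Below is a minimal, hedged sketch of the session setup that approach needs; the function name is illustrative, and with .playAndRecord the playback follows the headset route when one is attached.

    import AVFoundation

    // Sketch: play through the headset while recording from the mic, so only the
    // user's voice (plus ambient sound) reaches the recording.
    func configureAudioSession() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .default,
                                options: [.allowBluetooth])
        try session.setActive(true)
    }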

AVCaptureVideoPreviewLayer smooth orientation rotation

寵の児 submitted on 2019-12-03 08:13:00
Question: I'm trying to disable any discernible orientation rotation of an AVCaptureVideoPreviewLayer while still maintaining rotation for any subviews. AVCaptureVideoPreviewLayer does have an orientation property, and changing it does allow the layer to display properly for any orientation. However, the change involves some funky rotation of the AVCaptureVideoPreviewLayer rather than staying smooth as it does in the Camera app. This is how I've gotten orientation to work properly, minus the…
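One way to suppress the animated jump, sketched below rather than taken from the question, is to change the connection's orientation inside a CATransaction with implicit actions disabled, and then rotate only the subviews with your own animations:

    import AVFoundation
    import UIKit

    // Sketch: snap the preview to the new orientation with no implicit animation.
    func setPreviewOrientation(_ orientation: AVCaptureVideoOrientation,
                               on previewLayer: AVCaptureVideoPreviewLayer) {
        CATransaction.begin()
        CATransaction.setDisableActions(true)
        previewLayer.connection?.videoOrientation = orientation
        if let bounds = previewLayer.superlayer?.bounds {
            previewLayer.frame = bounds
        }
        CATransaction.commit()
    }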

iOS AVFoundation audio/video out of sync

你离开我真会死。 submitted on 2019-12-03 08:11:23
The Problem: During every playback, the audio is 1-2 seconds behind the video.

The Setup: The assets are loaded with AVURLAsset from a media stream. To build the composition, I'm using AVMutableComposition and AVMutableCompositionTrack with asymmetric timescales. The audio and video are both streamed to the device. The timescale for audio is 44100; the timescale for video is 600. Playback is done with AVPlayer.

Attempted Solutions:

- Using videoAssetTrack.timeRange for [composition insertTimeRange:].
- Using CMTimeRangeMake(kCMTimeZero, videoAssetTrack.duration).
- Using…
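For reference, a minimal Swift sketch of the pattern usually recommended for this: insert audio and video with the same CMTimeRange so the asymmetric timescales (600 vs. 44100) are reconciled by AVFoundation rather than by hand. The function and track names are illustrative.

    import AVFoundation

    // Sketch: one shared range, derived from the video track, keeps both inserts aligned.
    func makeComposition(videoTrack: AVAssetTrack,
                         audioTrack: AVAssetTrack) throws -> AVMutableComposition {
        let composition = AVMutableComposition()
        guard let compVideo = composition.addMutableTrack(withMediaType: .video,
                                                          preferredTrackID: kCMPersistentTrackID_Invalid),
              let compAudio = composition.addMutableTrack(withMediaType: .audio,
                                                          preferredTrackID: kCMPersistentTrackID_Invalid)
        else { return composition }

        let range = CMTimeRange(start: .zero, duration: videoTrack.timeRange.duration)
        try compVideo.insertTimeRange(range, of: videoTrack, at: .zero)
        try compAudio.insertTimeRange(range, of: audioTrack, at: .zero)
        return composition
    }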

Video not rotating using AVMutableVideoCompositionLayerInstruction

心不动则不痛 submitted on 2019-12-03 08:07:43
I'm trying to merge two videos recorded with the camera via UIImagePickerController. I've succeeded in combining the videos into one, but I have some problems with the orientation of the videos. As I understand it, UIImagePickerController captures all videos in landscape, which means that videos recorded in portrait are rotated 90°. After each recording I add the new video to an array:

    func imagePickerController(picker: UIImagePickerController!,
                               didFinishPickingMediaWithInfo info: NSDictionary) {
        let tempImage = info[UIImagePickerControllerMediaURL] as…
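The usual fix is to carry the source track's preferredTransform into the layer instruction so portrait clips render upright. A hedged sketch, with illustrative names:

    import AVFoundation

    // Sketch: bake the recorded clip's rotation into the video composition.
    func layerInstruction(for compositionTrack: AVMutableCompositionTrack,
                          from assetTrack: AVAssetTrack) -> AVMutableVideoCompositionLayerInstruction {
        let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: compositionTrack)
        instruction.setTransform(assetTrack.preferredTransform, at: .zero)
        return instruction
    }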

Hold multiple Frames in Memory before sending them to AVAssetWriter

三世轮回 submitted on 2019-12-03 08:06:27
I need to hold some video frames from a capture session in memory and write them to a file when 'something' happens. Similar to this solution, I use this code to put a frame into an NSMutableArray:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // ...
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly); // lock before touching the base address
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
        NSData *rawFrame = [[NSData alloc] initWithBytes:(void *)baseAddress length:(height *…
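An alternative to copying the bytes into NSData is to retain the CMSampleBuffers themselves, sketched below with illustrative names. One caveat: AVCaptureVideoDataOutput recycles its buffers from a small pool, so holding more than a short window of them will stall the capture session, which is exactly what the NSData copy above avoids.

    import AVFoundation
    import CoreMedia

    // Sketch: keep the last N buffers, then flush them to the writer on demand.
    final class FrameHolder {
        private var pending: [CMSampleBuffer] = []
        private let maxFrames = 90   // ~3 seconds at 30 fps

        func hold(_ sampleBuffer: CMSampleBuffer) {
            pending.append(sampleBuffer)   // ARC retains the buffer
            if pending.count > maxFrames { pending.removeFirst() }
        }

        func flush(to input: AVAssetWriterInput) {
            for buffer in pending where input.isReadyForMoreMediaData {
                _ = input.append(buffer)
            }
            pending.removeAll()
        }
    }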

How to buffer audio using AVPlayer in iOS?

别说谁变了你拦得住时间么 submitted on 2019-12-03 08:03:53
I want to play streaming audio from the Internet. I wrote code that plays a stream, but it doesn't do any buffering, so if the signal is weak the application stops playing. This is my code:

    import UIKit
    import AVFoundation
    import MediaPlayer
    import AudioToolbox

    class ViewController: UIViewController {
        var playerItem: AVPlayerItem?
        var player: AVPlayer?

        @IBOutlet weak var PlayButton: UIButton!

        override func viewDidLoad() {
            super.viewDidLoad()
            var buffer = AVAudioBuffer()
            let url = NSURL(string: "http://radio.afera.com.pl/afera64.aac")
            playerItem = AVPlayerItem(URL: url!)
            player = AVPlayer(playerItem:…
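Two hedged pointers: on iOS 10 and later, AVPlayer already waits out stalls when automaticallyWaitsToMinimizeStalling is left enabled; on earlier systems you can watch the item's buffer flags yourself. A sketch of the KVO route, meant to live in the view controller above (property and function names are illustrative):

    import AVFoundation

    // Sketch: pause on underrun, resume when playback is likely to keep up.
    var bufferObservations: [NSKeyValueObservation] = []

    func observeBuffering(of item: AVPlayerItem, with player: AVPlayer) {
        bufferObservations.append(item.observe(\.isPlaybackBufferEmpty) { item, _ in
            if item.isPlaybackBufferEmpty { player.pause() }
        })
        bufferObservations.append(item.observe(\.isPlaybackLikelyToKeepUp) { item, _ in
            if item.isPlaybackLikelyToKeepUp { player.play() }
        })
    }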

How do I convert the live video feed from the iPhone camera to grayscale?

喜夏-厌秋 submitted on 2019-12-03 08:00:20
Question: How would I take the live frames from the iPhone camera, convert them to grayscale, and then display them on the screen in my application?

Answer 1: To expand upon what Tommy said, you'll want to use AVFoundation in iOS 4.0 to capture the live camera frames. However, I'd recommend using OpenGL directly to do the image processing, because you won't be able to achieve real-time results on current hardware otherwise. For OpenGL ES 1.1 devices, I'd look at using Apple's GLImageProcessing sample…
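The OpenGL advice above reflects 2010-era hardware; on current devices a GPU-backed Core Image filter is normally fast enough. A sketch of the per-frame conversion, assuming frames arrive from an AVCaptureVideoDataOutput delegate and that a CIContext is created once and reused (function name illustrative):

    import AVFoundation
    import CoreImage
    import UIKit

    // Sketch: CIColorControls with saturation 0 strips all color from the frame.
    func grayscaleImage(from sampleBuffer: CMSampleBuffer,
                        context: CIContext) -> UIImage? {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        let input = CIImage(cvPixelBuffer: pixelBuffer)
        let filter = CIFilter(name: "CIColorControls",
                              parameters: [kCIInputImageKey: input,
                                           kCIInputSaturationKey: 0.0])
        guard let output = filter?.outputImage,
              let cgImage = context.createCGImage(output, from: output.extent)
        else { return nil }
        return UIImage(cgImage: cgImage)
    }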

AVFoundation: add text to the CMSampleBufferRef video frame

风流意气都作罢 submitted on 2019-12-03 07:54:40
Question: I'm building an app using AVFoundation. Just before I call [assetWriterInput appendSampleBuffer:sampleBuffer] in the - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection method, I manipulate the pixels in the sample buffer (using a pixel buffer to apply an effect). But the client also wants me to put text (a timestamp and frame counter) on the frames, and I haven't found a way to do this…
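One approach, sketched under the assumption that the capture output delivers kCVPixelFormatType_32BGRA buffers, is to wrap the pixel buffer's memory in a CGContext and draw the string before appending; all names are illustrative:

    import AVFoundation
    import CoreGraphics
    import UIKit

    // Sketch: draw an overlay string directly into a BGRA pixel buffer.
    func drawText(_ text: String, on pixelBuffer: CVPixelBuffer) {
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

        let height = CVPixelBufferGetHeight(pixelBuffer)
        guard let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                                      width: CVPixelBufferGetWidth(pixelBuffer),
                                      height: height,
                                      bitsPerComponent: 8,
                                      bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                                                | CGBitmapInfo.byteOrder32Little.rawValue)
        else { return }

        // CGContext's origin is bottom-left; flip so UIKit string drawing is upright.
        context.translateBy(x: 0, y: CGFloat(height))
        context.scaleBy(x: 1, y: -1)
        UIGraphicsPushContext(context)
        (text as NSString).draw(at: CGPoint(x: 20, y: 20),
                                withAttributes: [.font: UIFont.systemFont(ofSize: 24),
                                                 .foregroundColor: UIColor.white])
        UIGraphicsPopContext()
    }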

Rotating video without rotating AVCaptureConnection and in the middle of AVAssetWriter session

落爺英雄遲暮 submitted on 2019-12-03 07:44:23
Question: I'm using PBJVision to implement tap-to-record video functionality. The library doesn't support orientation yet, so I'm in the process of trying to engineer it in. From what I can see, there are three ways to rotate the video; I need help deciding the best way forward and how to implement it. Note that rotation can happen between tap-to-record segments, so within a recording segment the orientation is locked to what it was when the user tapped the button. The next time the user taps the button to…
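For context, the simplest of those options, writing the orientation into the file's metadata through the writer input's transform, is sketched below with illustrative names. Note that the transform must be set before startWriting() and cannot change mid-session, which is exactly why per-segment rotation needs one of the other approaches.

    import AVFoundation
    import CoreGraphics

    // Sketch: bake one fixed orientation into the output file's metadata.
    func makeVideoInput(rotationAngle: CGFloat,
                        settings: [String: Any]) -> AVAssetWriterInput {
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        input.expectsMediaDataInRealTime = true
        input.transform = CGAffineTransform(rotationAngle: rotationAngle)
        return input
    }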