AVFoundation

How do I export a UIImage array as a movie in Swift 3?

孤者浪人 submitted on 2019-12-05 01:26:40
Question: I need to export an array of UIImage and build a movie, putting some text in front of the images and, if possible, music as well. Can you give me a hand with code? I have only found examples in Objective-C and older versions of Swift.

Answer 1: This is the first answer I posted to the question create movie from [UIImage], Swift. Following is a copy of that answer: I converted the Objective-C code posted by @Cameron E to Swift 3, and it's working. The answer's link: @Cameron E's CEMovieMaker.
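By way of illustration, here is a minimal Swift sketch of the usual AVAssetWriter approach behind libraries like CEMovieMaker: each UIImage is drawn into a CVPixelBuffer and appended through an AVAssetWriterInputPixelBufferAdaptor. Names like `outputURL`, `size`, and `fps` are placeholders, and text or music would be layered on afterwards (e.g. with AVMutableComposition); this is not CEMovieMaker's exact code.

```swift
import AVFoundation
import UIKit

// Sketch only: writes one image per frame at the given frame rate.
func writeMovie(from images: [UIImage], size: CGSize, fps: Int32,
                to outputURL: URL, completion: @escaping () -> Void) throws {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264, // AVVideoCodecH264 on older SDKs
        AVVideoWidthKey: size.width,
        AVVideoHeightKey: size.height
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    for (index, image) in images.enumerated() {
        // Production code should use requestMediaDataWhenReady instead of spinning.
        while !input.isReadyForMoreMediaData {}
        if let buffer = makePixelBuffer(from: image, size: size) {
            let time = CMTime(value: CMTimeValue(index), timescale: fps)
            adaptor.append(buffer, withPresentationTime: time)
        }
    }
    input.markAsFinished()
    writer.finishWriting(completionHandler: completion)
}

// Draws a UIImage into a new CVPixelBuffer the writer can accept.
func makePixelBuffer(from image: UIImage, size: CGSize) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    let attrs = [kCVPixelBufferCGImageCompatibilityKey: true,
                 kCVPixelBufferCGBitmapContextCompatibilityKey: true] as CFDictionary
    guard CVPixelBufferCreate(kCFAllocatorDefault, Int(size.width), Int(size.height),
                              kCVPixelFormatType_32ARGB, attrs, &buffer) == kCVReturnSuccess,
          let pixelBuffer = buffer, let cgImage = image.cgImage else { return nil }
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }
    let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                            width: Int(size.width), height: Int(size.height),
                            bitsPerComponent: 8,
                            bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    context?.draw(cgImage, in: CGRect(origin: .zero, size: size))
    return pixelBuffer
}
```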

Core Image - rendering a transparent image on a CMSampleBufferRef results in a black box around it

。_饼干妹妹 submitted on 2019-12-05 01:23:09
Question: I'm trying to add a watermark/logo to a video that I'm recording using AVFoundation's AVCaptureVideoDataOutput. My class is set as the sampleBufferDelegate and receives the CMSampleBufferRefs. I already apply some effects to each CMSampleBufferRef's CVPixelBuffer and pass it back to the AVAssetWriter. The logo in the top-left corner is delivered using a transparent PNG. The problem I'm having is that the transparent parts of the UIImage are black once written to the video. Does anyone have an idea?
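The usual cause of the black box is drawing the PNG into the buffer without compositing it over the camera frame: wherever alpha is zero, the underlying pixels are simply left (or filled) black. A minimal Core Image sketch, assuming the logo is already a CIImage with premultiplied alpha and the buffer is BGRA, as is typical for AVCaptureVideoDataOutput:

```swift
import AVFoundation
import CoreImage

// Sketch: source-over compositing preserves the PNG's transparency.
func overlay(logo logoImage: CIImage, onto pixelBuffer: CVPixelBuffer, context: CIContext) {
    let background = CIImage(cvPixelBuffer: pixelBuffer)
    // Position the logo in the top-left corner (Core Image's origin is bottom-left).
    let positioned = logoImage.transformed(by: CGAffineTransform(
        translationX: 0, y: background.extent.height - logoImage.extent.height))
    let composited = positioned.composited(over: background)
    // Render the result back into the buffer before passing it to AVAssetWriter.
    context.render(composited, to: pixelBuffer)
}
```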

What am I doing wrong while recording video with AVFoundation in Swift?

风格不统一 submitted on 2019-12-05 00:36:59
Question: I'm recording video with AVFoundation in Swift, but I don't see the file.mp4. I don't know whether I'm recording but saving incorrectly, or not recording at all, because I can show the session preview and all the components appear to work correctly. My code is: import UIKit import AVFoundation class ViewController: UIViewController, AVCaptureFileOutputRecordingDelegate { var delegate: AVCaptureFileOutputRecordingDelegate? @IBOutlet var imageView: UIImageView @IBOutlet var imagePreview: UIView var session
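For reference, a minimal sketch of starting a recording with AVCaptureMovieFileOutput in current Swift; `movieOutput` is assumed to be already attached to a running session. Two common culprits in cases like this are never calling startRecording with a writable URL, and ignoring the error handed to the delegate:

```swift
import AVFoundation

// Sketch: start recording to Documents/file.mp4.
func startRecording(with movieOutput: AVCaptureMovieFileOutput,
                    delegate: AVCaptureFileOutputRecordingDelegate) {
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    let fileURL = documents.appendingPathComponent("file.mp4")
    try? FileManager.default.removeItem(at: fileURL) // recording fails if the file already exists
    movieOutput.startRecording(to: fileURL, recordingDelegate: delegate)
}

// In the delegate, always inspect the error; a nil error means the file was written:
// func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL,
//                 from connections: [AVCaptureConnection], error: Error?)
```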

AVCaptureDeviceOutput not calling delegate method captureOutput

回眸只為那壹抹淺笑 submitted on 2019-12-05 00:02:30
I am building an iOS application (my first) that processes video still frames on the fly. To dive into this, I followed an example from Apple's AV* documentation. The process involves setting up an input (the camera) and an output. The output works with a delegate, which in this case is the controller itself (it conforms to the protocol and implements the required method). The problem I am having is that the delegate method never gets called. The code below is the implementation of the controller, and it has a couple of NSLogs. I can see the "started" message, but "delegate method called" never shows.
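A minimal sketch of a working setup in current Swift (the question predates this syntax). The two pitfalls this guards against are letting the session be deallocated and failing to give the output a serial delegate queue; names like `FrameGrabber` are illustrative:

```swift
import AVFoundation

final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    // Keep a strong reference; if the session is deallocated, the delegate never fires.
    let session = AVCaptureSession()
    private let sampleQueue = DispatchQueue(label: "sample.buffer.queue")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: sampleQueue) // must be a serial queue
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(output) { session.addOutput(output) }
        session.startRunning()
        NSLog("started")
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        NSLog("delegate method called") // mirrors the question's log
    }
}
```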

Seeking accurately, as opposed to two seconds short, in AVPlayer

北战南征 submitted on 2019-12-04 23:45:49
I'm using AVPlayer in a Cocoa app, and I've implemented a command that jumps to the end of the video. The problem is, AVPlayer doesn't seek to where I told it to. For example, one of the videos I have is 4 minutes and 14 seconds long. When I seek to the end, AVPlayer seeks to 4 minutes and 12 seconds, two seconds short. If I then hit play, the player plays for two seconds, then reaches the end. My first attempt was this: [self.player seekToTime:self.player.currentItem.duration]; I've since switched to this: [self.player seekToTime:self.player.currentItem.duration toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
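For completeness, the same zero-tolerance seek in Swift; by default, seeking is free to stop at the nearest keyframe, which is what produces the two-second gap:

```swift
import AVFoundation

// Sketch: passing zero tolerance forces frame-accurate seeking.
func seekToEnd(of player: AVPlayer) {
    guard let duration = player.currentItem?.duration else { return }
    player.seek(to: duration, toleranceBefore: .zero, toleranceAfter: .zero)
}
```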

Capture picture from video using AVFoundation

安稳与你 submitted on 2019-12-04 22:23:10
I am trying to capture a picture from a video on my iPad. I used Apple's AVCam example as a starting point. I was able to see the video in my application and to take pictures from it. My problem is that the pixel size of the resulting image is wrong. I want a fullscreen picture (1024x768), but I get a smaller one (1024x720). These are my instance variables: @property (retain) AVCaptureStillImageOutput *stillImageOutput; @property (retain) AVCaptureVideoPreviewLayer *previewLayer; @property (retain) AVCaptureSession *captureSession; @property (retain) AVCaptureConnection *captureConnection;
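One likely explanation is the session preset: video presets crop toward a video aspect ratio, while the photo preset delivers full-sensor 4:3 stills, matching 1024x768. A sketch in current Swift (the question's code is Objective-C, but the preset behaves the same):

```swift
import AVFoundation

// Sketch: still image size follows the session preset, not the preview layer.
func configureForFullResolutionStills(_ session: AVCaptureSession) {
    session.beginConfiguration()
    if session.canSetSessionPreset(.photo) {
        session.sessionPreset = .photo // full-sensor 4:3 stills
    }
    session.commitConfiguration()
}
```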

iPhone Mic volume

妖精的绣舞 submitted on 2019-12-04 21:55:33
Question: Is there any way to poll the mic input volume with AVFoundation? I have seen the Core Audio examples like SpeakHere, but I really only need the current value. Is Core Audio my only option?

Answer: You should check out SCListener. It's a simple way to get the average or peak level of the mic. http://github.com/stephencelis/sc_listener

Source: https://stackoverflow.com/questions/477020/iphone-mic-volume
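Plain AVFoundation can also report levels through AVAudioRecorder's metering, so Core Audio isn't strictly required. A sketch of that approach, recording to /dev/null so nothing is kept on disk; on iOS an AVAudioSession record category is assumed to be active first:

```swift
import AVFoundation

// Sketch: a metering-only recorder; poll its meters whenever a value is needed.
func makeLevelMeter() throws -> AVAudioRecorder {
    let settings: [String: Any] = [
        AVFormatIDKey: Int(kAudioFormatAppleLossless),
        AVSampleRateKey: 44100.0,
        AVNumberOfChannelsKey: 1
    ]
    let recorder = try AVAudioRecorder(url: URL(fileURLWithPath: "/dev/null"),
                                       settings: settings)
    recorder.isMeteringEnabled = true
    recorder.record()
    return recorder
}

// Whenever the current value is needed:
// recorder.updateMeters()
// let average = recorder.averagePower(forChannel: 0) // in dBFS
// let peak = recorder.peakPower(forChannel: 0)
```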

SPS values for an H.264 stream on iPhone

僤鯓⒐⒋嵵緔 submitted on 2019-12-04 21:48:21
Question: Can someone point me to documentation that will help me get correct SPS and PPS values for the iPhone?

Answer: The question is a bit unclear... The Picture Parameter Set is described in the latest ITU-T release of the standard in chapter 7.3.2.2; the Sequence Parameter Set is described in chapter 7.3.2.1. You can encode a single frame to a file and then extract the SPS and PPS from that file. I have an example that shows how to do exactly that at http://www.gdcl.co.uk/2013/02/20/iOS-Video-Encoding.html I am sure you know, but you can only save H.264-encoded video into a file (.mp4, .mov) on iOS; there is no direct access to the encoder's elementary stream.
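On later iOS releases this became simpler: once you have an encoded CMSampleBuffer (e.g. from a VideoToolbox compression session), Core Media can hand back the parameter sets directly, with no file parsing. A sketch, assuming index 0 holds the SPS and index 1 the PPS, as is typical for H.264:

```swift
import CoreMedia

// Sketch: pull SPS/PPS out of an encoded sample's format description.
func parameterSets(from sampleBuffer: CMSampleBuffer) -> (sps: Data, pps: Data)? {
    guard let desc = CMSampleBufferGetFormatDescription(sampleBuffer) else { return nil }
    var sps: UnsafePointer<UInt8>?; var spsSize = 0
    var pps: UnsafePointer<UInt8>?; var ppsSize = 0
    guard CMVideoFormatDescriptionGetH264ParameterSetAtIndex(
            desc, parameterSetIndex: 0, parameterSetPointerOut: &sps,
            parameterSetSizeOut: &spsSize, parameterSetCountOut: nil,
            nalUnitHeaderLengthOut: nil) == noErr,
          CMVideoFormatDescriptionGetH264ParameterSetAtIndex(
            desc, parameterSetIndex: 1, parameterSetPointerOut: &pps,
            parameterSetSizeOut: &ppsSize, parameterSetCountOut: nil,
            nalUnitHeaderLengthOut: nil) == noErr,
          let spsPtr = sps, let ppsPtr = pps else { return nil }
    return (Data(bytes: spsPtr, count: spsSize), Data(bytes: ppsPtr, count: ppsSize))
}
```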

Is it possible to overlay a GIF image on a video?

删除回忆录丶 submitted on 2019-12-04 21:16:54
Question: I am trying to combine a video with a GIF image. For this I am using MainCompositionInst.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer]; and in the video layer I set the GIF image, but unfortunately it does not animate. So my question is: is it possible to do this? Please advise. Thanks in advance.

Answer: Apple's support for GIF is fairly limited. You could use this code to convert from GIF to video (with the current code the GIF will be cropped to 480x480; for some resolutions the output may need adjustment).
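If the goal is animating the GIF over the video during export rather than converting it to video first, one approach is to decode the GIF frames with ImageIO and drive them with a CAKeyframeAnimation added to a layer inside parentLayer; a static image set on the layer will never animate in the export pipeline. A sketch, where `gifFrames` and `frameDuration` are assumed to come from the GIF decode:

```swift
import AVFoundation
import UIKit

// Sketch: a layer whose `contents` steps through the GIF frames during export.
func makeAnimatedGIFLayer(frames gifFrames: [CGImage],
                          frameDuration: CFTimeInterval,
                          frame: CGRect) -> CALayer {
    let layer = CALayer()
    layer.frame = frame
    let animation = CAKeyframeAnimation(keyPath: "contents")
    animation.values = gifFrames as [Any]
    animation.duration = frameDuration * CFTimeInterval(gifFrames.count)
    animation.calculationMode = .discrete      // hold each frame, no cross-fade
    animation.repeatCount = .infinity
    animation.beginTime = AVCoreAnimationBeginTimeAtZero // required on export timelines
    animation.isRemovedOnCompletion = false
    layer.add(animation, forKey: "gif")
    return layer
}
```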

Auto Focus and Auto Exposure in AVFoundation on Custom Camera Layer

允我心安 submitted on 2019-12-04 20:53:26
Question: What is the best way to create accurate auto focus and auto exposure for an AVFoundation custom-layer camera? For example, my camera preview layer is currently square, and I would like the focus and exposure to be specific to that frame's bounds. I need this in Swift 2 if possible; if not, please write your answer and I will be able to convert it myself. My current auto focus and exposure code is below, but as you can see it evaluates the entire view when focusing: override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?)
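A sketch of tap-to-focus in current Swift (the question asked for Swift 2, where the conversion call was named captureDevicePointOfInterestForPoint, but the flow is identical). The key is converting the touch point through the preview layer, which accounts for the square crop and the layer's videoGravity:

```swift
import AVFoundation
import UIKit

// Sketch: set focus/exposure at a tapped point in the preview layer.
// `previewLayer` and `device` are assumed to be the caller's properties.
func focusAndExpose(at layerPoint: CGPoint,
                    previewLayer: AVCaptureVideoPreviewLayer,
                    device: AVCaptureDevice) {
    // Maps the layer coordinate into the device's normalized (0,0)-(1,1) space.
    let devicePoint = previewLayer.captureDevicePointConverted(fromLayerPoint: layerPoint)
    do {
        try device.lockForConfiguration()
        if device.isFocusPointOfInterestSupported {
            device.focusPointOfInterest = devicePoint
            device.focusMode = .autoFocus
        }
        if device.isExposurePointOfInterestSupported {
            device.exposurePointOfInterest = devicePoint
            device.exposureMode = .autoExpose
        }
        device.unlockForConfiguration()
    } catch {
        // lockForConfiguration can fail if another client holds the device.
    }
}
```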