AVFoundation

Detecting AVPlayer video start/stop events

余生长醉 submitted on 2019-12-21 20:13:02
Question: Here is a nice, simple AVPlayer piece of code playing a small collection of videos in a queue. My question: I actually want to pause between the videos in my queue. Is that possible? I did note that rate fires twice; status fires just once, as does the notification. import UIKit import AVKit import AVFoundation class ViewController: UIViewController { @IBOutlet weak var VideoView: UIView! var player:AVQueuePlayer = AVQueuePlayer() @IBAction func NextSlide(sender: AnyObject) { player.play() } override func
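One way to get that pause-between-items behaviour (a minimal sketch, not the asker's code, assuming the AVQueuePlayer is already loaded with items): set actionAtItemEnd to .pause so the queue stops at the end of every item, then advance manually from the Next button.

import AVFoundation

// Minimal sketch (assumed type and method names): pause at the end of every
// item instead of auto-advancing, and move on only when the user taps Next.
final class QueuePauser {
    let player: AVQueuePlayer

    init(items: [AVPlayerItem]) {
        player = AVQueuePlayer(items: items)
        // .pause stops playback at the end of each item rather than advancing.
        player.actionAtItemEnd = .pause
    }

    // Call this from the Next button action.
    func playNext() {
        player.advanceToNextItem()
        player.play()
    }
}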

How do I save a video (MP4 format) using AVCaptureVideoDataOutput?

烈酒焚心 submitted on 2019-12-21 18:40:09
Question: I have set up the input and the output for the AVCaptureSession, and the delegate - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection is getting called. How do I convert the frames to an MP4 video file and save it? Answer 1: Use an AVAssetWriter to compress the data and write to MP4. These two samples contain code that does this: http://www.gdcl.co.uk/2013/02/20/iPhone-Pause.html http:/
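A minimal sketch of the AVAssetWriter approach the answer points to (assumed output URL and frame dimensions; the linked samples also handle audio and errors): feed the sample buffers from the capture delegate into a writer that produces an .mp4.

import AVFoundation

// Minimal sketch: append the buffers delivered to
// captureOutput:didOutputSampleBuffer:fromConnection: to an AVAssetWriter.
final class MP4Recorder {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private var sessionStarted = false

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ])
        input.expectsMediaDataInRealTime = true
        writer.add(input)
        writer.startWriting()
    }

    // Call from the capture delegate for every video sample buffer.
    func append(_ sampleBuffer: CMSampleBuffer) {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        if !sessionStarted {
            writer.startSession(atSourceTime: pts)
            sessionStarted = true
        }
        if input.isReadyForMoreMediaData {
            input.append(sampleBuffer)
        }
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}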

How to add external WebVTT subtitles into HTTP Live Stream on iOS client

大城市里の小女人 submitted on 2019-12-21 17:25:17
Question: We have videos encoded via bitmovin.com and provided as HTTP Live Streams (FairPlay HLS), but the subtitles, although in WebVTT format, are exposed separately as direct URLs for the whole file rather than individual segments, and are not part of the HLS m3u8 playlist. I am looking for a way for an external .vtt file, downloaded separately, to still be included in the HLS stream and be available as a subtitle in AVPlayer. I know Apple's recommendation is to include segmented VTT subtitles into the HLS
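AVPlayer only picks up subtitles that are declared inside the HLS playlists themselves, so one common workaround is to serve (or rewrite on the fly) a master playlist that adds a SUBTITLES rendition group pointing at the external WebVTT. A rough illustration of the extra entries, with hypothetical URIs and bandwidths:

#EXTM3U
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",NAME="English",LANGUAGE="en",DEFAULT=YES,AUTOSELECT=YES,URI="subtitles/en/prog_index.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=1280x720,SUBTITLES="subs"
video/720p/prog_index.m3u8

The referenced subtitle media playlist then lists the .vtt content, either segmented as Apple recommends or, at a pinch, as a single segment spanning the whole duration.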

ExposureMode AVCaptureExposureModeAutoExpose is not supported on iPhone

a 夏天 submitted on 2019-12-21 16:50:21
Question: I'm trying to handle the touch event and manually adjust the focus and exposure to fit the CGPoint a user has pressed. I'm taking the device object and using setFocusPointOfInterest and setExposurePointOfInterest to do the manipulation. The focus seems to work pretty well, but when I try to set the exposure mode to AVCaptureExposureModeAutoExpose it crashes with the cause: "Setting exposureMode (%d) is not supported by this device." At the beginning I thought it's
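The message is the expected behaviour when an exposure mode the hardware does not offer is set, so the usual fix is to gate every change behind the device's capability checks. A minimal sketch (assumed function name), falling back to continuous auto-exposure when .autoExpose is unavailable:

import AVFoundation
import CoreGraphics

// Minimal sketch: only apply focus/exposure settings the device reports as supported.
func setPointOfInterest(_ point: CGPoint, on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        if device.isFocusPointOfInterestSupported, device.isFocusModeSupported(.autoFocus) {
            device.focusPointOfInterest = point
            device.focusMode = .autoFocus
        }
        if device.isExposurePointOfInterestSupported {
            device.exposurePointOfInterest = point
            if device.isExposureModeSupported(.autoExpose) {
                device.exposureMode = .autoExpose
            } else if device.isExposureModeSupported(.continuousAutoExposure) {
                device.exposureMode = .continuousAutoExposure
            }
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}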

Why does CMSampleBufferGetImageBuffer return NULL

谁说胖子不能爱 submitted on 2019-12-21 13:40:28
Question: I have built some code to process video files on OS X, frame by frame. The following is an extract from the code, which builds OK, opens the file, locates the video track (the only track), and starts reading CMSampleBuffers without a problem. However, each CMSampleBufferRef I obtain returns NULL when I try to extract the pixel buffer frame. There's no indication in the iOS documentation as to why I should expect a NULL return value or how I could fix the issue. It happens with all the videos on
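A frequent cause of this is reading the track with nil outputSettings, which vends the original compressed samples; those carry a CMBlockBuffer, so CMSampleBufferGetImageBuffer returns NULL. A minimal sketch (assumed function name) that asks the reader for decoded BGRA frames instead:

import AVFoundation
import CoreVideo

// Minimal sketch: request decompressed pixel buffers from AVAssetReader.
func readFrames(from url: URL) throws {
    let asset = AVAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)
    let settings: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.add(output)
    reader.startReading()

    while let sample = output.copyNextSampleBuffer() {
        // With decode settings in place, the sample carries a CVImageBuffer.
        if let pixelBuffer = CMSampleBufferGetImageBuffer(sample) {
            _ = pixelBuffer  // ... process the frame here ...
        }
    }
}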

Weird behaviour of AVMutableComposition freezing while using AVMutableVideoComposition

馋奶兔 submitted on 2019-12-21 13:14:06
Question: I am trying to merge multiple videos using AVMutableComposition. The problem I face is that whenever I try to add any AVMutableVideoComposition to apply instructions, my playback freezes in AVPlayer at exactly 6 seconds. Another interesting thing is that it plays fine in the iPad's Photos app after exporting with the same videoComposition. So why does it freeze in AVPlayer at 6 seconds? Code: AVMutableComposition *mutableComposition = [AVMutableComposition composition
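A common reason for AVPlayer, but not export or the Photos app, to stall like this is a video composition whose instructions do not cover the full composition duration, for example an instruction that ends where the first clip does. A minimal sketch (assumed helper, in Swift rather than the asker's Objective-C) of instructions that tile the whole timeline:

import AVFoundation
import CoreGraphics

// Minimal sketch: one instruction spanning the entire composition, with a
// layer instruction per video track. Gaps or overlaps in instruction
// timeRanges are a classic cause of playback freezing part-way through.
func makeVideoComposition(for composition: AVMutableComposition,
                          renderSize: CGSize) -> AVMutableVideoComposition {
    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = renderSize
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: composition.duration)
    instruction.layerInstructions = composition.tracks(withMediaType: .video).map {
        AVMutableVideoCompositionLayerInstruction(assetTrack: $0)
    }
    videoComposition.instructions = [instruction]
    return videoComposition
}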

Progressive Video Download on iOS

你。 submitted on 2019-12-21 12:41:01
Question: I am trying to implement progressive downloading of a video in my iOS application so that it can be played through AVPlayer. I have already implemented a downloader module that can download the files to the iPad. However, I have discovered that I cannot play a file that is still being written to. So, as far as I can tell, my only solution would be to download a list of file 'chunks' and then keep playing through every file as it becomes ready (i.e. downloaded), probably using HLS. Searching, I have
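One route that is often suggested besides HLS is to point AVPlayer at a custom-scheme URL and hand it the already-downloaded byte ranges through AVAssetResourceLoaderDelegate. A minimal sketch with a hypothetical "progressive" scheme; the delegate body is only stubbed and would need to be wired to the downloader module:

import AVFoundation

// Minimal sketch: intercept the asset's loading so AVPlayer can start playing
// from partially downloaded data.
final class ProgressivePlayer: NSObject, AVAssetResourceLoaderDelegate {
    private let asset: AVURLAsset
    private(set) var player: AVPlayer!

    init(remoteURL: URL) {
        // Swap the scheme so AVFoundation asks our delegate instead of loading directly.
        var components = URLComponents(url: remoteURL, resolvingAgainstBaseURL: false)!
        components.scheme = "progressive"
        asset = AVURLAsset(url: components.url!)
        super.init()
        // The delegate must be set before the player item starts loading the asset.
        asset.resourceLoader.setDelegate(self, queue: DispatchQueue.main)
        player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
    }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // Fill loadingRequest.contentInformationRequest, respond to
        // loadingRequest.dataRequest with the byte ranges already on disk,
        // then call finishLoading() once the request is satisfied.
        return true
    }
}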

iOS Background audio recording

こ雲淡風輕ζ submitted on 2019-12-21 12:27:47
Question: I know that if I start an audio recording session in the foreground, with Audio, AirPlay, and Picture in Picture enabled in Capabilities -> Background Modes, I am able to continue recording even in the background, but only if I start the recording session in the foreground and then go into the background. My problem is that I want to start the voice recording session from the background, which might seem shady and not what Apple wants, but the use case is this: I have a Bluetooth LE device with buttons
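Assuming the audio and bluetooth-central background modes are enabled, the recording itself would be kicked off from the CoreBluetooth callback roughly as in the sketch below (hypothetical helper; whether iOS actually allows activating a recording session from the background in a given state is exactly the open question here):

import AVFoundation

// Minimal sketch: configure the shared audio session and start an
// AVAudioRecorder when the BLE button event arrives.
func startRecording(to url: URL) throws -> AVAudioRecorder {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [.allowBluetooth])
    try session.setActive(true)

    let settings: [String: Any] = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 44_100.0,
        AVNumberOfChannelsKey: 1
    ]
    let recorder = try AVAudioRecorder(url: url, settings: settings)
    recorder.record()
    return recorder
}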

Swift AVCaptureSession close/open button error: Multiple audio/video AVCaptureInputs are not currently supported

时光怂恿深爱的人放手 submitted on 2019-12-21 12:13:29
Question: I have working barcode scanner code. When I click the openCamera button the first time, everything is good. When I click the closeCamera button, still good, but if I click the openCamera button again it gives a fatal error. Code and error are below. In fact, is it possible to toggle the camera view with one button? // Barcode Camera Properties let captureSession = AVCaptureSession() var captureDevice:AVCaptureDevice? var captureLayer:AVCaptureVideoPreviewLayer? override func viewDidLoad() { super
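The fatal error usually means a second AVCaptureDeviceInput was added to a session that still holds the first one. A minimal sketch (simplified from the asker's properties) that removes inputs and outputs on close; alternatively, configure the session once and only toggle startRunning()/stopRunning():

import AVFoundation

let captureSession = AVCaptureSession()

// Minimal sketch: never add an input the session already has, and strip
// everything off when the camera view is closed.
func openCamera() {
    guard let device = AVCaptureDevice.default(for: .video),
          let input = try? AVCaptureDeviceInput(device: device) else { return }

    captureSession.beginConfiguration()
    if captureSession.canAddInput(input) {   // false while an input is still attached
        captureSession.addInput(input)
    }
    captureSession.commitConfiguration()
    captureSession.startRunning()
}

func closeCamera() {
    captureSession.stopRunning()
    captureSession.beginConfiguration()
    captureSession.inputs.forEach { captureSession.removeInput($0) }
    captureSession.outputs.forEach { captureSession.removeOutput($0) }
    captureSession.commitConfiguration()
}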

iPhone trim audio recording

左心房为你撑大大i submitted on 2019-12-21 06:44:39
Question: I have a voice memo component in my app, and I want to allow the user to trim the audio, similar to how QuickTime X on Mac OS X 10.6 handles it, or like the Voice Memos app on the iPhone. Here's an example of both: Any help is appreciated. Answer 1: I am not a UI programmer by any means. This was a test I wrote to see how to write custom controls. This code may or may not work. I have not touched it in some time. Header: @interface SUIMaxSlider : UIControl { @private float_t minimumValue; float
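The custom slider above only supplies the in and out points; the trim itself can be done separately, for example by exporting just that time range with AVAssetExportSession. A minimal sketch (hypothetical helper, not part of the answer's code):

import AVFoundation

// Minimal sketch: export only the selected portion of the recording to a new M4A file.
func trim(recordingAt sourceURL: URL, from start: TimeInterval, to end: TimeInterval,
          outputURL: URL, completion: @escaping (Bool) -> Void) {
    let asset = AVAsset(url: sourceURL)
    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetAppleM4A) else {
        completion(false)
        return
    }
    export.outputURL = outputURL
    export.outputFileType = .m4a
    export.timeRange = CMTimeRange(
        start: CMTime(seconds: start, preferredTimescale: 600),
        end: CMTime(seconds: end, preferredTimescale: 600))
    export.exportAsynchronously {
        completion(export.status == .completed)
    }
}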