AVKit

Disable gesture recognizer in AVPlayerViewController

Posted by 帅比萌擦擦* on 2019-12-11 04:47:55
Question: AVPlayerViewController has a built-in feature that stops playback of a video and closes the AVPlayerViewController when you swipe its view. I want to disable this feature. I guess I need to disable a gesture recognizer, but I don't know how to do this for the player.

Answer 1: I recently stumbled upon a similar problem. You can access the gesture recognizers from the contentView of the AVPlayerViewController. If you want to keep only the tap gesture recognizer, you might want to use a function …
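A minimal sketch of that approach, assuming the recognizers hang off the controller's private view hierarchy; the helper name is mine, and the traversal may break across iOS versions since Apple does not document these subviews:

```swift
import AVKit
import UIKit

/// Walks the AVPlayerViewController's view hierarchy and disables every
/// gesture recognizer except taps, so the transport controls still work
/// but swipe-to-dismiss does not.
func disableSwipeGestures(in playerViewController: AVPlayerViewController) {
    func disable(in view: UIView) {
        for recognizer in view.gestureRecognizers ?? [] where !(recognizer is UITapGestureRecognizer) {
            recognizer.isEnabled = false
        }
        view.subviews.forEach { disable(in: $0) }
    }
    disable(in: playerViewController.view)
}
```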

How to disable Picture in Picture mode for default video player

Posted by 两盒软妹~` on 2019-12-10 23:57:57
Question: How can I disable the Picture in Picture button/mode using Swift in iOS 9 on iPad when the user plays a video in my app?

Answer 1: A default AVPlayerLayer won't use PiP unless you use an AVPictureInPictureController. AVPlayerViewController has a property, allowsPictureInPicturePlayback, which you can set to false.

Source: https://stackoverflow.com/questions/32726385/how-to-disable-picture-in-picture-mode-for-default-video-player
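A minimal sketch of the answer's suggestion; the asset URL is a placeholder:

```swift
import AVKit

let videoURL = URL(string: "https://example.com/video.mp4")! // placeholder asset
let playerViewController = AVPlayerViewController()
playerViewController.player = AVPlayer(url: videoURL)
// Hide the Picture in Picture button and disable the mode entirely.
playerViewController.allowsPictureInPicturePlayback = false
```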

AVPlayer / AVPlayerLayer not appearing in subview using auto layout

Posted by 生来就可爱ヽ(ⅴ<●) on 2019-12-10 15:17:28
Question: I'm trying to display a video inside a subview that contains the video and some text. I used Apple's recommended UIView subclass to create a view that hosts my AVPlayer:

```swift
import Foundation
import AVFoundation

class JSAVPlayerView: UIView {
    var player: AVPlayer? {
        get { return (self.layer as! AVPlayerLayer).player }
        set(newPlayer) { (self.layer as! AVPlayerLayer).player = newPlayer }
    }

    override class func layerClass() -> AnyClass {
        return AVPlayerLayer.self
    }
}
```

I used …
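A usage sketch under the question's setup, assumed to run inside a view controller; note the layerClass() override above is Swift 2 syntax, which modern Swift spells as `override class var layerClass: AnyClass`. Because AVPlayerLayer is the view's backing layer, it tracks the view's Auto Layout frame automatically once constraints are active:

```swift
let videoURL = URL(string: "https://example.com/video.mp4")! // placeholder asset
let playerView = JSAVPlayerView()
playerView.translatesAutoresizingMaskIntoConstraints = false
view.addSubview(playerView)
NSLayoutConstraint.activate([
    playerView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
    playerView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
    playerView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
    // 16:9 video; adjust the multiplier for other aspect ratios.
    playerView.heightAnchor.constraint(equalTo: playerView.widthAnchor, multiplier: 9.0 / 16.0),
])
playerView.player = AVPlayer(url: videoURL)
playerView.player?.play()
```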

Square video using AVFoundation

Posted by 淺唱寂寞╮ on 2019-12-09 00:17:58
Question: I followed this tutorial to create a custom square video recording camera: http://www.netwalk.be/article/record-square-video-ios. I am able to export a square video, but when I try to play the newly exported file from its URL, it does not play. The original URL I tried worked fine and played properly. I opened the Documents directory on the phone and found that the video was being cropped and a file was created, but when playing the video in QuickTime Player it was …
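For reference, a sketch of the center-crop export that tutorial builds, with placeholder names (`asset`, `outputURL`). Two common reasons an exported file refuses to play are a stale file already sitting at the output URL and a missing preferredTransform adjustment for rotated footage, both flagged in comments:

```swift
import AVFoundation

func exportSquare(asset: AVAsset, to outputURL: URL,
                  completion: @escaping (Bool) -> Void) {
    guard let track = asset.tracks(withMediaType: .video).first else {
        completion(false); return
    }
    // Crop to the shorter side, centered. Real code should also fold in
    // track.preferredTransform for footage recorded in portrait.
    let side = min(track.naturalSize.width, track.naturalSize.height)

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = CGSize(width: side, height: side)
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    // Shift the track so the square render area is centered horizontally.
    let xOffset = (track.naturalSize.width - side) / 2
    layerInstruction.setTransform(CGAffineTransform(translationX: -xOffset, y: 0), at: .zero)
    instruction.layerInstructions = [layerInstruction]
    videoComposition.instructions = [instruction]

    // An export fails silently if a file already exists at the output URL.
    try? FileManager.default.removeItem(at: outputURL)

    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetHighestQuality) else {
        completion(false); return
    }
    export.videoComposition = videoComposition
    export.outputURL = outputURL
    export.outputFileType = .mov
    export.exportAsynchronously { completion(export.status == .completed) }
}
```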

AVURLAsset not loading video on documents folder, even using fileURLWithPath

Posted by 安稳与你 on 2019-12-07 06:05:38
Question: I've been struggling with this for the past couple of hours; hopefully someone has run into it before. I download a file from a server to my Documents folder. The file is there and valid (checked with iExplorer on the device and in the simulator's local directory). Moved to my desktop, each file plays without problems. The strange thing is that the exact same code works without issues when the same video file is added to the bundle. Code:

```swift
print("video url string : \(video.urlString)") // …
```
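One frequent cause, offered as an assumption about this setup: the iOS sandbox container path changes between builds and installs, so a persisted absolute path goes stale even though the file itself is fine. A sketch that rebuilds the file URL fresh on each launch; `fileName` is a placeholder:

```swift
import AVFoundation

func localVideoURL(fileName: String) -> URL {
    // Resolve the Documents directory each time instead of storing an
    // absolute path, which breaks when the sandbox container moves.
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    return documents.appendingPathComponent(fileName)
}

let asset = AVURLAsset(url: localVideoURL(fileName: "DemoVideo.mp4"))
print("playable:", asset.isPlayable) // quick sanity check before handing to AVPlayer
```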

How to add external WebVTT subtitles into HTTP Live Stream on iOS client

Posted by 浪子不回头ぞ on 2019-12-04 07:14:55
Question: We have videos encoded via bitmovin.com and provided as HTTP Live Streams (FairPlay HLS), but the subtitles, although in WebVTT format, are exposed separately as direct URLs to whole files; they are not segmented and are not part of the HLS m3u8 playlist. I am looking for a way that an external .vtt file, downloaded separately, can still be included in the HLS stream and become available as a subtitle track in AVPlayer. I know Apple's recommendation is to include segmented VTT subtitles in the HLS playlist, but I can't change the server implementation right now, so I want to clarify whether it is even possible.
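One approach that I believe works, hedged because it still requires serving (or locally proxying) a modified playlist: HLS allows a subtitle media playlist whose single "segment" is the entire .vtt file, so a rewritten master playlist can reference the external file without re-segmenting it. All URIs and durations below are placeholders:

```
#EXTM3U
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",NAME="English",LANGUAGE="en",DEFAULT=YES,AUTOSELECT=YES,URI="subs_en.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=2000000,SUBTITLES="subs"
video.m3u8
```

The referenced subs_en.m3u8 then wraps the whole VTT file as one long segment:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:600
#EXT-X-PLAYLIST-TYPE:VOD
#EXTINF:600.0,
https://example.com/subtitles_en.vtt
#EXT-X-ENDLIST
```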

Saving video from CMSampleBuffer while streaming using ReplayKit

Posted by 点点圈 on 2019-12-03 14:43:14
Question: I'm streaming the content of my app to my RTMP server using RPBroadcastSampleHandler. One of the methods is:

```swift
override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                  with sampleBufferType: RPSampleBufferType) {
    switch sampleBufferType {
    case .video:
        streamer.appendSampleBuffer(sampleBuffer, withType: .video)
        captureOutput(sampleBuffer)
    case .audioApp:
        streamer.appendSampleBuffer(sampleBuffer, withType: .audio)
        captureAudioOutput(sampleBuffer)
    case .audioMic:
        ()
    }
}
```

And the captureOutput method is:

```swift
self.lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
// Append …
```
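The usual way to save those buffers locally, shown as a sketch rather than the asker's actual code (class and property names are mine), is an AVAssetWriter fed from processSampleBuffer, with the session started at the first buffer's timestamp so the file does not begin with a blank gap:

```swift
import AVFoundation

enum WriterError: Error { case cannotStart }

final class SampleWriter {
    private let writer: AVAssetWriter
    private let videoInput: AVAssetWriterInput
    private var sessionStarted = false

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height,
        ])
        videoInput.expectsMediaDataInRealTime = true
        writer.add(videoInput)
        guard writer.startWriting() else { throw WriterError.cannotStart }
    }

    func append(_ sampleBuffer: CMSampleBuffer) {
        if !sessionStarted {
            // Anchor the file's timeline to the first incoming buffer.
            writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
            sessionStarted = true
        }
        if videoInput.isReadyForMoreMediaData {
            videoInput.append(sampleBuffer)
        }
    }

    func finish(completion: @escaping () -> Void) {
        videoInput.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```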

AVPlayerViewController doesn't play local videos

Posted by て烟熏妆下的殇ゞ on 2019-12-02 07:37:26
Question: I have added a DemoVideo.mp4 to my project and to the Copy Bundle Resources build phase, but when I run the app it doesn't play my video. Here is my method:

```swift
private func setUpAndPlayVideo() {
    guard let videoPath = Bundle.main.path(forResource: "DemoVideo.mp4", ofType: nil) else {
        return
    }
    let videoURL = NSURL(string: videoPath)
    let player = AVPlayer(url: videoURL! as URL)
    playerViewController = AVPlayerViewController()
    playerViewController.player = player
    playerViewController.view.frame = self.videoPlayerView.bounds
    self.videoPlayerView.addSubview(playerViewController.view)
    playerViewController…
```
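A likely culprit, offered as a hedged diagnosis: NSURL(string:) on a bare filesystem path yields a URL with no file:// scheme, which AVPlayer cannot open, and the snippet never calls play(). A corrected sketch of the same method:

```swift
private func setUpAndPlayVideo() {
    guard let videoPath = Bundle.main.path(forResource: "DemoVideo.mp4", ofType: nil) else { return }
    let videoURL = URL(fileURLWithPath: videoPath) // file:// URL, unlike NSURL(string:)
    let player = AVPlayer(url: videoURL)
    playerViewController = AVPlayerViewController()
    playerViewController.player = player
    playerViewController.view.frame = videoPlayerView.bounds
    videoPlayerView.addSubview(playerViewController.view)
    // For proper containment, also call addChild(playerViewController)
    // and playerViewController.didMove(toParent: self).
    player.play() // the original snippet never starts playback
}
```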

How can I get Camera Calibration Data on iOS? aka AVCameraCalibrationData

Posted by 做~自己de王妃 on 2019-12-02 07:15:40
Question: As I understand it, AVCameraCalibrationData is only available over AVCaptureDepthDataOutput. Is that correct? AVCaptureDepthDataOutput, on the other hand, is only accessible with the iPhone X front camera or the iPhone Plus back camera, or am I mistaken? What I am trying to do is get the FOV of an AVCaptureVideoDataOutput sample buffer. In particular, it should match the selected preset (full HD, Photo, etc.).

Answer 1 (rickster): You can get AVCameraCalibrationData only from depth data output or photo output. However, if all you need is FOV, you need only part of the info that class offers — the camera intrinsics matrix — …
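If per-frame intrinsics are enough, iOS 11 added intrinsic-matrix delivery on the capture connection. A sketch with the capture session setup assumed elsewhere; the horizontal FOV then follows as 2 * atan(width / (2 * fx)), where fx is the matrix's top-left entry:

```swift
import AVFoundation
import simd

// Opt in to per-frame intrinsics on the video connection (iOS 11+).
func enableIntrinsics(on output: AVCaptureVideoDataOutput) {
    if let connection = output.connection(with: .video),
       connection.isCameraIntrinsicMatrixDeliverySupported {
        connection.isCameraIntrinsicMatrixDeliveryEnabled = true
    }
}

// Read the intrinsics matrix attached to each delivered sample buffer.
func intrinsics(from sampleBuffer: CMSampleBuffer) -> matrix_float3x3? {
    guard let attachment = CMGetAttachment(
        sampleBuffer,
        key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix,
        attachmentModeOut: nil) as? Data else { return nil }
    return attachment.withUnsafeBytes { $0.load(as: matrix_float3x3.self) }
}
```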