AVFoundation

AVCaptureMovieFileOutput never calls delegate on screen recording

一世执手 submitted on 2021-01-29 08:51:08
Question: I was able to assemble a minimal Swift demo of screen recording while debugging another issue, but something struck me: AVFoundation never calls my recording delegate. None of the delegate methods, in fact. The code is fairly straightforward: class ExampleRecorder: NSObject, AVCaptureFileOutputRecordingDelegate { private var session: AVCaptureSession?; private var fileOut: AVCaptureMovieFileOutput?; func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: …
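The excerpt cuts off mid-signature, but a minimal working shape can be sketched. This is an assumption-laden reconstruction, not the asker's actual code: the `start(to:)` helper, the use of `AVCaptureScreenInput` (macOS-only), and the display ID are all filled in by the editor. Two frequent causes of "the delegate is never called" are starting the recording before the session is running, and letting the recorder object be deallocated before recording finishes.

```swift
import AVFoundation
import CoreGraphics

class ExampleRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    private let session = AVCaptureSession()
    private let fileOut = AVCaptureMovieFileOutput()

    func start(to url: URL) {
        // AVCaptureScreenInput is macOS-only; CGMainDisplayID() is the main display.
        guard let input = AVCaptureScreenInput(displayID: CGMainDisplayID()),
              session.canAddInput(input), session.canAddOutput(fileOut) else { return }
        session.addInput(input)
        session.addOutput(fileOut)
        session.startRunning()
        // The delegate fires only if recording actually starts, and only if
        // `self` (the delegate) is still alive when the file is finalized.
        fileOut.startRecording(to: url, recordingDelegate: self)
    }

    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        print("finished:", outputFileURL, error ?? "no error")
    }
}
```

On macOS 10.15+, also note that screen recording requires the user to grant the Screen Recording permission; without it the session silently produces nothing.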

Converting TrueDepth data to grayscale image produces distorted image

梦想的初衷 submitted on 2021-01-29 07:37:35
Question: I'm getting depth data from the TrueDepth camera and converting it to a grayscale image. (I realize I could pass the AVDepthData to a CIImage constructor; however, for testing purposes I want to make sure my array is populated correctly, so manually constructing an image lets me verify that.) I notice that when I convert to grayscale, I get strange results: the image appears in the top half, and the bottom half is distorted (sometimes showing the …
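A "top half correct, bottom half distorted" result is the classic symptom of ignoring row padding: a CVPixelBuffer's rows are often padded, so you must step by `CVPixelBufferGetBytesPerRow`, not `width * 4`. A minimal sketch of a padding-aware conversion (the 0–5 m normalization range is an arbitrary assumption; adjust for the scene):

```swift
import AVFoundation
import CoreGraphics

// Converts a DepthFloat32 depth map to an 8-bit grayscale CGImage.
func grayscaleImage(from depthData: AVDepthData) -> CGImage? {
    let buffer = depthData
        .converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
        .depthDataMap
    CVPixelBufferLockBaseAddress(buffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

    let width = CVPixelBufferGetWidth(buffer)
    let height = CVPixelBufferGetHeight(buffer)
    let rowBytes = CVPixelBufferGetBytesPerRow(buffer)   // NOT width * 4
    guard let base = CVPixelBufferGetBaseAddress(buffer) else { return nil }

    var pixels = [UInt8](repeating: 0, count: width * height)
    for y in 0..<height {
        // Advance by the buffer's real stride, then read the row as Float32.
        let row = (base + y * rowBytes).assumingMemoryBound(to: Float32.self)
        for x in 0..<width {
            let depth = row[x]
            // Depth samples can be NaN/inf; clamp into an assumed 0–5 m range.
            let normalized = depth.isFinite ? min(max(depth / 5.0, 0), 1) : 0
            pixels[y * width + x] = UInt8(normalized * 255)
        }
    }

    var image: CGImage?
    pixels.withUnsafeMutableBytes { raw in
        let ctx = CGContext(data: raw.baseAddress, width: width, height: height,
                            bitsPerComponent: 8, bytesPerRow: width,
                            space: CGColorSpaceCreateDeviceGray(),
                            bitmapInfo: CGImageAlphaInfo.none.rawValue)
        image = ctx?.makeImage()
    }
    return image
}
```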

AVAudioEngine no sound

时光总嘲笑我的痴心妄想 submitted on 2021-01-29 07:28:54
Question: I have been trying for a while to figure out how to stream real-time audio in iOS from Data delivered by a URLSessionDataTask. I have declared a custom class to manage the player actions, and it looks like this: import UIKit import AVFoundation class AudioDataPlayer: NSObject { //MARK:- Variables //MARK: Constants enum Status{ case playing case notPlaying } let audioPlayerQueue = DispatchQueue(label: "audioPlayerQueue", qos: DispatchQoS.userInteractive) //MARK: Vars private (set) var currentStatus …
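The class is truncated before the engine setup, which is where "no sound" problems usually live. A minimal sketch of the buffer-scheduling approach, with heavy assumptions: it assumes the incoming `Data` is already raw mono Float32 PCM matching the format below; a compressed stream (MP3/AAC) would first need decoding, e.g. via AVAudioConverter. The class name matches the excerpt but the body is the editor's:

```swift
import AVFoundation

class AudioDataPlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    // Mono Float32, non-interleaved — the "standard" format.
    private let format = AVAudioFormat(standardFormatWithSampleRate: 44100,
                                       channels: 1)!

    init() throws {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        // A common cause of silence: connecting nodes but never starting the
        // engine, or calling player.play() before engine.start().
        try engine.start()
        player.play()
    }

    // Call from the URLSessionDataTask delegate as chunks arrive.
    func schedule(_ data: Data) {
        let bytesPerFrame = format.streamDescription.pointee.mBytesPerFrame
        let frameCount = UInt32(data.count) / bytesPerFrame
        guard frameCount > 0,
              let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: frameCount) else { return }
        buffer.frameLength = frameCount
        data.withUnsafeBytes { raw in
            memcpy(buffer.floatChannelData![0], raw.baseAddress!,
                   Int(frameCount * bytesPerFrame))
        }
        player.scheduleBuffer(buffer, completionHandler: nil)
    }
}
```

On iOS, also remember to set and activate an AVAudioSession with a `.playback` category before starting the engine.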

Audio Missing when Adding Text on pixelBuffer

牧云@^-^@ submitted on 2021-01-28 08:18:26
Question: I am trying to add a text overlay on video. When recording on an iPhone 5s or older device in high quality and writing text on the frames, the audio goes missing after 1 or 2 seconds. This doesn't happen on faster devices like the iPhone 6/6s. If I remove the text-writing method it works properly on all devices, and if I reduce the video quality on the 5s it also works fine. How can I get video with audio on the iPhone 5s with overlay text? Here is my code: import Foundation import AVFoundation import …
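The symptoms (fails only on slow devices, only at high quality, only with the drawing step) suggest the per-frame text drawing is stalling the capture queue so long that audio samples get dropped. A hedged sketch of the usual guard pattern, where it is the *video* frame that gets dropped when the writer falls behind, never the audio; all names here are the editor's, not the asker's:

```swift
import AVFoundation

// Capture-output handler pattern: never block audio on slow video drawing.
func handle(sampleBuffer: CMSampleBuffer,
            isVideo: Bool,
            videoInput: AVAssetWriterInput,
            audioInput: AVAssetWriterInput,
            drawText: (CVPixelBuffer) -> Void) {
    if isVideo {
        // If the writer is busy, drop this video frame instead of stalling
        // the shared capture queue (a stall is what starves the audio).
        guard videoInput.isReadyForMoreMediaData,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        else { return }
        drawText(pixelBuffer)          // the overlay step from the question
        videoInput.append(sampleBuffer)
    } else if audioInput.isReadyForMoreMediaData {
        audioInput.append(sampleBuffer)
    }
}
```

An alternative worth considering: skip per-frame drawing entirely and burn the text in at export time with `AVVideoCompositionCoreAnimationTool`, which moves the cost off the capture path.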

AVPlayerItemVideoOutput copyPixelBuffer always returns 1280x720

喜夏-厌秋 submitted on 2021-01-27 21:56:43
Question: I'm instantiating the AVPlayerItemVideoOutput like so: let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [String(kCVPixelBufferPixelFormatTypeKey): NSNumber(value: kCVPixelFormatType_32BGRA)]) And retrieving the pixel buffers like this: @objc func displayLinkDidRefresh(link: CADisplayLink) { let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime()) if videoOutput.hasNewPixelBuffer(forItemTime: itemTime) { if let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: …
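The callback is cut off mid-call; a completed version of the same loop is sketched below, written as a free function so it compiles standalone (in the asker's code `videoOutput` is a stored property instead). One point worth checking before suspecting `copyPixelBuffer`: for HLS content, the buffer dimensions track whichever variant the player is currently decoding, so a constant 1280x720 may simply mean the player settled on a 720p rendition.

```swift
import AVFoundation
import QuartzCore

// Completed display-link callback: pull the newest frame and inspect its size.
func displayLinkDidRefresh(link: CADisplayLink,
                           videoOutput: AVPlayerItemVideoOutput) {
    let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())
    if videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
       let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                     itemTimeForDisplay: nil) {
        let w = CVPixelBufferGetWidth(pixelBuffer)
        let h = CVPixelBufferGetHeight(pixelBuffer)
        print("got \(w)x\(h) buffer at t = \(CMTimeGetSeconds(itemTime))")
    }
}
```

For local files, also confirm the output was added to the AVPlayerItem (`item.add(videoOutput)`) after the item reached `.readyToPlay`.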

Can AVAudioSession be activated in Background state without MixWithOthers?

☆樱花仙子☆ submitted on 2021-01-27 17:28:46
Question: My app is running in the background because CoreLocation has requested Always authorization. This prevents the app from reaching the suspended state: it stays in the background and receives location events. After some location events I would like to activate an AVAudioSession and play some sounds. How do I activate the session properly (in this background app) while another app in the foreground is playing audio? Suppose I'm watching a video in YouTube …
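A sketch of the activation step under these constraints. With a plain `.playback` category and no options, activating the session would interrupt the foreground app's audio, and iOS may refuse the activation from a backgrounded app; `.duckOthers` (or `.mixWithOthers`) is the usual compromise. This is a generic pattern, not a guarantee that background activation will succeed in every state:

```swift
import AVFoundation

// Activate the session just before playing, deactivate right after.
func playAlertSound(play: () -> Void) {
    let session = AVAudioSession.sharedInstance()
    do {
        // .duckOthers lowers the foreground app's volume instead of
        // stopping it, which also makes background activation more likely
        // to be permitted than an exclusive, interrupting session.
        try session.setCategory(.playback, mode: .default,
                                options: [.duckOthers])
        try session.setActive(true)
        play()   // AVAudioPlayer / AVAudioEngine playback goes here
    } catch {
        print("could not activate session:", error)
    }
}
```

When playback finishes, call `session.setActive(false, options: [.notifyOthersOnDeactivation])` so the foreground app's audio returns to full volume.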

rtsp:// liveStream with AVPlayer

两盒软妹~` submitted on 2021-01-27 10:33:05
Question: I want to play a live stream on an iPhone device with AVPlayer. I also want to get a CVPixelBufferRef from this stream for later use. I used the Apple guide for creating the player. With locally stored video files this player works just fine, and when I try to play the Apple sample stream URL - http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8 - it works fine too. Problems appear when I want to play a stream over rtsp://, like this one: rtsp://192.192.168.1:8227/TTLS/Streaming/channels/2 …
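The observed split (HLS works, RTSP doesn't) matches AVPlayer's capabilities: AVFoundation has no RTSP demuxer, so an rtsp:// URL will never become playable; the usual options are an FFmpeg-based player (e.g. MobileVLCKit) or restreaming the source as HLS server-side. A quick way to confirm up front, sketched with the async property-loading API (iOS 15+):

```swift
import AVFoundation

// Check whether AVFoundation can play a URL before building a player for it.
func checkPlayable(_ url: URL) async {
    let asset = AVURLAsset(url: url)
    let playable = (try? await asset.load(.isPlayable)) ?? false
    // Expect false for rtsp:// sources: AVPlayer does not support RTSP.
    print("\(url) playable:", playable)
}
```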

GPUImageMovieWriter - occasional black frames at either ends of the recorded video

|▌冷眼眸甩不掉的悲伤 submitted on 2021-01-27 07:07:52
Question: I have a recording-app implementation where the user can tap the "record" button to start/stop recording. I achieve this with a basic GPUImageVideoCamera whose output is set to a GPUImageView as well as a GPUImageMovieWriter. About 50% of the time, the recorded clip ends up with a couple of black frames (or a single one) at either end, sometimes both. The implementation is fairly straightforward, but here it is anyway: gpuImageView = [[GPUImageView alloc] initWithFrame:cameraView.frame]; gpuImageView.fillMode = …
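Leading black frames with GPUImageMovieWriter are commonly a race between the writer starting and the camera pipeline delivering its first real frame. One frequently suggested mitigation, sketched here in Swift with the editor's own names and an arbitrary warm-up delay (a heuristic, not a guaranteed fix):

```swift
import GPUImage
import AVFoundation

// Attach the writer, then delay startRecording until the camera pipeline
// has had a moment to deliver real frames; starting the writer against a
// still-warming camera is a frequent source of leading black frames.
func startRecording(camera: GPUImageVideoCamera,
                    movieWriter: GPUImageMovieWriter) {
    camera.addTarget(movieWriter)
    camera.audioEncodingTarget = movieWriter
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.3) {
        movieWriter.startRecording()
    }
}
```

For trailing black frames, the analogous advice is to remove the writer as a camera target only from inside `finishRecording`'s completion handler, so no empty frames are encoded after the last real one.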

AVMutableComposition Video Black at Start

别来无恙 submitted on 2021-01-27 04:51:09
Question: I'm using AVMutableComposition and AVAssetExportSession to trim a video down. Randomly, and I mean randomly (I cannot reproduce it consistently), users' videos have a few black frames at the start of the trimmed video. The audio is unaffected. I can confirm 100% that the source videos have nothing to do with it, as this happens for a wide variety of videos from all different sources. Any insight into why these videos are exported with black frames at the start would be very …
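One plausible explanation for the randomness: in many source files the video track starts slightly after t = 0 (audio leads video), so a trim that begins before the video track's own start produces a span with audio but no video, which renders black. A hedged sketch of clamping the trim to the video track's start (one candidate cause among several, not a confirmed diagnosis):

```swift
import AVFoundation

// Clamp the requested trim start to the video track's actual start time,
// so the composition never contains a region with audio but no video.
func trimRange(for asset: AVAsset,
               requestedStart: CMTime,
               duration: CMTime) -> CMTimeRange {
    guard let videoTrack = asset.tracks(withMediaType: .video).first else {
        return CMTimeRange(start: requestedStart, duration: duration)
    }
    let start = CMTimeMaximum(requestedStart, videoTrack.timeRange.start)
    return CMTimeRange(start: start, duration: duration)
}
```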