CMSampleBuffer

Play audio from CMSampleBuffer

Submitted by 百般思念 on 2021-02-18 10:47:29
Question: I have created a group video chat app for iOS. I have been searching for a way to control the audio volume of each participant separately. I found a way to mute and unmute using isPlaybackEnabled on RemoteAudioTrack, but not to control volume. I also wondered whether the audio could be played through AVAudioPlayer. I found addSink. This is what I tried from there: class Audio: NSObject, AudioSink { var a = 1 func renderSample(_ audioSample: CMSampleBuffer!) { print("audio found", a) a += 1 var
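Since the sink receives the decoded audio before playback, one approach is to scale the PCM samples in place. The sketch below assumes the sink is handed interleaved 16-bit linear PCM in the CMSampleBuffer (worth verifying against the format description); the class name and volume property are illustrative, not part of the Twilio API.

```swift
import AVFoundation

// Sketch of a per-participant volume control via an AudioSink.
// Assumes interleaved 16-bit linear PCM; check the buffer's format
// description before relying on this in production.
class VolumeSink: NSObject, AudioSink {
    var volume: Float = 0.5  // 0.0 ... 1.0, illustrative property

    func renderSample(_ audioSample: CMSampleBuffer!) {
        guard let blockBuffer = CMSampleBufferGetDataBuffer(audioSample) else { return }
        var length = 0
        var dataPointer: UnsafeMutablePointer<Int8>?
        guard CMBlockBufferGetDataPointer(blockBuffer,
                                          atOffset: 0,
                                          lengthAtOffsetOut: nil,
                                          totalLengthOut: &length,
                                          dataPointerOut: &dataPointer) == kCMBlockBufferNoErr,
              let base = dataPointer else { return }
        // Scale each 16-bit sample by the desired volume.
        base.withMemoryRebound(to: Int16.self, capacity: length / 2) { samples in
            for i in 0..<(length / 2) {
                samples[i] = Int16(Float(samples[i]) * volume)
            }
        }
    }
}
```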

How do I convert a CVPixelBuffer / CVImageBuffer to Data?

Submitted by 五迷三道 on 2021-02-10 15:57:14
Question: My camera app captures a photo, enhances it in a certain way, and saves it. To do so, I get the input image from the camera in the form of a CVPixelBuffer (wrapped in a CMSampleBuffer). I perform some modifications on the pixel buffer, and I then want to convert it to a Data object. How do I do this? Note that I don't want to convert the pixel buffer / image buffer to a UIImage or CGImage, since those don't carry metadata (like EXIF). I need a Data object. How do I get one from a CVPixelBuffer
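The raw bytes of a pixel buffer can be copied into Data by locking the base address, as in this sketch. Note that this gives raw pixel bytes with no EXIF; to produce image-file data that can carry metadata you would typically go through CGImageDestination or AVCapturePhoto's fileDataRepresentation instead.

```swift
import CoreVideo
import Foundation

// One way to get the raw bytes of a CVPixelBuffer into a Data object.
// If bytesPerRow > width * bytesPerPixel, this copies row padding too;
// copy row by row if a tightly packed buffer is needed.
func pixelBufferData(_ pixelBuffer: CVPixelBuffer) -> Data? {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }
    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
    let size = CVPixelBufferGetDataSize(pixelBuffer)
    return Data(bytes: base, count: size)
}
```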

Audio Missing when Adding Text on pixelBuffer

Submitted by 牧云@^-^@ on 2021-01-28 08:18:26
Question: I am trying to add a text overlay on video. When recording on an iPhone 5s or older device in high quality and writing text onto the frames, the audio goes missing after 1 or 2 seconds. This doesn't happen on newer devices like the iPhone 6/6s. If I remove the text-writing method it works properly on all devices, and if I reduce the video quality on the 5s it also works fine. How can I get video with audio on an iPhone 5s with overlay text? Here is my code: import Foundation import AVFoundation import
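Symptoms like this (only slow devices, only at high quality, fixed by removing the drawing step) usually point to per-frame text layout stalling the capture callback long enough that audio buffers get dropped. One mitigation, sketched below rather than taken from the poster's code, is to render the text once into a CGImage and only blit it onto each frame; it assumes BGRA pixel buffers, and the class name is illustrative.

```swift
import AVFoundation
import CoreGraphics
import UIKit

// Sketch: pre-render overlay text once, then composite the cached image
// onto each incoming pixel buffer. Assumes 32-bit BGRA pixel buffers.
final class TextOverlayRenderer {
    private let overlay: CGImage?

    init(text: String, size: CGSize) {
        // Render the text a single time into a CGImage.
        let renderer = UIGraphicsImageRenderer(size: size)
        let image = renderer.image { _ in
            let attrs: [NSAttributedString.Key: Any] = [
                .font: UIFont.boldSystemFont(ofSize: 32),
                .foregroundColor: UIColor.white
            ]
            (text as NSString).draw(at: CGPoint(x: 16, y: 16), withAttributes: attrs)
        }
        overlay = image.cgImage
    }

    func draw(on pixelBuffer: CVPixelBuffer) {
        guard let overlay = overlay else { return }
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }
        guard let context = CGContext(
            data: CVPixelBufferGetBaseAddress(pixelBuffer),
            width: CVPixelBufferGetWidth(pixelBuffer),
            height: CVPixelBufferGetHeight(pixelBuffer),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue
                | CGImageAlphaInfo.premultipliedFirst.rawValue) else { return }
        context.draw(overlay, in: CGRect(x: 0, y: 0,
                                         width: overlay.width, height: overlay.height))
    }
}
```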

Saving a screen recording with RPScreenRecorder start capture

Submitted by 戏子无情 on 2020-05-28 05:30:07
Question: I am attempting to use RPScreenRecorder.shared().startCapture to save a screen recording to Firebase. I know how to save videos from AVCapture, but I can't figure out how to process the CMSampleBuffer to create a file to save to Firebase. Please help; I can't find documentation on this anywhere. Here is the method call: let recorder = RPScreenRecorder.shared() if #available(iOS 11.0, *) { recorder.startCapture(handler: { (videoBuffer, bufferType, error) in print(videoBuffer) print(bufferType) }
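The usual route is to feed the ReplayKit sample buffers into an AVAssetWriter and upload the resulting file. The sketch below shows the writer plumbing under assumed settings (file name, resolution, codec are illustrative); the upload-to-Firebase step itself is omitted.

```swift
import ReplayKit
import AVFoundation

// Sketch: write ReplayKit sample buffers to an .mp4 with AVAssetWriter.
// The finished file at outputURL can then be uploaded (e.g. to Firebase
// Storage). Path and video settings here are illustrative.
let outputURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("capture.mp4")
let writer = try! AVAssetWriter(outputURL: outputURL, fileType: .mp4)
let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 1920,
    AVVideoHeightKey: 1080
])
videoInput.expectsMediaDataInRealTime = true
writer.add(videoInput)

RPScreenRecorder.shared().startCapture(handler: { sampleBuffer, bufferType, error in
    guard error == nil, bufferType == .video,
          CMSampleBufferDataIsReady(sampleBuffer) else { return }
    if writer.status == .unknown {
        // Start the session at the first buffer's timestamp.
        writer.startWriting()
        writer.startSession(atSourceTime:
            CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
    }
    if writer.status == .writing, videoInput.isReadyForMoreMediaData {
        videoInput.append(sampleBuffer)
    }
}, completionHandler: { error in
    if let error = error { print("capture failed:", error) }
})
// Later: RPScreenRecorder.shared().stopCapture { _ in
//     videoInput.markAsFinished()
//     writer.finishWriting { /* upload outputURL */ }
// }
```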

CMSampleBufferGetDataBuffer() returns nil value - Cocoa Swift

Submitted by 家住魔仙堡 on 2020-04-30 08:23:06
Question: I am trying to capture my system's screen and process the data, but I get a nil value from CMSampleBufferGetDataBuffer for the sample buffer I receive in the captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) delegate method. Any idea? Below is my code: import Cocoa import AVFoundation class ViewController: NSViewController, AVCaptureVideoDataOutputSampleBufferDelegate { private lazy var sampleBufferDelegateQueue = DispatchQueue(label:
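The nil is expected here: uncompressed video frames from AVCaptureVideoDataOutput carry their pixels in a CVImageBuffer rather than a CMBlockBuffer, so CMSampleBufferGetDataBuffer has nothing to return. A sketch of the delegate using CMSampleBufferGetImageBuffer instead:

```swift
import AVFoundation

// For uncompressed video frames, use CMSampleBufferGetImageBuffer;
// CMSampleBufferGetDataBuffer is for block-buffer-backed (e.g. compressed
// or audio) sample buffers and returns nil for these frames.
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    CVPixelBufferLockBaseAddress(imageBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(imageBuffer, .readOnly) }
    let width = CVPixelBufferGetWidth(imageBuffer)
    let height = CVPixelBufferGetHeight(imageBuffer)
    // ... read and process the pixel data here ...
    print("frame \(width)x\(height)")
}
```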

Creating copy of CMSampleBuffer in Swift returns OSStatus -12743 (Invalid Media Format)

Submitted by ╄→гoц情女王★ on 2020-01-24 05:46:05
Question: I am attempting to perform a deep clone of a CMSampleBuffer to store the output of an AVCaptureSession. I am receiving the error kCMSampleBufferError_InvalidMediaFormat (OSStatus -12743) when I run the function CMSampleBufferCreateForImageBuffer. I don't see how I've mismatched the CVImageBuffer and the CMSampleBuffer format description. Anyone know where I've gone wrong? Here is my test code. func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer
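A common cause of -12743 in this situation is pairing a newly created pixel buffer with the original buffer's format description; CMSampleBufferCreateForImageBuffer rejects the combination if they don't match exactly. A sketch of a deep copy that creates a fresh format description from the copied pixel buffer:

```swift
import AVFoundation

// Sketch of a deep copy of a video CMSampleBuffer. The key step is (2):
// the format description must be created from the *copied* pixel buffer.
func deepCopy(_ sampleBuffer: CMSampleBuffer) -> CMSampleBuffer? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    // 1. Copy the pixel data into a new CVPixelBuffer. The single memcpy
    //    assumes matching bytes-per-row; copy per plane/row for robustness.
    var copy: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(pixelBuffer),
                        CVPixelBufferGetHeight(pixelBuffer),
                        CVPixelBufferGetPixelFormatType(pixelBuffer),
                        CVBufferGetAttachments(pixelBuffer, .shouldPropagate),
                        &copy)
    guard let pixelCopy = copy else { return nil }
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    CVPixelBufferLockBaseAddress(pixelCopy, [])
    memcpy(CVPixelBufferGetBaseAddress(pixelCopy),
           CVPixelBufferGetBaseAddress(pixelBuffer),
           CVPixelBufferGetDataSize(pixelBuffer))
    CVPixelBufferUnlockBaseAddress(pixelCopy, [])
    CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)

    // 2. Make a format description that matches the copy.
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelCopy,
                                                 formatDescriptionOut: &formatDescription)
    guard let format = formatDescription else { return nil }

    // 3. Wrap it in a new CMSampleBuffer with the original timing.
    var timing = CMSampleTimingInfo()
    CMSampleBufferGetSampleTimingInfo(sampleBuffer, at: 0, timingInfoOut: &timing)
    var result: CMSampleBuffer?
    CMSampleBufferCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                       imageBuffer: pixelCopy,
                                       dataReady: true,
                                       makeDataReadyCallback: nil,
                                       refcon: nil,
                                       formatDescription: format,
                                       sampleTiming: &timing,
                                       sampleBufferOut: &result)
    return result
}
```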

How to fill audio AVFrame (ffmpeg) with the data obtained from CMSampleBufferRef (AVFoundation)?

Submitted by 那年仲夏 on 2020-01-13 09:44:07
Question: I am writing a program for streaming live audio and video from a webcam to an RTMP server. I work on Mac OS X 10.8, so I use the AVFoundation framework for obtaining audio and video frames from input devices. These frames come into the delegate: -(void) captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer: (CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection , where sampleBuffer contains audio or video data. When I receive audio data in the sampleBuffer , I'm
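The AVFoundation half of this is extracting the raw PCM bytes and the stream format from the sample buffer; those are what get copied into the AVFrame's data planes and format fields. A Swift sketch of that extraction (the ffmpeg side is omitted; assumes interleaved linear PCM):

```swift
import AVFoundation

// Sketch: pull raw PCM bytes and the stream format out of an audio
// CMSampleBuffer — the inputs needed to fill an ffmpeg AVFrame.
func pcmBytes(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return nil }
    var length = 0
    var dataPointer: UnsafeMutablePointer<Int8>?
    guard CMBlockBufferGetDataPointer(blockBuffer,
                                      atOffset: 0,
                                      lengthAtOffsetOut: nil,
                                      totalLengthOut: &length,
                                      dataPointerOut: &dataPointer) == kCMBlockBufferNoErr,
          let base = dataPointer else { return nil }
    // Sample rate, channel count, and bit depth come from the format
    // description; they determine the AVFrame's sample_rate/channels/format.
    if let desc = CMSampleBufferGetFormatDescription(sampleBuffer),
       let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(desc)?.pointee {
        print("rate:", asbd.mSampleRate, "channels:", asbd.mChannelsPerFrame)
    }
    return Data(bytes: base, count: length)
}
```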

Adding filters to video with AVFoundation (OSX) - how do I write the resulting image back to AVWriter?

Submitted by 假装没事ソ on 2020-01-10 02:07:16
Question: Setting the scene: I am working on a video processing app that runs from the command line to read in, process, and then export video. I'm working with 4 tracks: lots of clips that I append into a single track to make one video (let's call this the ugcVideoComposition); clips with alpha which get positioned on a second track and, using layer instructions, are composited on export to play back over the top of the ugcVideoComposition; a music audio track; and an audio track for the
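For the writing-back half of the question, one common pattern is to render the filtered CIImage into a pixel buffer drawn from the writer input's pixel buffer pool and append it through an AVAssetWriterInputPixelBufferAdaptor. A sketch, assuming the writer, input, and adaptor are already configured (their names here are illustrative):

```swift
import AVFoundation
import CoreImage

// Sketch: render a filtered frame into a pool-provided pixel buffer and
// append it to the writer, preserving the source presentation time.
func writeFiltered(sampleBuffer: CMSampleBuffer,
                   filter: CIFilter,
                   adaptor: AVAssetWriterInputPixelBufferAdaptor,
                   context: CIContext) {
    guard let source = CMSampleBufferGetImageBuffer(sampleBuffer),
          let pool = adaptor.pixelBufferPool else { return }

    filter.setValue(CIImage(cvPixelBuffer: source), forKey: kCIInputImageKey)
    guard let output = filter.outputImage else { return }

    var destination: CVPixelBuffer?
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &destination)
    guard let dest = destination else { return }

    // Render the filtered CIImage into the destination pixel buffer.
    context.render(output, to: dest)
    adaptor.append(dest, withPresentationTime:
        CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
}
```

Drawing buffers from the adaptor's pool (rather than allocating fresh ones per frame) keeps the format compatible with the writer and avoids per-frame allocation cost.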

Set rate at which AVSampleBufferDisplayLayer renders sample buffers

Submitted by 做~自己de王妃 on 2019-12-19 04:23:43
Question: I am using an AVSampleBufferDisplayLayer to display CMSampleBuffers which are coming over a network connection in the h.264 format. Video playback is smooth and working correctly; however, I cannot seem to control the frame rate. Specifically, if I enqueue 60 frames per second in the AVSampleBufferDisplayLayer, it displays those 60 frames even though the video was recorded at 30 FPS. When creating sample buffers, it is possible to set the presentation time stamp by passing a timing info
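Without a control timebase, the layer displays buffers as fast as they are enqueued, ignoring presentation timestamps. Attaching a CMTimebase makes the layer pace display against the timestamps; a sketch:

```swift
import AVFoundation

// Sketch: drive an AVSampleBufferDisplayLayer from a control timebase so
// enqueued buffers are shown at their presentation timestamps.
let displayLayer = AVSampleBufferDisplayLayer()

var timebase: CMTimebase?
CMTimebaseCreateWithSourceClock(allocator: kCFAllocatorDefault,
                                sourceClock: CMClockGetHostTimeClock(),
                                timebaseOut: &timebase)
if let timebase = timebase {
    CMTimebaseSetTime(timebase, time: .zero)  // align with the stream's first PTS
    CMTimebaseSetRate(timebase, rate: 1.0)    // 1.0 = real time, 0.5 = half speed
    displayLayer.controlTimebase = timebase
}
// Buffers enqueued with valid presentation timestamps are now displayed
// when the timebase reaches each timestamp.
```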