avfoundation

No audio in video recording (using GPUImage) after initializing The Amazing Audio Engine

Submitted by 走远了吗 on 2019-12-06 14:38:33
I'm using two third-party tools in my project. One is The Amazing Audio Engine, which I use for audio filters. The other is GPUImage, or more specifically GPUImageMovieWriter. When I record videos, I merge an audio recording with the video, and this works fine. However, sometimes I do not use The Amazing Audio Engine and just record a normal video using GPUImageMovieWriter. The problem is that even after merely initializing The Amazing Audio Engine, the video has only a fraction of a second of audio at the beginning, and then the audio is gone. + (STAudioManager *)sharedManager { static …
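A common explanation for this symptom is that The Amazing Audio Engine reconfigures the shared AVAudioSession when it is initialized, which breaks the audio input that GPUImageMovieWriter's recording relies on. A minimal sketch of the usual workaround, assuming you can reset the session before a video-only recording (the helper name and category options here are assumptions, not the asker's code):

```swift
import AVFoundation

// Hypothetical helper: before starting a GPUImageMovieWriter-only recording,
// put the shared audio session back into a record-capable state, since an
// audio engine initialized earlier may have reconfigured it.
func prepareSessionForVideoRecording() throws {
    let session = AVAudioSession.sharedInstance()
    // Tear down whatever configuration the audio engine left behind…
    try session.setActive(false, options: .notifyOthersOnDeactivation)
    // …then claim the session for simultaneous playback and recording.
    try session.setCategory(.playAndRecord, options: [.defaultToSpeaker])
    try session.setActive(true)
}
```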

Speech Synthesis on iOS weird errors on loading, and no concurrency

Submitted by 為{幸葍}努か on 2019-12-06 14:05:01
I'm using the speech synthesizer in AVFoundation, creating an instance of a voice like this: import AVFoundation class CanSpeak { let voices = AVSpeechSynthesisVoice.speechVoices() let voiceSynth = AVSpeechSynthesizer() var voiceToUse: AVSpeechSynthesisVoice? init(){ for voice in voices { if voice.name == "Arthur" { voiceToUse = voice } } } func sayThis(_ phrase: String){ let utterance = AVSpeechUtterance(string: phrase) utterance.voice = voiceToUse utterance.rate = 0.5 voiceSynth.speak(utterance) } } I have two problems. There's no concurrency: calling this function multiple times results in a …
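Worth noting: AVSpeechSynthesizer already serializes utterances — repeated calls to speak(_:) are queued, not spoken concurrently — but the synthesizer instance must stay alive for the whole queue, or speech cuts out. A sketch of the class above restructured around that (the "Arthur" voice name is taken from the question; everything else is an assumption):

```swift
import AVFoundation

// Sketch: keep one long-lived synthesizer; if it is deallocated
// mid-utterance, speech stops and the queue is lost.
final class Speaker {
    private let synth = AVSpeechSynthesizer()
    private let voice = AVSpeechSynthesisVoice.speechVoices()
        .first { $0.name == "Arthur" }   // voice name assumed from the question

    func say(_ phrase: String) {
        let utterance = AVSpeechUtterance(string: phrase)
        utterance.voice = voice
        utterance.rate = 0.5
        synth.speak(utterance)           // queued behind any current utterance
    }
}
```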

Preloading video to play without delay

Submitted by 左心房为你撑大大i on 2019-12-06 13:55:03
Question: There are tons of topics on SO about video preloading, but it still isn't crystal clear to me. Objectives: load a video from the network, given its URL; wait for the video to load completely; play the video without delay (as I said, it's already 100% buffered). Ideally, calculate the download speed and predict, for example, that once 60% of the video is buffered we can start playing, and the remaining 40% will buffer during playback without delay. What I tried: NSURL *url = [NSURL URLWithString:@"video url address here"]; AVURLAsset *avasset = [ …
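The "start at 60% buffered" idea above can be approximated by observing the player item's loaded time ranges and starting playback once the buffered fraction crosses a threshold. A minimal sketch, assuming a placeholder URL and a 60% threshold (both assumptions):

```swift
import AVFoundation

// Sketch: estimate the buffered fraction of an item and start playback
// once a chosen threshold (here 60%) is reached.
let item = AVPlayerItem(url: URL(string: "https://example.com/video.mp4")!)
let player = AVPlayer(playerItem: item)
player.automaticallyWaitsToMinimizeStalling = true   // iOS 10+

let observation = item.observe(\.loadedTimeRanges) { item, _ in
    guard let range = item.loadedTimeRanges.first?.timeRangeValue,
          item.duration.isNumeric else { return }
    let buffered = CMTimeGetSeconds(range.start) + CMTimeGetSeconds(range.duration)
    let fraction = buffered / CMTimeGetSeconds(item.duration)
    if fraction >= 0.6 {                 // threshold is an assumption
        player.play()
    }
}
```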

How to extract motion vectors from H.264 AVC CMBlockBufferRef after VTCompressionSessionEncodeFrame

Submitted by 一世执手 on 2019-12-06 13:51:53
Question: I'm trying to read and understand the CMBlockBufferRef representation of an H.264 AVC 1/30-second frame. The buffer and the encapsulating CMSampleBufferRef are created using a VTCompressionSessionRef. https://gist.github.com/petershine/de5e3d8487f4cfca0a1d The H.264 data is represented as an AVC memory buffer, a CMBlockBufferRef, from the compressed sample. Without fully decompressing again, I'm trying to extract motion vectors or predictions from this CMBlockBufferRef. I believe that for the fastest performance, …
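VideoToolbox does not expose motion vectors, so recovering them means parsing the H.264 bitstream yourself (down to the slice/macroblock layer, which requires entropy decoding). As a first step, the AVCC payload in the CMBlockBuffer can at least be walked NAL unit by NAL unit, since each unit is prefixed with a 4-byte big-endian length. A sketch under that assumption (4-byte length prefixes are the common case but depend on the format description):

```swift
import CoreMedia

// Sketch: list the NAL unit types in an AVCC-formatted block buffer.
func nalUnitTypes(in blockBuffer: CMBlockBuffer) -> [UInt8] {
    var totalLength = 0
    var pointer: UnsafeMutablePointer<CChar>?
    guard CMBlockBufferGetDataPointer(blockBuffer, atOffset: 0,
                                      lengthAtOffsetOut: nil,
                                      totalLengthOut: &totalLength,
                                      dataPointerOut: &pointer) == noErr,
          let base = pointer else { return [] }

    let bytes = UnsafeRawBufferPointer(start: base, count: totalLength)
    var types: [UInt8] = []
    var offset = 0
    while offset + 4 <= totalLength {
        // 4-byte big-endian NAL unit length (AVCC framing, assumed).
        let nalLength = (Int(bytes[offset]) << 24) | (Int(bytes[offset + 1]) << 16)
                      | (Int(bytes[offset + 2]) << 8) | Int(bytes[offset + 3])
        guard nalLength > 0, offset + 4 + nalLength <= totalLength else { break }
        types.append(bytes[offset + 4] & 0x1F)   // low 5 bits = NAL unit type
        offset += 4 + nalLength
    }
    return types
}
```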

Method captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection only called a few times

Submitted by 为君一笑 on 2019-12-06 13:37:49
I'm capturing audio from an external Bluetooth microphone, but I can't record anything. This method is only called once, at the beginning of the current AVCaptureSession: - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection After that, the method is never called again to process the audio. To set up the capture session I do this: self.captureSession.usesApplicationAudioSession = true; self.captureSession.automaticallyConfiguresApplicationAudioSession = true; [[AVAudioSession …
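For comparison, a minimal working shape for continuous audio capture looks like the sketch below. Two common causes of the delegate firing only once are a blocked or shared delegate queue, and sample buffers being retained in the callback, which starves the capture pipeline (the class and queue names here are assumptions):

```swift
import AVFoundation

// Sketch: deliver audio sample buffers on a dedicated serial queue
// and return from the callback quickly.
final class AudioCapturer: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "audio.capture.queue") // dedicated serial queue

    func start() throws {
        guard let mic = AVCaptureDevice.default(for: .audio) else { return }
        session.addInput(try AVCaptureDeviceInput(device: mic))
        let output = AVCaptureAudioDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Process quickly; holding on to sampleBuffer can stall delivery.
    }
}
```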

IOS adding UIProgressView to AVFoundation AVCaptureMovieFileOutput

Submitted by a 夏天 on 2019-12-06 13:35:55
Question: I am using AVCaptureMovieFileOutput to record videos, and I want to add a UIProgressView to represent how much time is left before the video stops recording. I set a maximum duration of 15 seconds: CMTime maxDuration = CMTimeMakeWithSeconds(15, 50); [[self movieFileOutput] setMaxRecordedDuration:maxDuration]; I can't seem to find whether AVCaptureMovieFileOutput has a callback for when the video is recording or for when recording begins. My question is: how can I get updates on the progress of …
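AVCaptureMovieFileOutput has no per-frame progress callback, but its recordedDuration property can be polled from a timer while recording is in flight. A sketch of that approach, matching the 15-second cap above (the class name and 0.05 s interval are assumptions):

```swift
import AVFoundation
import UIKit

// Sketch: drive a UIProgressView by polling recordedDuration on a timer.
final class RecordingProgress {
    let progressView = UIProgressView(progressViewStyle: .default)
    private var timer: Timer?

    func begin(with output: AVCaptureMovieFileOutput, maxSeconds: Double = 15) {
        timer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { [weak self] _ in
            let elapsed = CMTimeGetSeconds(output.recordedDuration)
            self?.progressView.progress = Float(min(elapsed / maxSeconds, 1))
        }
    }

    func end() {
        timer?.invalidate()
        timer = nil
    }
}
```

Starting the timer from the fileOutput:didStartRecordingToOutputFileAtURL: delegate callback and stopping it in the did-finish callback keeps the bar in sync with the actual recording.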

IOS Toggle AVFoundation Camera

Submitted by ▼魔方 西西 on 2019-12-06 13:24:52
In my app I'm capturing images using AVFoundation. I made a button to switch between the front and back cameras, but it won't work. Here's the code I used: if (captureDevice.position == AVCaptureDevicePositionFront) { for ( AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] ) { if ( device.position == AVCaptureDevicePositionBack) { NSError * error; AVCaptureDeviceInput * newDeviceInput = [[AVCaptureDeviceInput alloc]initWithDevice:device error:&error]; [captureSesion beginConfiguration]; for (AVCaptureDeviceInput *oldInput in [captureSesion inputs]) { [captureSesion …
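The usual shape of a camera toggle is: begin a configuration block on the running session, remove the old video input, then add the new one (addInput fails while the old input still occupies the slot). A sketch of that pattern (function name and the wide-angle device choice are assumptions):

```swift
import AVFoundation

// Sketch: swap the front/back camera input inside one configuration block.
func toggleCamera(on session: AVCaptureSession) {
    guard let current = session.inputs.compactMap({ $0 as? AVCaptureDeviceInput })
        .first(where: { $0.device.hasMediaType(.video) }) else { return }
    let newPosition: AVCaptureDevice.Position =
        current.device.position == .back ? .front : .back
    guard let newDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                  for: .video,
                                                  position: newPosition),
          let newInput = try? AVCaptureDeviceInput(device: newDevice) else { return }

    session.beginConfiguration()
    session.removeInput(current)      // remove first, or addInput will fail
    if session.canAddInput(newInput) {
        session.addInput(newInput)
    } else {
        session.addInput(current)     // roll back if the new input is rejected
    }
    session.commitConfiguration()
}
```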

AVCaptureDevice videoZoomFactor always Out of Range

Submitted by 守給你的承諾、 on 2019-12-06 12:25:41
I'm trying to set the zoom level of the camera with this code: AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; if ([videoDevice lockForConfiguration:nil]) { float newzoom=1.3; videoDevice.videoZoomFactor = newzoom; [videoDevice unlockForConfiguration]; } This code does not work on iOS 7 (it works on iOS 9); it always causes an exception: Terminating app due to uncaught exception 'NSRangeException', reason: 'videoZoomFactor out of range'. I can't find documentation, but the zoom range on iOS 7 seems to be from 1 to 2, and every value I have tried to set …
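The valid zoom range is a property of the device's active format, not a fixed interval, so the robust approach is to clamp the requested factor against activeFormat.videoMaxZoomFactor before assigning. A sketch of that (on hardware/format combinations where the max is 1.0, any factor above 1 raises NSRangeException, which may be what's happening here):

```swift
import AVFoundation

// Sketch: clamp the requested zoom to the range the active format supports.
func setZoom(_ factor: CGFloat, on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        let maxZoom = device.activeFormat.videoMaxZoomFactor
        device.videoZoomFactor = max(1.0, min(factor, maxZoom))
        device.unlockForConfiguration()
    } catch {
        print("lockForConfiguration failed: \(error)")
    }
}
```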

iOS: Is there a performance difference between using playInputClick vs the (1104) sound file with audio toolbox?

Submitted by 半腔热情 on 2019-12-06 10:25:54
Apple recommends using playInputClick in custom keyboards to simulate a click sound. It's easier to implement AudioServicesPlaySystemSound(1104); so my question becomes: does playInputClick provide better performance, or is it the same thing? The reason Apple recommends this is probably not performance. AudioServicesPlaySystemSound(1104) will probably always play the same sound, but playInputClick may play a different sound in the future if Apple decides to change the input-click sound. So they are the same right now, but that might change, and if it does, your app will be the only one playing the old …
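There is also a behavioral difference: playInputClick() only produces sound when called from a view that adopts UIInputViewAudioFeedback and is installed as an input view, and it respects the user's keyboard-click setting. A minimal sketch of the supported pattern (class and method names are assumptions):

```swift
import UIKit

// Sketch: the indirection through UIInputViewAudioFeedback is what lets
// the system honor user settings and change the click sound centrally.
final class ClickableKeyboardView: UIView, UIInputViewAudioFeedback {
    var enableInputClicksWhenVisible: Bool { true }

    func keyPressed() {
        UIDevice.current.playInputClick()   // honors the keyboard-click setting
    }
}
```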

Getting AVAudioPCMBuffer working (AVAudioFile.mm error code -50)

Submitted by 只愿长相守 on 2019-12-06 10:11:45
Question: I'm trying to set up a basic example in a Swift playground (code below), but I have also tried it in Objective-C with the same result. import AVFoundation let fileURL = ... // have tried a wav file and an aiff file let myAudioFile = AVAudioFile(forReading: fileURL, error: nil) let myAudioFormat = myAudioFile.fileFormat let myAudioFrameCount = UInt32(myAudioFile.length) var myAudioBuffer = AVAudioPCMBuffer(PCMFormat: myAudioFormat, frameCapacity: myAudioFrameCount) // have also tried a smaller …
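A likely culprit in the snippet above is creating the buffer with fileFormat (the on-disk format) rather than processingFormat (the uncompressed PCM format AVAudioFile actually reads into); the mismatch is a common source of error -50 (paramErr). A sketch of the usual fix in current Swift (function name is an assumption):

```swift
import AVFoundation

// Sketch: read a whole file into a PCM buffer using processingFormat.
func loadPCM(from url: URL) throws -> AVAudioPCMBuffer? {
    let file = try AVAudioFile(forReading: url)
    let frameCount = AVAudioFrameCount(file.length)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: frameCount) else { return nil }
    try file.read(into: buffer)
    return buffer
}
```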