AVFoundation

I want to change the sample rate from my input node from 44100 to 8000

点点圈 · Submitted on 2020-01-12 09:52:52

Question: I want to read the buffer from my microphone into an array. With 44.1 kHz it works fine, but with a sample rate of 8 kHz I get an error:

ERROR: >avae> AVAudioIONodeImpl.mm:884: SetOutputFormat: required condition is false: format.sampleRate == hwFormat.sampleRate
2016-11-26 19:32:40.674 Atem[5800:1168274] *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: format.sampleRate == hwFormat.sampleRate'

with my following code: var
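The crash happens because the tap format passed to `installTap(onBus:)` must match the hardware input format's sample rate. A common workaround (a sketch, not from the original post) is to tap at the hardware rate and downsample each buffer to 8 kHz with an `AVAudioConverter`:

```swift
import AVFoundation

let engine = AVAudioEngine()
let input = engine.inputNode

// The tap must use the hardware format (format.sampleRate == hwFormat.sampleRate).
let hwFormat = input.outputFormat(forBus: 0)

// Desired 8 kHz mono float format (the target of this sketch).
guard let targetFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                       sampleRate: 8000,
                                       channels: 1,
                                       interleaved: false),
      let converter = AVAudioConverter(from: hwFormat, to: targetFormat) else {
    fatalError("could not create converter")
}

input.installTap(onBus: 0, bufferSize: 4096, format: hwFormat) { buffer, _ in
    let ratio = targetFormat.sampleRate / hwFormat.sampleRate
    let capacity = AVAudioFrameCount(Double(buffer.frameLength) * ratio)
    guard let out = AVAudioPCMBuffer(pcmFormat: targetFormat,
                                     frameCapacity: capacity) else { return }
    var error: NSError?
    var consumed = false
    converter.convert(to: out, error: &error) { _, outStatus in
        // Hand the tapped buffer to the converter exactly once.
        if consumed {
            outStatus.pointee = .noDataNow
            return nil
        }
        consumed = true
        outStatus.pointee = .haveData
        return buffer
    }
    // `out` now holds 8 kHz samples, ready to append to an array.
}

try? engine.start()
```

The conversion quality and latency depend on the converter's defaults; for telephony-style 8 kHz capture that is usually sufficient.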


AVPlayer currentTime update for a UISlider when ViewController load

此生再无相见时 · Submitted on 2020-01-12 07:24:29

Question: I'm playing songs in AVPlayer. I have created a separate view controller for my media player, and all the methods I use for the player (play, pause, repeat, shuffle) are in that same view controller. I update a slider like this:

[NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(sliderUpdate:) userInfo:nil repeats:YES];

- (void)sliderUpdate:(id)sender {
    int currentTime = (int)((song.player.currentTime.value)/song.player.currentTime
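Rather than driving the slider from an `NSTimer`, AVPlayer provides `addPeriodicTimeObserver(forInterval:queue:using:)`, which fires in sync with playback. A minimal Swift sketch (the URL and property names are hypothetical):

```swift
import AVFoundation
import UIKit

final class PlayerViewController: UIViewController {
    // Hypothetical media URL for this sketch.
    let player = AVPlayer(url: URL(string: "https://example.com/song.mp3")!)
    let slider = UISlider()
    private var timeObserver: Any?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Fires every half second on the main queue while the player is playing.
        let interval = CMTime(seconds: 0.5, preferredTimescale: 600)
        timeObserver = player.addPeriodicTimeObserver(forInterval: interval,
                                                      queue: .main) { [weak self] time in
            guard let self, let item = self.player.currentItem else { return }
            let duration = item.duration.seconds
            guard duration.isFinite, duration > 0 else { return }
            // Normalize current time into the slider's 0...1 range.
            self.slider.value = Float(time.seconds / duration)
        }
    }

    deinit {
        // The observer must be removed, or AVPlayer keeps it alive.
        if let timeObserver { player.removeTimeObserver(timeObserver) }
    }
}
```

Because the observer is driven by the player itself, the slider stops advancing automatically when playback pauses, which a free-running timer does not do.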


AVCaptureMovieFileOutput - no active/enabled connections

*爱你&永不变心* · Submitted on 2020-01-12 06:54:29

Question: I am trying to record video in my iPhone app using AVFoundation, but whenever I tap the Record button the app crashes with this message:

[AVCaptureMovieFileOutput startRecordingToOutputFileURL:recordingDelegate:] - no active/enabled connections

I know the same question has been asked on SO, but none of its answers helped me. My problem is that the same code works perfectly in another application, yet when I use exactly the same code in this app it crashes, although still-photo capture still works fine. Adding my
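This crash typically means the movie output has no live video connection when recording starts, usually because the output was added before a matching input existed, or was never added to the session. A hedged configuration sketch (not the asker's code):

```swift
import AVFoundation

let session = AVCaptureSession()
let movieOutput = AVCaptureMovieFileOutput()

session.beginConfiguration()
// Inputs must be in place so that adding the output can create its connections.
if let camera = AVCaptureDevice.default(for: .video),
   let videoInput = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(videoInput) {
    session.addInput(videoInput)
}
if let mic = AVCaptureDevice.default(for: .audio),
   let audioInput = try? AVCaptureDeviceInput(device: mic),
   session.canAddInput(audioInput) {
    session.addInput(audioInput)
}
if session.canAddOutput(movieOutput) {
    session.addOutput(movieOutput)   // connections are created here
}
session.commitConfiguration()
session.startRunning()

// Only start recording once the video connection is actually active.
if movieOutput.connection(with: .video)?.isActive == true {
    let url = FileManager.default.temporaryDirectory.appendingPathComponent("out.mov")
    // movieOutput.startRecording(to: url, recordingDelegate: delegate)
    _ = url
}
```

Checking `connection(with: .video)?.isActive` before `startRecording` turns the hard crash into a recoverable state you can inspect.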

How to composite videos using multiple AVVideoCompositions

别说谁变了你拦得住时间么 · Submitted on 2020-01-12 06:19:26

Question: I'm trying to figure out how to composite multiple videos (AVAssets) into a single video such that each video goes through its own video composition. However, I can't see a way to accomplish this and was wondering if anyone has any ideas. Consider the following: the picture above illustrates what I'm trying to do. I want to take the video track from four different videos and merge them into a single video so that they play in a grid-like layout. Right now, I'm able to achieve this
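The grid part of this can be done with a single `AVMutableVideoComposition` whose layer instructions scale and translate each track into a quadrant. A sketch under the assumption that `assets` holds four AVAssets (the function name is hypothetical):

```swift
import AVFoundation

// Place up to four video tracks in a 2x2 grid with one video composition.
func makeGridComposition(assets: [AVAsset],
                         renderSize: CGSize) throws -> (AVMutableComposition, AVMutableVideoComposition) {
    let composition = AVMutableComposition()
    let instruction = AVMutableVideoCompositionInstruction()
    var layerInstructions: [AVMutableVideoCompositionLayerInstruction] = []
    var maxDuration = CMTime.zero

    for (index, asset) in assets.prefix(4).enumerated() {
        guard let sourceTrack = asset.tracks(withMediaType: .video).first,
              let track = composition.addMutableTrack(
                  withMediaType: .video,
                  preferredTrackID: kCMPersistentTrackID_Invalid) else { continue }
        try track.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                  of: sourceTrack, at: .zero)
        maxDuration = CMTimeMaximum(maxDuration, asset.duration)

        // Scale each video to quarter size, then translate it into its quadrant.
        let layer = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
        let size = sourceTrack.naturalSize
        let scale = CGAffineTransform(scaleX: renderSize.width / 2 / size.width,
                                      y: renderSize.height / 2 / size.height)
        let tx = CGFloat(index % 2) * renderSize.width / 2
        let ty = CGFloat(index / 2) * renderSize.height / 2
        layer.setTransform(scale.concatenating(
            CGAffineTransform(translationX: tx, y: ty)), at: .zero)
        layerInstructions.append(layer)
    }

    instruction.timeRange = CMTimeRange(start: .zero, duration: maxDuration)
    instruction.layerInstructions = layerInstructions

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = renderSize
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
    videoComposition.instructions = [instruction]
    return (composition, videoComposition)
}
```

This gives each track its own layer instruction, though a per-source `AVVideoComposition` (with filters, say) would require pre-rendering each asset first, since an export session accepts only one video composition.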

How to use AVAssetReader and AVAssetWriter for multiple tracks (audio and video) simultaneously?

二次信任 · Submitted on 2020-01-11 17:22:25

Question: I know how to use AVAssetReader and AVAssetWriter, and have successfully used them to grab a video track from one movie and transcode it into another. However, I'd like to do this with audio as well. Do I have to create an AVAssetExportSession after I'm done with the initial transcode, or is there some way to switch between tracks in the midst of a writing session? I'd hate to deal with the overhead of an AVAssetExportSession. I ask because, using the pull-style method -
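One reader and one writer can in fact carry multiple tracks concurrently: give the reader one `AVAssetReaderTrackOutput` per track and the writer one `AVAssetWriterInput` per track, each input pulling on its own queue. A passthrough sketch (the function name and queue labels are assumptions):

```swift
import AVFoundation

// Copy video and audio tracks from one asset in a single writer session,
// with no AVAssetExportSession pass afterwards.
func transcode(asset: AVAsset, to url: URL) throws {
    let reader = try AVAssetReader(asset: asset)
    let writer = try AVAssetWriter(outputURL: url, fileType: .mov)

    var pairs: [(AVAssetReaderTrackOutput, AVAssetWriterInput)] = []
    for track in asset.tracks where track.mediaType == .video || track.mediaType == .audio {
        // nil outputSettings means passthrough; real settings would transcode.
        let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
        let input = AVAssetWriterInput(mediaType: track.mediaType, outputSettings: nil)
        reader.add(output)
        writer.add(input)
        pairs.append((output, input))
    }

    reader.startReading()
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    let group = DispatchGroup()
    for (output, input) in pairs {
        group.enter()
        // Each writer input pulls samples from its own reader output.
        input.requestMediaDataWhenReady(on: DispatchQueue(label: "track.queue")) {
            while input.isReadyForMoreMediaData {
                if let buffer = output.copyNextSampleBuffer() {
                    input.append(buffer)
                } else {
                    input.markAsFinished()
                    group.leave()
                    return
                }
            }
        }
    }
    group.wait()          // fine for a command-line tool; use notify in an app
    writer.finishWriting {}
}
```

The writer interleaves the tracks itself; there is no need to "switch" tracks, only to keep every input fed until its output runs dry.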

How many AVPlayers are allowed to be created at the same time?

走远了吗. · Submitted on 2020-01-11 02:26:07

Question: I have a collectionView, and each cell has an AVPlayer which is set to play, so every cell plays a video at the same time. It seems that iOS only allows you to play 16 videos at the same time. For example, in my sample app below, out of 50 cells only 16 start playing a video. This number always stays the same. This happens on an iPhone 6s running iOS 10; in the Xcode simulator, however, all 50 videos start playing. The issue only happens on an actual device. Also, I get these two
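The limit comes from the device's hardware decoder, which the simulator does not emulate. A common mitigation (a sketch, not the asker's code; names are hypothetical) is to play only the players whose cells are visible and pause the rest:

```swift
import AVFoundation
import UIKit

// Keep playback within the hardware decoder budget by pausing off-screen cells.
final class VideoGridController: UICollectionViewController {
    var players: [IndexPath: AVPlayer] = [:]   // hypothetical per-cell players

    func updatePlayback() {
        let visible = Set(collectionView.indexPathsForVisibleItems)
        for (indexPath, player) in players {
            if visible.contains(indexPath) {
                player.play()
            } else {
                player.pause()   // frees a decode pipeline for visible cells
            }
        }
    }

    override func scrollViewDidScroll(_ scrollView: UIScrollView) {
        updatePlayback()
    }
}
```

Since far fewer than 16 cells are usually on screen at once, this keeps every visible video playing regardless of the total cell count.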

Adding filters to video with AVFoundation (OSX) - how do I write the resulting image back to AVWriter?

假装没事ソ · Submitted on 2020-01-10 02:07:16

Question: Setting the scene: I am working on a video processing app that runs from the command line to read in, process, and then export video. I'm working with four tracks: lots of clips that I append into a single track to make one video (let's call this the ugcVideoComposition); clips with alpha, which are positioned on a second track and, using layer instructions, composited on export to play back over the top of the ugcVideoComposition; a music audio track; and an audio track for the
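For the write-back half of the question, the usual pattern is to render the filtered `CIImage` into a pixel buffer drawn from the writer adaptor's pool. A hedged sketch (the function and parameter names are assumptions; `adaptor` is an `AVAssetWriterInputPixelBufferAdaptor` already attached to a writer input):

```swift
import AVFoundation
import CoreImage

// Render a filtered CIImage into the writer's pixel buffer pool and append it.
func append(filtered image: CIImage, at time: CMTime,
            adaptor: AVAssetWriterInputPixelBufferAdaptor, context: CIContext) {
    // The pool exists only after the writer session has started.
    guard let pool = adaptor.pixelBufferPool else { return }
    var pixelBuffer: CVPixelBuffer?
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pixelBuffer)
    guard let buffer = pixelBuffer else { return }
    // Rasterize the filtered image into the fresh buffer, then hand it over.
    context.render(image, to: buffer)
    adaptor.append(buffer, withPresentationTime: time)
}
```

Reusing one long-lived `CIContext` for every frame matters here; creating a context per frame dominates the processing time.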

Having trouble creating UIImage from CIImage in iOS5

烈酒焚心 · Submitted on 2020-01-09 06:50:41

Question: I'm using the AVFoundation framework. In my sample buffer delegate I have the following code:

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pb = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pb];
    self.imageView.image = [UIImage imageWithCIImage:ciImage];
}

I am able to use the CIImage to run the face detector etc. but
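A likely cause is that `[UIImage imageWithCIImage:]` wraps the recipe without rasterizing it, so a plain `UIImageView` shows nothing. Rendering through a `CIContext` produces a real bitmap. A Swift sketch of that fix (the helper name is hypothetical):

```swift
import CoreImage
import UIKit

// Create the context once; it is expensive to build per frame.
let ciContext = CIContext()

// Rasterize a CIImage into a CGImage-backed UIImage that UIImageView can display.
func uiImage(from ciImage: CIImage) -> UIImage? {
    guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

Also note the delegate callback arrives on the capture queue, so the `imageView.image` assignment should be dispatched to the main queue.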