AVFoundation

Why does averagePowerForChannel always return -160?

不想你离开。 Submitted on 2019-12-20 03:08:15
Question: I have this code, and both method calls succeed:

```objc
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:NULL];
[audioSession setActive:YES error:NULL];
```

And this code to start recording:

```objc
[self.recorder prepareToRecord];
[self.recorder recordForDuration:60];
```

I have a timer function to update the meters:

```objc
- (void)updateMeters {
    [self.recorder updateMeters];
    float peakPower = [self.recorder averagePowerForChannel:0];
    // …
```
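A common cause of a constant -160 reading (AVAudioRecorder's silence floor) is that metering was never switched on. Below is a minimal Swift sketch, assuming a recorder configured elsewhere; only the metering-related calls are shown.

```swift
import AVFoundation

// Sketch: averagePower(forChannel:) stays pinned at -160 dB (silence)
// unless metering is enabled before the meters are read.
func startMeteredRecording(recorder: AVAudioRecorder) {
    recorder.isMeteringEnabled = true   // without this, meters never update
    recorder.prepareToRecord()
    recorder.record(forDuration: 60)
}

func currentLevel(recorder: AVAudioRecorder) -> Float {
    recorder.updateMeters()                       // refresh cached meter values
    return recorder.averagePower(forChannel: 0)   // dBFS; 0 is full scale, -160 is silence
}
```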

Create a silent audio CMSampleBufferRef

百般思念 Submitted on 2019-12-20 02:13:40
Question: How do you create a silent audio CMSampleBufferRef in Swift? I am looking to append silent CMSampleBufferRefs to an instance of AVAssetWriterInput.

Answer 1: You don't say what format you want your zeros in (integer/floating point, mono/stereo, sample rate), but maybe it doesn't matter. Anyway, here's one way to create a silent, CD-audio-style CMSampleBuffer in Swift:

```swift
func createSilentAudio(startFrm: Int64, nFrames: Int, sampleRate: Float64, numChannels: UInt32) -> CMSampleBuffer? {
    let // …
```
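The excerpt cuts off at the start of the function body. A complete sketch of the same idea, written against the modern Core Media Swift signatures and assuming 16-bit interleaved LPCM (the "CD style" the answer mentions), might look like this:

```swift
import CoreMedia
import AudioToolbox

// Sketch: allocate a zero-filled CMBlockBuffer (zeros are silence in LPCM),
// describe it as 16-bit signed interleaved PCM, and wrap it in a ready
// CMSampleBuffer. Error handling is reduced to guard/return-nil.
func createSilentAudio(startFrm: Int64, nFrames: Int, sampleRate: Float64, numChannels: UInt32) -> CMSampleBuffer? {
    let bytesPerFrame = UInt32(2 * numChannels)     // 16-bit samples
    let blockSize = nFrames * Int(bytesPerFrame)

    var block: CMBlockBuffer?
    var status = CMBlockBufferCreateWithMemoryBlock(
        allocator: kCFAllocatorDefault,
        memoryBlock: nil,                            // let Core Media allocate
        blockLength: blockSize,
        blockAllocator: nil,
        customBlockSource: nil,
        offsetToData: 0,
        dataLength: blockSize,
        flags: 0,
        blockBufferOut: &block)
    guard status == kCMBlockBufferNoErr, let blockBuffer = block else { return nil }

    // Zero fill: silence for linear PCM.
    status = CMBlockBufferFillDataBytes(with: 0, blockBuffer: blockBuffer,
                                        offsetIntoDestination: 0, dataLength: blockSize)
    guard status == kCMBlockBufferNoErr else { return nil }

    var asbd = AudioStreamBasicDescription(
        mSampleRate: sampleRate,
        mFormatID: kAudioFormatLinearPCM,
        mFormatFlags: kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked,
        mBytesPerPacket: bytesPerFrame,
        mFramesPerPacket: 1,
        mBytesPerFrame: bytesPerFrame,
        mChannelsPerFrame: numChannels,
        mBitsPerChannel: 16,
        mReserved: 0)

    var format: CMFormatDescription?
    status = CMAudioFormatDescriptionCreate(
        allocator: kCFAllocatorDefault, asbd: &asbd,
        layoutSize: 0, layout: nil,
        magicCookieSize: 0, magicCookie: nil,
        extensions: nil, formatDescriptionOut: &format)
    guard status == noErr, let formatDesc = format else { return nil }

    var sampleBuffer: CMSampleBuffer?
    status = CMAudioSampleBufferCreateReadyWithPacketDescriptions(
        allocator: kCFAllocatorDefault,
        dataBuffer: blockBuffer,
        formatDescription: formatDesc,
        sampleCount: nFrames,
        presentationTimeStamp: CMTimeMake(value: startFrm, timescale: Int32(sampleRate)),
        packetDescriptions: nil,
        sampleBufferOut: &sampleBuffer)
    guard status == noErr else { return nil }
    return sampleBuffer
}
```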

Can't get multiple images from video using MPMoviePlayerController. OSStatus -12433

早过忘川 Submitted on 2019-12-19 17:03:34
Question: I'm trying to extract multiple images from a selected video file using MPMoviePlayerController. Below is the code I have written:

```objc
movie = [[MPMoviePlayerController alloc] initWithContentURL:[info objectForKey:UIImagePickerControllerMediaURL]];
NSNumber *time1 = [NSNumber numberWithInt:1];
NSNumber *time2 = [NSNumber numberWithInt:3];
NSNumber *time3 = [NSNumber numberWithInt:5];
NSArray *times = [NSArray arrayWithObjects:time1, time2, time3, nil];
[[NSNotificationCenter defaultCenter] // …
```
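MPMoviePlayerController's thumbnail API has long been deprecated; a sketch of the same task with AVAssetImageGenerator, which extracts several frames in one asynchronous call, is below. The URL and time values are stand-ins for the ones in the question.

```swift
import AVFoundation
import UIKit

// Sketch: generate frames at the requested times; the completion closure is
// invoked once per requested time, possibly on a background queue.
func extractImages(from url: URL, seconds: [Double],
                   completion: @escaping (Double, UIImage?) -> Void) {
    let asset = AVAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true   // respect rotation metadata

    let times = seconds.map { NSValue(time: CMTime(seconds: $0, preferredTimescale: 600)) }
    generator.generateCGImagesAsynchronously(forTimes: times) { requested, cgImage, _, _, _ in
        completion(requested.seconds, cgImage.map { UIImage(cgImage: $0) })
    }
}
```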

Simulate AVLayerVideoGravityResizeAspectFill: crop and center video to mimic preview without losing sharpness

岁酱吖の Submitted on 2019-12-19 10:47:38
Question: Based on this SO post, the code below rotates, centers, and crops a video captured live by the user. The capture session uses AVCaptureSessionPresetHigh for the preset value, and the preview layer uses AVLayerVideoGravityResizeAspectFill for video gravity. The preview is extremely sharp. The exported video, however, is not as sharp, ostensibly because scaling from the 1920x1080 resolution of the 5S back camera down to 320x568 (the target size for the exported video) introduces fuzziness from …
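One way to mimic aspect-fill cropping at export time is a video composition whose render size is the target and whose layer transform scales to fill and then centers the overflow. A minimal sketch, assuming a single video track and the sizes from the question:

```swift
import AVFoundation

// Sketch: scale so the video covers the target (max, not min, of the two
// ratios), then translate so the overflow is cropped equally on both sides.
func aspectFillComposition(for track: AVAssetTrack, target: CGSize) -> AVMutableVideoComposition {
    let natural = track.naturalSize
    let scale = max(target.width / natural.width, target.height / natural.height)
    let scaled = CGSize(width: natural.width * scale, height: natural.height * scale)
    let offset = CGPoint(x: (target.width - scaled.width) / 2,
                         y: (target.height - scaled.height) / 2)

    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    var transform = CGAffineTransform(translationX: offset.x, y: offset.y)
    transform = transform.scaledBy(x: scale, y: scale)   // scale first, then center
    layerInstruction.setTransform(transform, at: .zero)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: track.timeRange.duration)
    instruction.layerInstructions = [layerInstruction]

    let composition = AVMutableVideoComposition()
    composition.renderSize = target                      // e.g. 320x568 per the question
    composition.frameDuration = CMTime(value: 1, timescale: 30)
    composition.instructions = [instruction]
    return composition
}
```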

AVPlayer not playing m3u8 from local file

筅森魡賤 Submitted on 2019-12-19 10:34:23
Question: I am trying to get AVPlayer to play an m3u8 playlist stored as a local file. I have narrowed this down to a simple test case using one of Apple's sample playlists: https://tungsten.aaplimg.com/VOD/bipbop_adv_fmp4_example/master.m3u8 If I play this playlist from the remote URL, AVPlayer plays it fine. However, if I download the playlist to a local file and hand AVPlayer the local file URL, AVPlayer will not play it; it just shows the crossed-out play symbol. Interestingly enough, this …
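One likely cause worth checking: a downloaded master playlist usually references its variant playlists and segments with *relative* URIs, which cannot be resolved against a `file://` copy. Rewriting them to absolute URLs against the original server before saving is one workaround (serving the file from a local HTTP server is another). A sketch of the rewrite, with purely illustrative names:

```swift
import Foundation

// Sketch: pass tag lines (#EXT...) and already-absolute URIs through;
// resolve everything else against the playlist's original base URL.
func absolutize(playlist: String, baseURL: URL) -> String {
    playlist
        .split(separator: "\n", omittingEmptySubsequences: false)
        .map { line -> String in
            let trimmed = line.trimmingCharacters(in: .whitespaces)
            guard !trimmed.isEmpty, !trimmed.hasPrefix("#"), !trimmed.contains("://") else {
                return String(line)
            }
            return URL(string: trimmed, relativeTo: baseURL)?.absoluteString ?? String(line)
        }
        .joined(separator: "\n")
}
```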

How do I call CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer?

只愿长相守 Submitted on 2019-12-19 07:49:14
Question: I'm trying to figure out how to call this AVFoundation function in Swift. I've spent a ton of time fiddling with declarations and syntax, and got this far. The compiler is mostly happy, but I'm left with one last quandary:

```swift
public func captureOutput(captureOutput: AVCaptureOutput!,
                          didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                          fromConnection connection: AVCaptureConnection!) {
    let samplesInBuffer = CMSampleBufferGetNumSamples(sampleBuffer)
    var audioBufferList: AudioBufferList
    var // …
```
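For reference, a sketch of the call in modern Swift, assuming the common case of a single interleaved AudioBuffer so a one-element AudioBufferList is large enough:

```swift
import AVFoundation

// Sketch: the function fills an AudioBufferList whose mData pointers alias
// memory owned by the returned CMBlockBuffer, so the block buffer must be
// kept alive while the samples are read.
func withAudioBuffer(of sampleBuffer: CMSampleBuffer) {
    var audioBufferList = AudioBufferList()
    var blockBuffer: CMBlockBuffer?

    let status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        bufferListSizeNeededOut: nil,
        bufferListOut: &audioBufferList,
        bufferListSize: MemoryLayout<AudioBufferList>.size,
        blockBufferAllocator: nil,
        blockBufferMemoryAllocator: nil,
        flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
        blockBufferOut: &blockBuffer)
    guard status == noErr else { return }

    // Iterate the buffers with the stdlib's list-pointer wrapper.
    let buffers = UnsafeMutableAudioBufferListPointer(&audioBufferList)
    for buffer in buffers {
        print("channels: \(buffer.mNumberChannels), bytes: \(buffer.mDataByteSize)")
    }
}
```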

How to convert code objective c to Swift to save image?

丶灬走出姿态 Submitted on 2019-12-19 05:51:34
Question: I have seen this code in another post, for saving pictures:

```objc
// Create path.
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *filePath = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"Image.png"];

// Save image.
[UIImagePNGRepresentation(image) writeToFile:filePath atomically:YES];
```

And I'm trying to convert it to Swift to save a picture taken with AVFoundation, but I don't know the equivalents of NSDocumentDirectory and NSUserDomainMask. How …
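A direct Swift translation: `FileManager.urls(for:in:)` replaces `NSSearchPathForDirectoriesInDomains` (with `.documentDirectory` and `.userDomainMask` as the enum equivalents), and `UIImage.pngData()` replaces `UIImagePNGRepresentation`. The file name is carried over from the question's example.

```swift
import UIKit

// Sketch: build a URL in the app's Documents directory and write the
// PNG-encoded image there atomically.
func save(image: UIImage) throws {
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let fileURL = documents.appendingPathComponent("Image.png")
    try image.pngData()?.write(to: fileURL, options: .atomic)
}
```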

Record and play audio simultaneously in iOS

帅比萌擦擦* Submitted on 2019-12-19 05:45:05
Question: I am trying to play back recorded content while it is still being recorded. Currently I use AVAudioRecorder for recording and AVAudioPlayer for playing. When I try to play the content during recording, nothing plays; if I do the same after stopping the recording, everything works fine. Pseudocode for what I am doing:

```objc
AVAudioRecorder *recorder;  // initialized properly elsewhere
[recorder record];
NSError *error = nil;
NSURL *recordingPathUrl;    // contains the // …
```
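AVAudioPlayer cannot reliably read a file that AVAudioRecorder is still writing. A sketch of one alternative, assuming AVAudioEngine: write the input tap to a file while routing the same input to the output for live monitoring. The file name and location are assumptions.

```swift
import AVFoundation

// Sketch: .playAndRecord allows simultaneous input and output; the tap
// persists buffers to disk while the input->mixer connection plays them.
func startRecordingWithMonitoring(engine: AVAudioEngine) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, options: [.defaultToSpeaker])
    try session.setActive(true)

    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    let fileURL = FileManager.default.temporaryDirectory.appendingPathComponent("take.caf")
    let file = try AVAudioFile(forWriting: fileURL, settings: format.settings)

    // Each captured buffer is written to disk and also reaches the speaker.
    input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        try? file.write(from: buffer)
    }
    engine.connect(input, to: engine.mainMixerNode, format: format)
    try engine.start()
}
```

Note the feedback risk: monitoring through the built-in speaker while the microphone is live can howl, which is why headphones are usually assumed for this setup.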

iOS AVCaptureVideoPreviewLayer: capture current image

家住魔仙堡 Submitted on 2019-12-19 05:06:09
Question: Once the default iPhone Camera app takes a photo, a preview appears and the image animates to the camera-roll button. I am trying to replicate this animation:

```objc
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;

CALayer *viewLayer = self.vImagePreview.layer;
NSLog(@"viewLayer = %@", viewLayer);

captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = CGRectMake(0, 0, 322, 425);
[self // …
```
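Rather than snapshotting the preview layer, the current frame can be captured from the session itself and that UIImage animated toward the camera-roll button. A sketch with the modern AVCapturePhotoOutput API; the `PhotoGrabber` class name and `onImage` callback are illustrative, not from the question.

```swift
import AVFoundation
import UIKit

// Sketch: attach a photo output to the existing session, request a still,
// and hand the decoded UIImage to a callback for the animation.
final class PhotoGrabber: NSObject, AVCapturePhotoCaptureDelegate {
    let output = AVCapturePhotoOutput()
    var onImage: ((UIImage) -> Void)?

    func attach(to session: AVCaptureSession) {
        if session.canAddOutput(output) { session.addOutput(output) }
    }

    func grab() {
        output.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        onImage?(image)   // animate this toward the camera-roll button
    }
}
```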

How to separate the Y, U, and V planes from YUV bi-planar format in iOS?

一笑奈何 Submitted on 2019-12-19 04:09:48
Question: In my application I use AVCaptureVideo. I get video in kCVPixelFormatType_420YpCbCr8BiPlanarFullRange format. Now I am getting the Y plane and UV plane from the image buffer:

```objc
CVPlanarPixelBufferInfo_YCbCrBiPlanar *planar = CVPixelBufferGetBaseAddress(imageBuffer);
size_t yOffset = NSSwapBigLongToHost(planar->componentInfoY.offset);
size_t uvOffset = NSSwapBigLongToHost(planar->componentInfoCbCr.offset);
```

Here YUV is a bi-planar format (Y + UV). What is the UV plane? Is it in uuuu,vvvv format or uvuvuvuv format …
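In the 420YpCbCr8BiPlanar formats the second plane is interleaved CbCr (u v u v ...), at half the luma resolution in both dimensions. A sketch using the per-plane accessors, which avoid the manual offset arithmetic in the question:

```swift
import CoreVideo

// Sketch: lock the buffer, then read each plane's base address and stride.
// Plane 0 is Y (one byte per pixel); plane 1 is interleaved CbCr.
func inspectPlanes(of pixelBuffer: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let yPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0)
    let uvPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1)
    let yRowBytes = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    let uvRowBytes = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1)

    print("Y plane \(String(describing: yPlane)), rowBytes \(yRowBytes)")
    print("UV plane \(String(describing: uvPlane)), rowBytes \(uvRowBytes)")
}
```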