avfoundation

AVAudioRecorder records only the audio after interruption

Submitted by 狂风中的少年 on 2019-12-21 04:44:06
Question: In my application for recording and playing audio using AVAudioRecorder and AVAudioPlayer, I came across a scenario involving an incoming phone call. While a recording is in progress and a phone call comes in, only the audio captured after the phone call gets recorded. I want the recording made after the phone call to be a continuation of the audio recorded before it. I track the interruption occurring in the audio recorder using the AVAudioRecorderDelegate methods (void
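
A commonly suggested approach (not taken from the question itself) is to handle both interruption delegate callbacks and resume from the end-interruption one; if resuming overwrites the file on a given iOS version, each segment can be recorded to its own file and stitched afterwards with AVMutableComposition. A minimal sketch, assuming the delegate holds the recorder:

```objectivec
// Sketch: resuming an AVAudioRecorder after an interruption (e.g. a phone
// call). The delegate methods are real AVFoundation API (deprecated from
// iOS 8); the resume strategy is an assumption, not a confirmed fix.
#import <AVFoundation/AVFoundation.h>

- (void)audioRecorderBeginInterruption:(AVAudioRecorder *)recorder {
    // Recording is paused automatically by the system at this point.
}

- (void)audioRecorderEndInterruption:(AVAudioRecorder *)recorder
                         withOptions:(NSUInteger)flags {
    if (flags & AVAudioSessionInterruptionOptionShouldResume) {
        // -record after an interruption appends to the same file on most
        // iOS versions; if it overwrites instead, record each segment to
        // its own file and merge them later with AVMutableComposition.
        [recorder record];
    }
}
```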

How does AVPlayer / AVPlayerItem inform my application about unreachable network failure?

Submitted by 依然范特西╮ on 2019-12-21 04:38:09
Question: I'm using AVPlayer to implement a custom video player in an iOS application. To play video from the network I allocate a player: [[AVPlayer alloc] initWithURL:_URL]; create an asset: AVURLAsset *asset = [AVURLAsset URLAssetWithURL:self.URL options:@{AVURLAssetPreferPreciseDurationAndTimingKey : @(YES)}]; and load the playable key asynchronously: NSArray *keys = @[@"playable"]; [asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{ dispatch_async(dispatch_get_main_queue(), ^{ for
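
The usual way AVFoundation surfaces network failure is through the item's status and a failed-to-play notification; the KVO key, status constant, and notification name below are real AVFoundation API, while the surrounding handler logic is an illustrative sketch rather than code from the question:

```objectivec
// Sketch: surfacing network failures from an AVPlayerItem.
#import <AVFoundation/AVFoundation.h>

- (void)observeItem:(AVPlayerItem *)item {
    [item addObserver:self forKeyPath:@"status"
              options:NSKeyValueObservingOptionNew context:NULL];
    [[NSNotificationCenter defaultCenter]
        addObserver:self selector:@selector(itemFailedToPlay:)
               name:AVPlayerItemFailedToPlayToEndTimeNotification object:item];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    AVPlayerItem *item = (AVPlayerItem *)object;
    if (item.status == AVPlayerItemStatusFailed) {
        NSLog(@"Playback failed: %@", item.error); // unreachable-host errors land here
    }
}

- (void)itemFailedToPlay:(NSNotification *)note {
    // Fired when the item fails partway through playback (e.g. network drop).
    NSError *error = note.userInfo[AVPlayerItemFailedToPlayToEndTimeErrorKey];
    NSLog(@"Failed mid-stream: %@", error);
}
```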

iOS 11 AVPlayer crash when KVO

Submitted by 爷，独闯天下 on 2019-12-21 03:58:06
Question: I get a weird crash when using AVPlayer to play a remote video. From the crash log on Fabric, the app crashes on a system thread (com.apple.avfoundation.playerlayer.configuration). The crash log is below: Crashed: com.apple.avfoundation.playerlayer.configuration 0 libsystem_kernel.dylib 0x1839ac2e8 __pthread_kill + 8 1 libsystem_pthread.dylib 0x183ac12f8 pthread_kill$VARIANT$mp + 396 2 libsystem_c.dylib 0x18391afbc abort + 140 3 libsystem_malloc.dylib 0x1839e3ce4 szone_size + 634 4 QuartzCore
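
Crashes on the playerlayer.configuration thread are often attributed to tearing down an AVPlayer or AVPlayerLayer while observers or the layer are still attached. A defensive teardown sketch (an illustrative guess, not a confirmed fix for this particular log; property names are assumed):

```objectivec
// Sketch: detach KVO observers and the layer before releasing the player,
// so AVFoundation's configuration thread never touches a freed object.
- (void)tearDownPlayer {
    [self.playerItem removeObserver:self forKeyPath:@"status"];
    [self.player pause];
    [self.player replaceCurrentItemWithPlayerItem:nil];
    self.playerLayer.player = nil;          // detach the layer first
    [self.playerLayer removeFromSuperlayer];
    self.playerLayer = nil;
    self.player = nil;
}
```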

How to move video from application documents directory to camera roll?

Submitted by 落爺英雄遲暮 on 2019-12-21 03:01:39
Question: Hello, I have found some iPhone camera video-saving AVFoundation code examples on the net (and I'm also trying to write my own code, but haven't finished it yet). Such examples process and save the video (through AVAssetWriter) from the camera capture input to a file located in the Documents directory (which I assume is the only option?). But now I can't even see the files in that directory to check whether my video is there. I think I should move the video file to the 'camera roll'; how do I do that? Thanks
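
For this era of iOS, UIKit has a direct function for copying a finished movie file into the Camera Roll. A minimal sketch; the function names are real UIKit API, while the path construction and file name are assumptions for illustration:

```objectivec
// Sketch: copy a movie from the app's Documents directory into the
// Camera Roll once AVAssetWriter has finished writing it.
#import <UIKit/UIKit.h>

NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                      NSUserDomainMask, YES) firstObject];
NSString *moviePath = [docs stringByAppendingPathComponent:@"output.mov"]; // hypothetical name
if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath)) {
    // Pass a callback selector instead of nil/NULL to learn about failures.
    UISaveVideoAtPathToSavedPhotosAlbum(moviePath, nil, NULL, NULL);
}
```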

Mixing two Audio Files using AVComposition on iOS

Submitted by 放肆的年华 on 2019-12-21 03:01:14
Question: I'm trying to mix two audio files (laying one audio file on top of the other, not stitching them together), but I'm struggling to learn AVFoundation on iOS. I've followed this answer: How to merge Audio and video using AVMutableCompositionTrack And this is what I have: //NSURL *audioFilePath = [NSURL fileURLWithPath:@"var/mobile/Applications/822732B6-67B9-485F-BA44-FAACAB34C4FD/Documents/Coisir Cheoil10_09_2014_1429.m4a"]; NSURL *audioUrl = [NSURL fileURLWithPath:@"var/mobile/Applications
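
The key to overlaying (rather than stitching) is inserting each source file into its own composition track at kCMTimeZero, so both play simultaneously. A minimal sketch; `urlA`, `urlB`, and `outURL` are placeholder variables, and error handling is trimmed for brevity:

```objectivec
// Sketch: mix two audio files on top of each other with AVMutableComposition.
#import <AVFoundation/AVFoundation.h>

AVMutableComposition *mix = [AVMutableComposition composition];
for (NSURL *url in @[urlA, urlB]) {
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVAssetTrack *src = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    AVMutableCompositionTrack *dst =
        [mix addMutableTrackWithMediaType:AVMediaTypeAudio
                         preferredTrackID:kCMPersistentTrackID_Invalid];
    // Both tracks start at kCMTimeZero, so they overlap instead of stitching.
    [dst insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                 ofTrack:src atTime:kCMTimeZero error:nil];
}
AVAssetExportSession *export =
    [AVAssetExportSession exportSessionWithAsset:mix
                                      presetName:AVAssetExportPresetAppleM4A];
export.outputURL = outURL;
export.outputFileType = AVFileTypeAppleM4A;
[export exportAsynchronouslyWithCompletionHandler:^{ /* check export.status */ }];
```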

Observing values on AVPlayerItem in iOS9

Submitted by 馋奶兔 on 2019-12-21 02:47:12
Question: I have an app that uses AVPlayer to play an AVPlayerItem (video) from a remote URL. On iOS 6-8 I have been observing the AVPlayerItem's loadedTimeRanges value to be notified when the playerItem is ready to be played by the player. This also works when observing the item's duration, I believe. After updating to the iOS 9 beta, none of the values I observe on AVPlayerItem ever make it to the observeValueForKeyPath: method, just as if I weren't observing them at all. I am still being
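
One workaround often suggested for iOS 9-era KVO problems is to attach the observer to the AVPlayerItem itself with an explicit options mask and a context pointer, and to add it before the item is handed to the player. A minimal sketch of that setup (an assumption, not a confirmed fix for this report):

```objectivec
// Sketch: explicit-options KVO registration on the AVPlayerItem.
static void *PlayerItemContext = &PlayerItemContext;

[item addObserver:self
       forKeyPath:@"loadedTimeRanges"
          options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionInitial
          context:PlayerItemContext];
// Only then attach the item:
[self.player replaceCurrentItemWithPlayerItem:item];
```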

Streaming images over Bonjour between two iOS devices

Submitted by 三世轮回 on 2019-12-21 02:41:39
Question: My goal is to stream images captured from an AVCaptureInput from one iOS device to another via Bonjour. Here is my current method: 1) Capture a frame from the video input: - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection { /*code to convert sampleBuffer into UIImage */ NSData * imageData = UIImageJPEGRepresentation(image,1.0); [connection sendImage:image]; } 2) Send over a TCP connection (from http:
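
Since TCP is a byte stream with no frame boundaries, each JPEG usually needs a length prefix so the receiver knows where one image ends and the next begins. A minimal sketch of the sending side; `outputStream` is a hypothetical open NSOutputStream obtained from the Bonjour-resolved connection:

```objectivec
// Sketch: length-prefix each JPEG frame before writing it to the stream.
NSData *imageData = UIImageJPEGRepresentation(image, 0.5); // lower quality = less bandwidth
uint32_t length = CFSwapInt32HostToBig((uint32_t)imageData.length);
[outputStream write:(const uint8_t *)&length maxLength:sizeof(length)];
[outputStream write:imageData.bytes maxLength:imageData.length];
// Receiver: read 4 bytes, byte-swap, then read exactly that many bytes.
```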

Is it possible to do video file “chunking”/segmenting while still recording using AVFoundation?

Submitted by 删除回忆录丶 on 2019-12-21 02:05:34
Question: I am attempting to use AVFoundation to record video on OS X, but it waits until the end of the recording to save the file. I want it to save whatever it has captured every 5/10/X seconds. I need this so that, as the video files are saved, I can stream the segments to a server while the video is still recording, letting me serve up "almost live" video from the server. Thanks for any help you may be able to provide! Answer 1: You can ask AVFoundation to vend the frames to you as
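
Following the answer's direction of having AVFoundation vend the frames, one way to segment is to rotate AVAssetWriter instances from the sample-buffer callback. A sketch under that assumption; `segmentStart`, `writer`, `writerInput`, and `startNewWriterAtTime:` are hypothetical bookkeeping, not AVFoundation API:

```objectivec
// Sketch: finish the current AVAssetWriter every ~5 s and start a new one,
// so finished segment files can be uploaded while recording continues.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CMTime ts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (CMTimeGetSeconds(CMTimeSubtract(ts, self.segmentStart)) >= 5.0) {
        AVAssetWriter *finished = self.writer;
        [finished finishWritingWithCompletionHandler:^{
            // upload finished.outputURL while the next segment records
        }];
        [self startNewWriterAtTime:ts]; // creates a fresh AVAssetWriter + input
    }
    if (self.writerInput.isReadyForMoreMediaData) {
        [self.writerInput appendSampleBuffer:sampleBuffer];
    }
}
```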

Render dynamic text onto CVPixelBufferRef while recording video

Submitted by 喜夏-厌秋 on 2019-12-21 01:12:51
Question: I'm recording video and audio using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput, and in the captureOutput:didOutputSampleBuffer:fromConnection: delegate method I want to draw text onto each individual sample buffer I receive from the video connection. The text changes with roughly every frame (it's a stopwatch label), and I want it recorded on top of the captured video data. Here's what I've been able to come up with so far: //1. CVPixelBufferRef pixelBuffer =
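
The usual technique is to wrap the locked pixel buffer's base address in a CGBitmapContext and draw into it in place before the buffer reaches the asset writer. A minimal sketch, assuming the capture output is configured for kCVPixelFormatType_32BGRA:

```objectivec
// Sketch: draw text directly into a BGRA pixel buffer from the
// sample-buffer callback, before appending it to the writer.
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(
    CVPixelBufferGetBaseAddress(pixelBuffer),
    CVPixelBufferGetWidth(pixelBuffer), CVPixelBufferGetHeight(pixelBuffer),
    8, CVPixelBufferGetBytesPerRow(pixelBuffer), cs,
    kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
// Draw the stopwatch label here, e.g. with Core Text, or via
// UIGraphicsPushContext(ctx) and -[NSString drawAtPoint:withAttributes:].
CGContextRelease(ctx);
CGColorSpaceRelease(cs);
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
```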
