AVFoundation

Pause & resume video capture to the same file with AVFoundation in iOS

。_饼干妹妹 submitted on 2020-01-01 04:58:09
Question: I'm trying to figure out how to implement pausing and resuming video capture repeatedly within a single session, with each new segment (the segments captured after each pause) appended to the same video file, using AVFoundation. Currently, every time I press "stop" and then "record" again, it just saves a new video file to my iPhone's album and starts capturing to a separate/new file. I need to be able to press the "record/stop" button over and over... only capture video & audio when…
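The excerpt cuts off here, but one common approach to single-file pause/resume (an assumption, not the asker's confirmed solution) is to write through AVAssetWriter and simply skip sample buffers while paused, shifting timestamps so the segments butt up against each other. A minimal sketch; self.writerInput, self.paused, self.resumeNeeded, self.timeOffset and self.lastTime are hypothetical properties, with self.timeOffset starting at kCMTimeZero:

    // Sketch: in the AVCaptureVideoDataOutput delegate, drop buffers while paused
    // and re-stamp the ones after a resume so the single output file has no gap.
    - (void)captureOutput:(AVCaptureOutput *)output
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

        if (self.paused) {
            self.resumeNeeded = YES;   // a gap is accumulating while paused
            return;
        }
        if (self.resumeNeeded) {
            // Grow the running offset by the duration of the pause just ended.
            self.timeOffset = CMTimeAdd(self.timeOffset,
                                        CMTimeSubtract(pts, self.lastTime));
            self.resumeNeeded = NO;
        }
        self.lastTime = pts;

        // Copy the buffer with a shifted timestamp, then append it to the one
        // long-lived AVAssetWriterInput instead of starting a new file.
        CMSampleTimingInfo timing;
        CMSampleBufferGetSampleTimingInfo(sampleBuffer, 0, &timing);
        timing.presentationTimeStamp = CMTimeSubtract(pts, self.timeOffset);
        CMSampleBufferRef adjusted = NULL;
        CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sampleBuffer,
                                              1, &timing, &adjusted);
        if (adjusted && self.writerInput.readyForMoreMediaData) {
            [self.writerInput appendSampleBuffer:adjusted];
        }
        if (adjusted) {
            CFRelease(adjusted);
        }
    }

On iOS, AVCaptureMovieFileOutput starts a fresh file for every recording, which is why the buffer-level route through AVAssetWriter is usually taken for this.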

Best path from AVPlayerItemVideoOutput to an OpenGL texture

孤街浪徒 submitted on 2020-01-01 03:27:26
Question: I've been pulling my hair out trying to figure out the current best path from AVFoundation video to an OpenGL texture. Most of what I find is related to iOS, and I can't seem to make it work well on OS X. First of all, this is how I set up the videoOutput:

    NSDictionary *pbOptions = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kCVPixelFormatType_422YpCbCr8], kCVPixelBufferPixelFormatTypeKey,
        [NSDictionary dictionary], kCVPixelBufferIOSurfacePropertiesKey,
        nil];
    self…
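On OS X, one workable route (an assumption here; the excerpt is truncated before any answer) is to wrap the IOSurface-backed pixel buffer in a GL texture through a CVOpenGLTextureCache, avoiding a copy. A sketch assuming videoOutput is the AVPlayerItemVideoOutput above and textureCache was created once against the GL context with CVOpenGLTextureCacheCreate:

    // Sketch (OS X): pull the current pixel buffer from AVPlayerItemVideoOutput
    // and wrap it in an OpenGL texture via the texture cache.
    CMTime itemTime = [videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([videoOutput hasNewPixelBufferForItemTime:itemTime]) {
        CVPixelBufferRef pixelBuffer =
            [videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
        CVOpenGLTextureRef texture = NULL;
        CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                   textureCache,
                                                   pixelBuffer,
                                                   NULL,
                                                   &texture);
        if (texture) {
            GLenum target = CVOpenGLTextureGetTarget(texture); // often GL_TEXTURE_RECTANGLE
            GLuint name   = CVOpenGLTextureGetName(texture);
            glBindTexture(target, name);
            // ... draw with the texture ...
            CVOpenGLTextureRelease(texture);
        }
        CVPixelBufferRelease(pixelBuffer);
        CVOpenGLTextureCacheFlush(textureCache, 0);
    }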

How do you get smooth high-rate playback from AVPlayer?

£可爱£侵袭症+ submitted on 2020-01-01 03:01:30
Question: AVPlayer has a property called rate that is meant to control the playback rate. 1.0 is normal speed, while values like 2.0 or 5.0 should play back at 2x and 5x respectively. Whenever I set a playback rate higher than 1.0 (say 10.0), playback is very choppy and it looks like a large number of frames are being dropped because the player can't keep up. However, the same values in QuickTime Player (with the same movie) produce smooth playback at rates of 2x, 5x, 10x, 30x and 60x (as…
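Rates above 2.0 are only honored smoothly when the item advertises support for them, so checking the capability flags before assigning rate is worthwhile (a hedged sketch; it may not fully explain the QuickTime discrepancy the asker sees):

    // Sketch: consult AVPlayerItem's capability flags before requesting a high rate.
    AVPlayerItem *item = player.currentItem;
    if (item.canPlayFastForward) {
        player.rate = 10.0;  // per the docs, rates above 2.0 need canPlayFastForward
    } else if (item.status == AVPlayerItemStatusReadyToPlay) {
        player.rate = 2.0;   // rates up to 2.0 are generally available once ready
    }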

Switch front/back camera with AVCaptureSession

自闭症网瘾萝莉.ら submitted on 2020-01-01 02:17:08
Question: I'm following the only answer this has on SO (Switch cameras with avcapturesession), but cameraWithPosition does not seem to work. Is it deprecated?

    //Get new input
    AVCaptureDevice *newCamera = nil;
    if (((AVCaptureDeviceInput *)currentCameraInput).device.position == AVCaptureDevicePositionBack) {
        newCamera = [self cameraWithPosition:AVCaptureDevicePositionFront];
    } else {
        newCamera = [self cameraWithPosition:AVCaptureDevicePositionBack];
    }

Answer 1: What you need to do is reconfigure your…
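The cameraWithPosition: helper from that answer was typically built on devicesWithMediaType:, which is deprecated as of iOS 10. A sketch of a replacement using AVCaptureDeviceDiscoverySession, together with the session reconfiguration the truncated answer begins to describe:

    // Sketch: find a camera by position (iOS 10+).
    - (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
        AVCaptureDeviceDiscoverySession *discovery =
            [AVCaptureDeviceDiscoverySession
                discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                      mediaType:AVMediaTypeVideo
                                       position:position];
        return discovery.devices.firstObject;
    }

    // Swap inputs inside begin/commitConfiguration so the change is atomic.
    [session beginConfiguration];
    [session removeInput:currentCameraInput];
    NSError *error = nil;
    AVCaptureDeviceInput *newInput =
        [AVCaptureDeviceInput deviceInputWithDevice:newCamera error:&error];
    if (newInput && [session canAddInput:newInput]) {
        [session addInput:newInput];
    }
    [session commitConfiguration];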

Record Audio/Video with AVCaptureSession and Playback Audio simultaneously?

独自空忆成欢 submitted on 2020-01-01 01:16:52
Question: I'm trying to create an iOS app that can record audio and video while simultaneously outputting audio to the speakers. For the recording and preview, I'm using an AVCaptureSession, an AVCaptureConnection for each of video and audio, and an AVAssetWriterInput for each of video and audio. I basically achieved this by following the RosyWriter example code. Prior to setting up recording in this fashion, I was using AVAudioPlayer to play audio. Now, if I am in the middle of capturing (not…
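The excerpt ends mid-sentence, but simultaneous playback and recording usually hinges on the shared AVAudioSession configuration rather than on the capture pipeline itself. A sketch of the kind of setup involved (an assumption, not the question's confirmed fix):

    // Sketch: configure the shared audio session so recording and playback
    // can run at the same time without interrupting each other.
    NSError *error = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord
                  withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker |
                              AVAudioSessionCategoryOptionMixWithOthers
                        error:&error];
    [audioSession setActive:YES error:&error];

Note that AVCaptureSession manages the app's audio session on its own unless its automaticallyConfiguresApplicationAudioSession property is set to NO.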

How to extract metadata from audio files on iOS

狂风中的少年 submitted on 2019-12-31 23:00:34
Question: I'm trying to extract metadata from mp3 and m4a files using the AVFoundation framework. This is the test code:

    + (void)printMetadataForFileAtPath:(NSString *)path {
        NSURL *url = [NSURL fileURLWithPath:path];
        AVAsset *asset = [AVURLAsset assetWithURL:url];
        NSArray *availableFormats = [asset availableMetadataFormats];
        NSLog(@"Available formats: %@", availableFormats);
        NSArray *iTunesMetadata = [asset metadataForFormat:AVMetadataFormatiTunesMetadata];
        for (AVMetadataItem *item in iTunesMetadata)…
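For mp3 files the metadata usually arrives as ID3 rather than iTunes metadata, so iterating every available format, or falling back to commonMetadata, is a safer test than querying AVMetadataFormatiTunesMetadata alone. A small sketch continuing the method above:

    // Sketch: walk every format the asset actually reports, then the common keys.
    for (NSString *format in asset.availableMetadataFormats) {
        for (AVMetadataItem *item in [asset metadataForFormat:format]) {
            NSLog(@"%@ / %@ = %@", format, item.commonKey, item.value);
        }
    }
    for (AVMetadataItem *item in asset.commonMetadata) {
        NSLog(@"common %@ = %@", item.commonKey, item.value);
    }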

Set AVAudioEngine Input and Output Devices

霸气de小男生 submitted on 2019-12-31 22:26:29
Question: I've been playing around with Apple's shiny new AVFoundation library, but so far I've been unable to set the input or output devices (e.g. a USB sound card) used by an AVAudioEngine, and I can't seem to find anything in the documentation to say it's even possible. Does anyone have any experience with this?

Answer 1: OK, after re-reading the docs for the 10th time, I noticed AVAudioEngine has members inputNode and outputNode (not sure how I missed that!). The following code seems to do the job:…
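The code the answer refers to is cut off; on OS X the usual technique (reconstructed here as an assumption) is to set a Core Audio device ID on the audio unit behind the engine's inputNode or outputNode:

    // Sketch (OS X): point the engine's input node at a specific AudioDeviceID,
    // e.g. a USB sound card found via AudioObjectGetPropertyData.
    AVAudioEngine *engine = [[AVAudioEngine alloc] init];
    AudioDeviceID deviceID = /* the device ID you looked up */ 0;
    AudioUnit unit = engine.inputNode.audioUnit;
    OSStatus err = AudioUnitSetProperty(unit,
                                        kAudioOutputUnitProperty_CurrentDevice,
                                        kAudioUnitScope_Global,
                                        0,
                                        &deviceID,
                                        sizeof(deviceID));
    if (err != noErr) {
        NSLog(@"Failed to set input device: %d", (int)err);
    }

The same call against engine.outputNode.audioUnit should select the output device.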

First frame of a video using AVFoundation

霸气de小男生 submitted on 2019-12-31 10:23:10
Question: I'm trying to get the first frame of a video using the classes in AVFoundation, but it appears not to be getting an image at all. My code currently looks like this:

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL URLWithString:videoPath] options:nil];
    AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
    UIImage *image = [UIImage imageWithCGImage:[imageGenerator copyCGImageAtTime:CMTimeMake(0, 1) actualTime:nil error:nil]];
    [videoFrame…
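A frequent cause of a nil image here (suggested as a likely culprit, since the excerpt is truncated) is building the URL with URLWithString:, which returns nil or an unusable URL for a plain file path; fileURLWithPath: is the constructor local paths need, and inspecting the NSError pinpoints other failures:

    // Sketch: build a proper file URL and check the error instead of passing nil.
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:videoPath]
                                            options:nil];
    AVAssetImageGenerator *generator =
        [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;  // respect rotation metadata
    NSError *error = nil;
    CGImageRef cgImage = [generator copyCGImageAtTime:CMTimeMake(0, 1)
                                           actualTime:NULL
                                                error:&error];
    if (!cgImage) {
        NSLog(@"Frame extraction failed: %@", error);
    } else {
        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);  // copyCGImageAtTime returns a +1 reference
        // ... use image ...
    }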

How to get bytes from a CMSampleBufferRef to send over the network

那年仲夏 submitted on 2019-12-31 07:56:52
Question: I'm capturing video using the AVFoundation framework, with the help of the Apple documentation: http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/03_MediaCapture.html%23//apple_ref/doc/uid/TP40010188-CH5-SW2 So far I did the following things (the sketch after this list shows the step it is heading toward):

1. Created a videoCaptureDevice
2. Created an AVCaptureDeviceInput and set videoCaptureDevice
3. Created an AVCaptureVideoDataOutput and implemented its delegate
4. Created an AVCaptureSession, set the input to the AVCaptureDeviceInput, and set…
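Extracting raw bytes inside the AVCaptureVideoDataOutput delegate typically looks like the sketch below, assuming a packed pixel format such as kCVPixelFormatType_32BGRA was requested on the output (planar formats need per-plane handling instead):

    // Sketch: copy the pixel data out of the sample buffer's image buffer.
    // Raw frames are large; real apps compress before sending them anywhere.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    NSData *frameData = [NSData dataWithBytes:baseAddress
                                       length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    // frameData can now be handed to the networking layer.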

AVAudioPlayer plays music audio through the phone-call earpiece instead of the speaker

一笑奈何 submitted on 2019-12-31 07:32:48
Question: I have simple code using AVAudioPlayer to play a .caf audio file:

    // AppDelegate.h
    AVAudioPlayer *_audioPlayer;

    // AppDelegate.m
    - (void)playAudio {
        NSArray *dirPaths;
        NSString *docsDir;
        dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                       NSUserDomainMask, YES);
        docsDir = dirPaths[0];
        NSString *soundFilePath = [docsDir stringByAppendingPathComponent:audiosrc];
        NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
        _audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL…
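The code stops mid-initializer; when this symptom appears (audio coming out of the quiet earpiece used for phone calls rather than the main speaker), the fix is usually in AVAudioSession routing rather than in AVAudioPlayer itself. A hedged sketch of the common remedies:

    // Sketch: route playback to the main speaker via the shared audio session.
    NSError *error = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    // Playback routes to the speaker by default.
    [audioSession setCategory:AVAudioSessionCategoryPlayback error:&error];
    // If the category must stay PlayAndRecord, override the output port instead:
    // [audioSession overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker
    //                                 error:&error];
    [audioSession setActive:YES error:&error];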