AVFoundation

SPS values for H.264 stream in iPhone

 ̄綄美尐妖づ submitted on 2019-12-22 00:33:35

Question: Can someone point me to documentation that will help me get correct SPS and PPS values for iPhone?

Answer 1: The question is a bit unclear... The Picture Parameter Set is described in chapter 7.3.2.2 of the latest ITU-T release of the standard; the Sequence Parameter Set is described in chapter 7.3.2.1.

Answer 2: You can encode a single frame to a file and then extract the SPS and PPS from that file. I have an example that shows how to do exactly that at http://www.gdcl.co.uk/2013/02/20/iOS-Video-Encoding.html
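A hedged sketch of the in-memory route, as an alternative to the file round-trip suggested above: when you have an encoded H.264 sample buffer (e.g. from a VideoToolbox compression callback), the SPS and PPS live in its format description and can be read out directly. This is an illustration, not code from the answers; `sampleBuffer` is assumed to come from your encoder.

```swift
import CoreMedia

// Pull the SPS (parameter set index 0) and PPS (index 1) out of an encoded
// H.264 sample buffer's format description.
func extractSPSandPPS(from sampleBuffer: CMSampleBuffer) -> (sps: Data, pps: Data)? {
    guard let format = CMSampleBufferGetFormatDescription(sampleBuffer) else { return nil }

    var spsPointer: UnsafePointer<UInt8>?
    var spsSize = 0
    var ppsPointer: UnsafePointer<UInt8>?
    var ppsSize = 0
    var count = 0
    var nalHeaderLength: Int32 = 0

    guard CMVideoFormatDescriptionGetH264ParameterSetAtIndex(
            format, parameterSetIndex: 0,
            parameterSetPointerOut: &spsPointer, parameterSetSizeOut: &spsSize,
            parameterSetCountOut: &count, nalUnitHeaderLengthOut: &nalHeaderLength) == noErr,
          CMVideoFormatDescriptionGetH264ParameterSetAtIndex(
            format, parameterSetIndex: 1,
            parameterSetPointerOut: &ppsPointer, parameterSetSizeOut: &ppsSize,
            parameterSetCountOut: &count, nalUnitHeaderLengthOut: &nalHeaderLength) == noErr,
          let sps = spsPointer, let pps = ppsPointer else { return nil }

    return (Data(bytes: sps, count: spsSize), Data(bytes: pps, count: ppsSize))
}
```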

Swift audio recording and tableview display

☆樱花仙子☆ submitted on 2019-12-22 00:04:15

Question: I am having trouble recording audio and displaying it in a table view. I can record and immediately play the audio back, but it doesn't seem to be stored on the device permanently, so I am unable to load it for the table view. The directory also seems to change each time the app is opened. How can I fix my code so recordings are saved permanently and can be recalled when populating the table view rows? func record() { let audioSession:AVAudioSession = AVAudioSession.sharedInstance() if (audioSession
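A likely cause (an assumption, since the question's code is truncated) is that the app's sandbox container path can change between launches, so an absolute URL captured at record time goes stale. A sketch of the usual fix: write into the Documents directory, persist only file names, and rebuild full URLs on every launch. All names here are illustrative.

```swift
import AVFoundation

// Rebuild the URL from the Documents directory each time instead of storing it.
func documentsURL(forFileName name: String) -> URL {
    let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    return docs.appendingPathComponent(name)
}

func startRecording() throws -> AVAudioRecorder {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default)
    try session.setActive(true)

    let fileName = "recording-\(Date().timeIntervalSince1970).m4a"
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 1
    ]
    let recorder = try AVAudioRecorder(url: documentsURL(forFileName: fileName),
                                       settings: settings)
    recorder.record()
    return recorder
}
```

To populate the table view, list the surviving file names each launch, e.g. `FileManager.default.contentsOfDirectory(atPath:)` over the Documents directory, and map each name back to a URL with the helper above.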

Remove audio from a video file

我的梦境 submitted on 2019-12-21 23:29:28

Question: I'm trying to remove the audio track from a MOV video in my phone's library. I know I can mute the audio on playback, but I plan to upload user videos, and it makes sense to strip the audio and reduce the file size. I've tried converting the Objective-C code from THIS ANSWER to Swift, but either I botched the conversion or it simply doesn't remove the audio from the file. Any help would be greatly appreciated.

Answer 1: The top upvoted answer didn't work for me + I had issues with video
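A hedged sketch of the standard composition-based approach this question is after: copy only the video track into an `AVMutableComposition` and export it, which drops the audio track entirely. File names and the passthrough preset are assumptions.

```swift
import AVFoundation

// Build a composition containing only the source's video track, then export.
func removeAudio(from sourceURL: URL, to outputURL: URL,
                 completion: @escaping (Bool) -> Void) {
    let asset = AVAsset(url: sourceURL)
    let composition = AVMutableComposition()

    guard let videoTrack = asset.tracks(withMediaType: .video).first,
          let compositionTrack = composition.addMutableTrack(
              withMediaType: .video,
              preferredTrackID: kCMPersistentTrackID_Invalid) else {
        completion(false); return
    }

    do {
        try compositionTrack.insertTimeRange(
            CMTimeRange(start: .zero, duration: asset.duration),
            of: videoTrack, at: .zero)
        // Preserve rotation metadata so the exported video isn't sideways.
        compositionTrack.preferredTransform = videoTrack.preferredTransform
    } catch {
        completion(false); return
    }

    guard let export = AVAssetExportSession(
            asset: composition,
            presetName: AVAssetExportPresetPassthrough) else {
        completion(false); return
    }
    export.outputURL = outputURL
    export.outputFileType = .mov
    export.exportAsynchronously {
        completion(export.status == .completed)
    }
}
```

The passthrough preset avoids re-encoding the video, so the only size reduction comes from the dropped audio track.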

AVCaptureSession preset creates a photo that is too big

久未见 submitted on 2019-12-21 22:19:24

Question: I have a photo-taking app that uses AVFoundation. Once an image is captured, I have to run two Core Graphics methods to properly rotate and crop it. When testing this with the typical AVFoundation setup for capturing camera photos with a session preset of AVCaptureSessionPresetPhoto, I kept receiving a ton of memory warnings; I spent two weeks trying to solve them and finally gave up. I ended up ditching the typical
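Two common mitigations for this pattern, sketched under assumptions (the question's own code is truncated): capture at a smaller session preset when full sensor resolution isn't needed, and wrap the per-image Core Graphics work in an `autoreleasepool` so intermediate buffers are released promptly instead of piling up.

```swift
import AVFoundation
import UIKit

// 1. A smaller preset than .photo means smaller capture buffers.
let session = AVCaptureSession()
if session.canSetSessionPreset(.high) {
    session.sessionPreset = .high
}

// 2. Drain Core Graphics temporaries after each image.
func rotatedAndCropped(_ image: UIImage, to rect: CGRect) -> UIImage? {
    autoreleasepool {
        UIGraphicsBeginImageContextWithOptions(rect.size, false, image.scale)
        defer { UIGraphicsEndImageContext() }
        // Drawing at a negative offset crops the image to `rect`.
        image.draw(at: CGPoint(x: -rect.origin.x, y: -rect.origin.y))
        return UIGraphicsGetImageFromCurrentImageContext()
    }
}
```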

Is it possible to add own metadata in captured Images in Swift

允我心安 submitted on 2019-12-21 22:01:28

Question: I'm very new to Swift and iOS programming. As mentioned above, I'd like to insert my own metadata into captured images before I save them to the album. I'm trying to get this done with the code below, but the saved image does not contain my own metadata, only the generated metadata. Can anybody tell me what I'm doing wrong? Or is it not possible to add a custom metadata table to captured images? Thanks a lot for your help. @IBAction func btnPressed(sender: UIButton) { capturePicture() } func
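One approach that does persist custom fields, sketched as an illustration rather than a fix for the (truncated) code above: re-embed the captured JPEG data with your own EXIF entries merged in via `CGImageDestination`. The user-comment key is just one example of a writable field.

```swift
import ImageIO
import MobileCoreServices  // for kUTTypeJPEG on older SDKs

// Merge a custom EXIF user comment into existing JPEG data and return the
// re-encoded bytes, ready to be written to the photo album.
func jpegData(_ data: Data, addingEXIFComment comment: String) -> Data? {
    guard let source = CGImageSourceCreateWithData(data as CFData, nil) else { return nil }

    var metadata = (CGImageSourceCopyPropertiesAtIndex(source, 0, nil)
                    as? [CFString: Any]) ?? [:]
    var exif = (metadata[kCGImagePropertyExifDictionary] as? [CFString: Any]) ?? [:]
    exif[kCGImagePropertyExifUserComment] = comment
    metadata[kCGImagePropertyExifDictionary] = exif

    let output = NSMutableData()
    guard let dest = CGImageDestinationCreateWithData(output, kUTTypeJPEG, 1, nil) else {
        return nil
    }
    // Copies the image and writes the merged metadata alongside it.
    CGImageDestinationAddImageFromSource(dest, source, 0, metadata as CFDictionary)
    guard CGImageDestinationFinalize(dest) else { return nil }
    return output as Data
}
```

A common pitfall this sidesteps: saving a `UIImage` via `UIImageWriteToSavedPhotosAlbum` re-encodes the pixels and discards any metadata you attached earlier, so the metadata has to travel with the encoded data itself.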

How to get CMSampleBufferRef from AudioQueueBufferRef

两盒软妹~` submitted on 2019-12-21 21:31:06

Question: I am using a private library made for live broadcasting from an iPhone. Every time a frame is recorded it calls a delegate function: void MyAQInputCallback(void *inUserData, AudioQueueRef inQueue, AudioQueueBufferRef inBuffer, const AudioTimeStamp *inStartTime, UInt32 inNumPackets, const AudioStreamPacketDescription *inPacketDesc); Now how can I append this inBuffer to my AVAssetWriterInput as usual: [self.audioWriterInput appendSampleBuffer:sampleBuffer]; I think maybe convert
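A heavily hedged sketch of the usual conversion path (assumptions throughout: the `asbd` is the `AudioStreamBasicDescription` the queue was configured with, and the timestamp is derived from `inStartTime`): copy the queue buffer's bytes into a `CMBlockBuffer`, then wrap that in a `CMSampleBuffer` that `AVAssetWriterInput` can accept.

```swift
import CoreMedia
import AudioToolbox

// Build a CMSampleBuffer from an AudioQueue buffer so it can be appended to
// an AVAssetWriterInput. Returns nil on any Core Media error.
func makeSampleBuffer(from inBuffer: AudioQueueBufferRef,
                      packetCount: UInt32,
                      packetDescs: UnsafePointer<AudioStreamPacketDescription>?,
                      asbd: inout AudioStreamBasicDescription,
                      presentationTime: CMTime) -> CMSampleBuffer? {
    // 1. Describe the audio format.
    var format: CMAudioFormatDescription?
    guard CMAudioFormatDescriptionCreate(
            allocator: kCFAllocatorDefault, asbd: &asbd,
            layoutSize: 0, layout: nil,
            magicCookieSize: 0, magicCookie: nil,
            extensions: nil, formatDescriptionOut: &format) == noErr,
          let formatDesc = format else { return nil }

    // 2. Copy the queue buffer's bytes into a block buffer the sample buffer owns.
    let dataSize = Int(inBuffer.pointee.mAudioDataByteSize)
    var blockBuffer: CMBlockBuffer?
    guard CMBlockBufferCreateWithMemoryBlock(
            allocator: kCFAllocatorDefault, memoryBlock: nil,
            blockLength: dataSize, blockAllocator: kCFAllocatorDefault,
            customBlockSource: nil, offsetToData: 0,
            dataLength: dataSize, flags: 0,
            blockBufferOut: &blockBuffer) == noErr,
          let block = blockBuffer else { return nil }
    CMBlockBufferReplaceDataBytes(with: inBuffer.pointee.mAudioData,
                                  blockBuffer: block,
                                  offsetIntoDestination: 0,
                                  dataLength: dataSize)

    // 3. Wrap everything in a sample buffer.
    var sampleBuffer: CMSampleBuffer?
    guard CMAudioSampleBufferCreateWithPacketDescriptions(
            allocator: kCFAllocatorDefault, dataBuffer: block,
            dataReady: true, makeDataReadyCallback: nil, refcon: nil,
            formatDescription: formatDesc,
            sampleCount: CMItemCount(packetCount),
            presentationTimeStamp: presentationTime,
            packetDescriptions: packetDescs,
            sampleBufferOut: &sampleBuffer) == noErr else { return nil }
    return sampleBuffer
}
```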

UIImage Orientation Swift

 ̄綄美尐妖づ submitted on 2019-12-21 21:24:30

Question: I have written this code to capture an image using the AVFoundation library in Swift: @IBAction func cameraButtonWasPressed(sender: AnyObject) { if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo){ stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection){ (imageSampleBuffer : CMSampleBuffer!, _) in let imageDataJpeg = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer) var pickedImage: UIImage = UIImage(data:
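Assuming the (truncated) problem is the usual one with still-image capture, i.e. the resulting `UIImage` appears rotated, the standard fix is to redraw the image once so its pixel data matches `.up` orientation regardless of the EXIF orientation the camera recorded:

```swift
import UIKit

// Redraw the image into a fresh context; drawing honors imageOrientation,
// so the output's pixels are physically in .up orientation.
func normalizedImage(_ image: UIImage) -> UIImage {
    if image.imageOrientation == .up { return image }
    UIGraphicsBeginImageContextWithOptions(image.size, false, image.scale)
    defer { UIGraphicsEndImageContext() }
    image.draw(in: CGRect(origin: .zero, size: image.size))
    return UIGraphicsGetImageFromCurrentImageContext() ?? image
}
```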

How to convert NSData object with JPEG data into CVPixelBufferRef in OS X?

爱⌒轻易说出口 submitted on 2019-12-21 20:45:31

Question: As the title says. Searching Google and Stack Overflow turns up resources for converting UIImage, or for converting an NSImage FROM a CVPixelBufferRef. What I want is to convert raw JPEG data TO a CVPixelBufferRef, so that I can generate a movie file from a live JPEG stream.

Answer 1: You can use the following method to convert from NSImage to CVPixelBufferRef: - (CVPixelBufferRef)newPixelBufferFromNSImage:(NSImage*)image { CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName
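A Swift sketch of the same idea as the (truncated) Objective-C answer, going straight from JPEG bytes rather than via `NSImage`: decode with `NSBitmapImageRep`, create a pixel buffer, and render the decoded `CGImage` into it. The 32ARGB pixel format and the compatibility attributes are assumptions that suit `AVAssetWriter` input.

```swift
import AppKit
import CoreVideo

// Decode JPEG data and paint it into a newly created CVPixelBuffer.
func pixelBuffer(fromJPEGData data: Data) -> CVPixelBuffer? {
    guard let rep = NSBitmapImageRep(data: data),
          let cgImage = rep.cgImage else { return nil }

    let width = cgImage.width, height = cgImage.height
    let attrs: [CFString: Any] = [
        kCVPixelBufferCGImageCompatibilityKey: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true
    ]
    var buffer: CVPixelBuffer?
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_32ARGB,
                              attrs as CFDictionary, &buffer) == kCVReturnSuccess,
          let pixelBuffer = buffer else { return nil }

    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    // Draw the decoded image directly into the pixel buffer's backing memory.
    guard let context = CGContext(
            data: CVPixelBufferGetBaseAddress(pixelBuffer),
            width: width, height: height,
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue) else { return nil }
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    return pixelBuffer
}
```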

How to monitor audio input on iOS using Swift - example?

只谈情不闲聊 submitted on 2019-12-21 20:29:00

Question: I want to write a simple app that 'does something' when the sound level at the mic reaches a certain level, showing the audio input levels for extra credit. I can't find any examples in Swift that get to this. I don't want to record, just monitor. I have been checking out the docs on the AVFoundation classes but can't get off the ground. Thanks.

Answer 1: You can use the code below: func initalizeRecorder () { do { try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord) try
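A sketch along the same lines as the (truncated) answer: `AVAudioRecorder` exposes input metering without you having to keep the recording, so enable metering and poll `averagePower(forChannel:)` on a timer. The scratch file name, threshold, and polling interval are illustrative.

```swift
import AVFoundation

// Poll the mic level and fire a callback when it crosses a threshold.
final class LevelMonitor {
    private var recorder: AVAudioRecorder?
    private var timer: Timer?

    func start(threshold: Float = -20, onLoud: @escaping () -> Void) throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .default)
        try session.setActive(true)

        // Recording goes to a throwaway file; only the meters are of interest.
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("monitor.caf")
        recorder = try AVAudioRecorder(url: url, settings: [:])
        recorder?.isMeteringEnabled = true
        recorder?.record()

        timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] _ in
            guard let recorder = self?.recorder else { return }
            recorder.updateMeters()
            // averagePower is in dBFS: 0 is full scale, more negative is quieter.
            if recorder.averagePower(forChannel: 0) > threshold {
                onLoud()
            }
        }
    }

    func stop() {
        timer?.invalidate()
        recorder?.stop()
    }
}
```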

How to route to kAudioSessionProperty_OverrideCategoryEnableBluetoothInput without using AudioSessionSetProperty

萝らか妹 submitted on 2019-12-21 20:24:27

Question: My iOS 6 working code to set Bluetooth as an input: // create and set up the audio session AVAudioSession* audioSession = [AVAudioSession sharedInstance]; [audioSession setCategory: AVAudioSessionCategoryPlayAndRecord error: nil]; [audioSession setActive: YES error: nil]; // set up for bluetooth microphone input UInt32 allowBluetoothInput = 1; OSStatus stat = 0; stat = AudioSessionSetProperty ( kAudioSessionProperty_OverrideCategoryEnableBluetoothInput, sizeof (allowBluetoothInput),
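The deprecated `AudioSessionSetProperty` call maps to a category option on `AVAudioSession` in later SDKs; `.allowBluetooth` is the modern spelling of `kAudioSessionProperty_OverrideCategoryEnableBluetoothInput`. A minimal Swift sketch:

```swift
import AVFoundation

// Enable Bluetooth HFP input/output via the category options instead of the
// removed AudioSessionSetProperty API.
func enableBluetoothInput() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playAndRecord,
                                mode: .default,
                                options: [.allowBluetooth])
        try session.setActive(true)
    } catch {
        print("Audio session setup failed: \(error)")
    }
}
```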