AVFoundation

Can't get pixel data via CGDataProviderCopyData using AVCaptureVideoDataOutput in Swift 2

Submitted by 旧巷老猫 on 2019-12-08 12:41:33
I'm working on updating this for Swift 2.0 and I currently get "fatal error: unexpectedly found nil while unwrapping an Optional value" on the line:

let data = CGDataProviderCopyData(CGImageGetDataProvider(imageRef)) as! NSData

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    print("Capture output running")
    let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    CVPixelBufferLockBaseAddress(imageBuffer!, 0)
    let baseAddress = CVPixelBufferGetBaseAddressOfPlane(imageBuffer!, 0)
    let
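One likely cause of the nil here is that the capture buffer is not CGImage-backed, so the CGImage/CGDataProvider round-trip fails. A minimal sketch (in current Swift, not Swift 2 syntax) of reading the raw bytes straight from the pixel buffer instead:

```swift
import AVFoundation
import CoreVideo

// Sketch: copy raw pixel bytes directly from the sample buffer, skipping
// the CGImageGetDataProvider path that returned nil in the question.
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    CVPixelBufferLockBaseAddress(imageBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(imageBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(imageBuffer) else { return }
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
    let height = CVPixelBufferGetHeight(imageBuffer)
    let data = Data(bytes: base, count: bytesPerRow * height)
    // ... process `data` (note rows may be padded to bytesPerRow) ...
}
```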

change sound pitch (not in realtime)

Submitted by 独自空忆成欢 on 2019-12-08 12:35:38
Question: I have always had this question in my mind, but wherever I asked, I could never get an answer or a suggestion that would be helpful: how can I pitch a sound (not in real time)? I'm using the AVFoundation framework to play my sounds like so:

AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:TempRecFile error:nil];
player.volume = 1;
[player play];

How can I set the pitch or the frequency of my sound without having to use some other framework like OpenAL? Although, if you know a place
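For reference, a later AVFoundation addition, AVAudioEngine with AVAudioUnitTimePitch (iOS 8+, so newer than this question), can pitch-shift without OpenAL. A minimal sketch, where `url` stands in for the question's TempRecFile:

```swift
import AVFoundation

// Sketch: play a file through a pitch-shift effect (assumes iOS 8+).
// In real code, keep `engine` and `player` alive beyond this scope,
// or playback stops when they are deallocated.
func playPitched(url: URL, cents: Float) throws {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let pitch = AVAudioUnitTimePitch()
    pitch.pitch = cents                       // +1200 cents = one octave up

    engine.attach(player)
    engine.attach(pitch)
    let file = try AVAudioFile(forReading: url)
    engine.connect(player, to: pitch, format: file.processingFormat)
    engine.connect(pitch, to: engine.mainMixerNode, format: file.processingFormat)

    try engine.start()
    player.scheduleFile(file, at: nil)
    player.play()
}
```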

iOS - Code only works when I insert a breakpoint

Submitted by 孤人 on 2019-12-08 12:31:01
Question: Update: I'm starting to figure this out. The AVAudioRecorder session gets activated and I get mic level readings for a few seconds. Then the async video code completes, the camera view displays, and I stop getting readings. It seems like the video is killing the audio session. What's strange is that the code works on iOS 7 and won't work on iOS 6. Any ideas how to get around this? Is it a limitation of iOS 6? I'm getting sound levels via the mic and I can only get them when I place a
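One direction worth trying (a sketch only, in current Swift API names rather than the iOS 6-era ones): use a mixable play-and-record session category and re-activate the session after the camera starts, so the video capture setup doesn't tear down metering. `recorder` below is assumed to be an AVAudioRecorder configured elsewhere:

```swift
import AVFoundation

// Sketch: keep mic metering alive alongside the camera.
let session = AVAudioSession.sharedInstance()
try? session.setCategory(.playAndRecord, options: [.mixWithOthers])
try? session.setActive(true)               // re-activate after video setup completes

recorder.isMeteringEnabled = true          // `recorder`: an AVAudioRecorder set up elsewhere
recorder.updateMeters()
let level = recorder.averagePower(forChannel: 0)   // dBFS, 0 = full scale
```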

iPhone YUV channel orientation

Submitted by 我怕爱的太早我们不能终老 on 2019-12-08 09:50:59
Question: I am grabbing the YUV channel from the iPhone in the kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange format (YUV, bi-planar). I intend to process the Y channel, so I grab it using:

CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
int bufferHeight = CVPixelBufferGetHeight(pixelBuffer);
int bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
uint8_t *y_channel = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);

The
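On the orientation point in the title: the capture buffer arrives in the sensor's landscape orientation regardless of UI orientation, and each plane has its own row stride that may exceed the width. A Swift sketch of indexing the Y plane correctly, with `x` and `y` as landscape coordinates:

```swift
import CoreVideo

// Sketch: read one luma value from plane 0 of a bi-planar 420 buffer.
// (x, y) are landscape-oriented coordinates; rotate/transpose them yourself
// if your UI is portrait.
func luma(in pixelBuffer: CVPixelBuffer, x: Int, y: Int) -> UInt8 {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let base = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0)!
        .assumingMemoryBound(to: UInt8.self)
    // Use the plane's stride, not its width: rows are often padded.
    let stride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    return base[y * stride + x]
}
```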

Core Data Error - NSDate localizedCaseInsensitiveCompare: unrecognized selector sent to instance

Submitted by 人盡茶涼 on 2019-12-08 09:07:34
Question: I have searched for the last several hours but have yet to find an answer. I implemented an AVFoundation camera; I'm saving the image data to disk and storing only the path in Core Data. Everything works fine, but after a random number of taken photos I get this error:

CoreData: error: Serious application error. Exception was caught during Core Data change processing. This is usually a bug within an observer of NSManagedObjectContextObjectsDidChangeNotification. -[__NSDate
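This exception typically means a string-only comparison selector is being applied to a Date attribute, for example in a sort descriptor or a fetched-results-controller section key. A sketch of the mismatch and the fix, where `createdAt` and `fetchRequest` are hypothetical names:

```swift
import Foundation

// Sketch: localizedCaseInsensitiveCompare(_:) exists on NSString, not NSDate,
// so this descriptor crashes once Core Data sorts a Date attribute with it.
let bad = NSSortDescriptor(key: "createdAt",
                           ascending: false,
                           selector: #selector(NSString.localizedCaseInsensitiveCompare(_:)))

// For a Date attribute, plain comparison is correct:
let good = NSSortDescriptor(key: "createdAt", ascending: false)
// fetchRequest.sortDescriptors = [good]   // `fetchRequest` built elsewhere
```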

AVAssetExportSession not working in iOS 5

Submitted by 混江龙づ霸主 on 2019-12-08 08:43:21
Question: In my application I am combining two audio files using AVAssetExportSession, and it works fine in earlier iOS versions. But on an iOS 5 device it's not working. What I am getting is this error:

AVAssetExportSessionStatusFailed: Error Domain=AVFoundationErrorDomain Code=-11820 "Cannot Complete Export" UserInfo=0x1df1c0 {NSLocalizedRecoverySuggestion=Try exporting again., NSLocalizedDescription=Cannot Complete Export}

The code that I use for exporting is given below. Did anyone experience the same issue?
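A defensive sketch (current Swift, not the iOS 5-era Objective-C) of two checks that commonly surface this kind of export failure: verifying the preset is compatible with the composition on this OS, and making sure the output file does not already exist. `composition` and `outputURL` are assumed to be built elsewhere:

```swift
import AVFoundation

// Sketch: validate before exporting, and log the underlying error on failure.
func exportAudio(composition: AVComposition, to outputURL: URL) {
    // An existing file at outputURL is a classic cause of export failures.
    try? FileManager.default.removeItem(at: outputURL)

    let presets = AVAssetExportSession.exportPresets(compatibleWith: composition)
    guard presets.contains(AVAssetExportPresetAppleM4A),
          let export = AVAssetExportSession(asset: composition,
                                            presetName: AVAssetExportPresetAppleM4A)
    else { return }

    export.outputFileType = .m4a
    export.outputURL = outputURL
    export.exportAsynchronously {
        if export.status == .failed {
            print("export failed: \(String(describing: export.error))")
        }
    }
}
```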

Detect current Keyframe interval in AVAsset

Submitted by 筅森魡賤 on 2019-12-08 08:41:24
Question: I am working on an application that plays back video and allows the user to scrub forwards and backwards in the video. The scrubbing has to happen smoothly, so we always re-write the video with SDAVAssetExportSession with the video compression property AVVideoMaxKeyFrameIntervalKey:@1, so that each frame will be a keyframe and allow smooth reverse scrubbing. This works great and provides smooth playback. The application uses video from a variety of sources and can be recorded on Android or iOS
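One way to detect the existing keyframe interval (and skip the re-encode when it is already 1) is to read the compressed samples with AVAssetReader and check the "not sync" attachment; frames without it are keyframes. A sketch:

```swift
import AVFoundation

// Sketch: average frames-per-keyframe by reading pass-through samples
// (outputSettings: nil keeps them compressed, so sync flags are meaningful).
func keyframeInterval(of asset: AVAsset) throws -> Int {
    guard let track = asset.tracks(withMediaType: .video).first else { return 0 }
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
    reader.add(output)
    reader.startReading()

    var frames = 0, keyframes = 0
    while let sample = output.copyNextSampleBuffer() {
        frames += 1
        let attachments = CMSampleBufferGetSampleAttachmentsArray(sample,
                              createIfNecessary: false) as? [[CFString: Any]]
        // A missing/false NotSync attachment marks a keyframe (sync sample).
        let notSync = attachments?.first?[kCMSampleAttachmentKey_NotSync] as? Bool ?? false
        if !notSync { keyframes += 1 }
    }
    return keyframes > 0 ? frames / keyframes : frames
}
```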

Panning a mono signal with MultiChannelMixer & MTAudioProcessingTap

Submitted by 血红的双手。 on 2019-12-08 07:44:08
Question: I'm looking to pan a mono signal using MTAudioProcessingTap and a multichannel mixer audio unit, but am getting a mono output instead of a panned, stereo output. The documentation states: "The Multichannel Mixer unit (subtype kAudioUnitSubType_MultiChannelMixer) takes any number of mono or stereo streams and combines them into a single stereo output." So the mono output was unexpected. Any way around this? I ran a stereo signal through the exact same code and everything worked great: stereo
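One thing worth checking: the multichannel mixer exposes a per-input pan parameter, which is the intended way to position a mono input in the stereo field. A sketch, where `mixerUnit` is assumed to be an already-initialized AudioUnit of subtype kAudioUnitSubType_MultiChannelMixer:

```swift
import AudioToolbox

// Sketch: pan input bus 0 of a multichannel mixer.
// Value range: -1.0 = hard left, 0.0 = center, 1.0 = hard right.
func pan(_ mixerUnit: AudioUnit, bus: AudioUnitElement, to position: Float) {
    let status = AudioUnitSetParameter(mixerUnit,
                                       kMultiChannelMixerParam_Pan,
                                       kAudioUnitScope_Input,
                                       bus,
                                       position,
                                       0)          // apply immediately
    assert(status == noErr, "failed to set pan")
}
```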

What's the best way to composite frame-based animated stickers over recorded video?

Submitted by ╄→гoц情女王★ on 2019-12-08 06:45:22
Question: We want to allow the user to place animated "stickers" over video that they record in the app and are considering different ways to composite these stickers. Create a video in code from the frame-based animated stickers (which can be rotated and have translations applied to them) using AVAssetWriter. The problem is that AVAssetWriter only writes to a file and doesn't keep transparency. This would prevent us from being able to overlay it over the video using AVMutableComposition. Create .mov
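A third option that sidesteps the transparency problem: composite the stickers at export time as animated CALayers via AVVideoCompositionCoreAnimationTool, so alpha is never flattened into an intermediate file. A sketch, with `renderSize` as a hypothetical output dimension:

```swift
import AVFoundation
import QuartzCore

// Sketch: layer-based compositing at export time keeps sticker transparency.
let renderSize = CGSize(width: 1920, height: 1080)   // illustrative

let videoLayer = CALayer()
let stickerLayer = CALayer()   // drive with CAKeyframeAnimation for frame-based stickers
let parentLayer = CALayer()
parentLayer.frame = CGRect(origin: .zero, size: renderSize)
videoLayer.frame = parentLayer.frame
stickerLayer.frame = parentLayer.frame
parentLayer.addSublayer(videoLayer)     // video is rendered into this layer
parentLayer.addSublayer(stickerLayer)   // stickers composite above it, alpha intact

let videoComposition = AVMutableVideoComposition()
videoComposition.renderSize = renderSize
videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
    postProcessingAsVideoLayer: videoLayer, in: parentLayer)
// Attach `videoComposition` to the AVAssetExportSession along with the composition.
```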

Objective-C How To Add Overlay To Video Using AVFoundation?

Submitted by 蹲街弑〆低调 on 2019-12-08 06:42:28
I've been trying to add an overlay to a video for days now, and I just can't figure out why, when I save, the video gets rotated ninety degrees to the left. I've been trying to fix it, but this is as close as I have gotten. At the moment the session preset is AVCaptureSessionPreset1920x1080. When I rotate the video it shrinks, and I have to translate it to the center. I can't manage to get the video to be full screen after I rotate it. Please, someone help, I really really need it. I'll do anything!

AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:self.video options:nil];
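A common cause of this rotation is that the capture track stores its orientation in preferredTransform, which a composition ignores unless applied explicitly. A sketch (in Swift rather than the question's Objective-C) of carrying the transform through and sizing the render to match, so no manual rotate-and-translate math is needed:

```swift
import AVFoundation

// Sketch: apply the source track's recorded orientation in the composition.
func layerInstruction(for videoAsset: AVAsset)
        -> (AVMutableVideoCompositionLayerInstruction, CGSize)? {
    guard let track = videoAsset.tracks(withMediaType: .video).first else { return nil }

    let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    instruction.setTransform(track.preferredTransform, at: .zero)

    // Size the output to the transformed dimensions (portrait capture swaps
    // width/height); applying() can yield negative components, hence abs().
    let transformed = track.naturalSize.applying(track.preferredTransform)
    let renderSize = CGSize(width: abs(transformed.width),
                            height: abs(transformed.height))
    return (instruction, renderSize)
}
```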