AVFoundation

iOS - Playback of recorded audio fails with OSStatus error -43 (file not found)

Posted by 泄露秘密 on 2019-11-30 15:14:40
I set up an AVAudioRecorder instance the following way when my view loads:

    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    audioSession.delegate = self;
    [audioSession setActive:YES error:nil];
    [audioSession setCategory:AVAudioSessionCategoryRecord error:nil];
    NSString *tempDir = NSTemporaryDirectory();
    NSString *soundFilePath = [tempDir stringByAppendingPathComponent:@"sound.m4a"];
    NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
    NSLog(@"%@", soundFileURL);
    NSDictionary *recordSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt
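For context, OSStatus -43 is fnfErr, the classic Core Audio "file not found". A minimal Swift sketch (the helper name and the play-and-record category are assumptions, not the asker's code) that rules out the two usual causes, a missing file and a record-only session category, before constructing the player:

    import AVFoundation

    // Hypothetical helper: verify the recording exists and the session
    // allows playback before handing the URL to AVAudioPlayer.
    func playRecording(at url: URL) throws -> AVAudioPlayer {
        let session = AVAudioSession.sharedInstance()
        // A record-only category can make playback fail; .playAndRecord
        // lets one session cover both directions.
        try session.setCategory(.playAndRecord, mode: .default, options: [])
        try session.setActive(true)

        // -43 usually means the file was never written to this path.
        guard FileManager.default.fileExists(atPath: url.path) else {
            throw CocoaError(.fileNoSuchFile)
        }
        let player = try AVAudioPlayer(contentsOf: url)
        player.prepareToPlay()
        return player
    }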

AVAudioPlayer and AirPlay - possible?

Posted by 折月煮酒 on 2019-11-30 15:09:24
I'm trying to ascertain whether it's possible to toggle AirPlay support using the AVAudioPlayer class. From what I have read: "AirPlay is a technology that lets your application stream audio to Apple TV and to third-party AirPlay speakers and receivers. AirPlay support is built in to the AV Foundation framework and the Core Audio family of frameworks. Any audio content you play using these frameworks is automatically made eligible for AirPlay distribution. Once the user chooses to play your audio using AirPlay, it is routed automatically by the system." [Ref] Based on this info, it should work
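As the quoted documentation says, eligibility is automatic, so there is no per-player AirPlay toggle on AVAudioPlayer itself; what an app adds is an output-route picker and a playback-capable session. A sketch assuming iOS 11+ and AVKit's AVRoutePickerView (the frame is arbitrary):

    import AVFoundation
    import AVKit
    import UIKit

    // AirPlay routing is system-managed: the app plays through an active
    // playback session and offers the user the system route picker.
    func addRoutePicker(to containerView: UIView) {
        let session = AVAudioSession.sharedInstance()
        try? session.setCategory(.playback, mode: .default, options: [])
        try? session.setActive(true)

        // AVRoutePickerView shows the same AirPlay menu Control Center uses.
        let picker = AVRoutePickerView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
        containerView.addSubview(picker)
    }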

Start playing sound in background on iOS?

Posted by 随声附和 on 2019-11-30 14:53:05
I'm trying to do something similar to Tile.app. When it shows a notification, it plays a sound. That seems simple enough: use UILocalNotification and include the sound file name. But local notifications limit sounds to no more than 30 seconds, and Tile's sound keeps playing for a whole lot longer than that. I expect it's looping, and it continues until I can't stand the noise any more. The sound also plays even if the phone's mute switch is on, which doesn't happen with local notifications. For these reasons, UILocalNotification appears to be out. I thought maybe I could post a text-only
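Behavior like that, looping past 30 seconds and ignoring the mute switch, points to the app playing its own audio from the background rather than a notification sound. A hedged sketch, assuming the app declares the "audio" entry under UIBackgroundModes in Info.plist (the function name is illustrative):

    import AVFoundation

    // Requires the "audio" background mode in Info.plist.
    // The .playback category ignores the ring/silent switch, which matches
    // the behavior described for Tile.app.
    func startLoopingAlert(soundURL: URL) throws -> AVAudioPlayer {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .default, options: [])
        try session.setActive(true)

        let player = try AVAudioPlayer(contentsOf: soundURL)
        player.numberOfLoops = -1   // loop until explicitly stopped
        player.play()
        return player
    }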

AVAudioRecorder Won't Record On Device

Posted by 折月煮酒 on 2019-11-30 14:51:20
This is my method:

    -(void)playOrRecord:(UIButton *)sender {
        if (playBool == YES) {
            NSError *error = nil;
            NSString *filePath = [[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%d", [sender tag]] ofType:@"caf"];
            NSURL *fileUrl = [NSURL fileURLWithPath:filePath];
            AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:fileUrl error:&error];
            [player setNumberOfLoops:0];
            [player play];
        } else if (playBool == NO) {
            if ([recorder isRecording]) {
                [recorder stop];
                [nowRecording setImage:[UIImage imageNamed:@"NormalNormal.png"] forState:UIControlStateNormal];
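A frequent cause of device-only recording failures is a missing audio-session setup or microphone permission; the simulator enforces neither. A sketch of those checks (the play-and-record category is an assumption, not taken from the asker's code):

    import AVFoundation

    // On hardware, recording silently fails unless the session category
    // permits input and the user has granted microphone access.
    func prepareForRecording(completion: @escaping (Bool) -> Void) {
        let session = AVAudioSession.sharedInstance()
        session.requestRecordPermission { granted in
            guard granted else { completion(false); return }
            do {
                try session.setCategory(.playAndRecord, mode: .default, options: [])
                try session.setActive(true)
                completion(true)
            } catch {
                completion(false)
            }
        }
    }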

How do I set the orientation for a frame-by-frame-generated video using AVFoundation?

Posted by 白昼怎懂夜的黑 on 2019-11-30 14:22:29
I am writing an iPhone app which takes video from the camera, runs it through some OpenGL shader code, and then writes the output to a video file using AVFoundation. The app runs in landscape orientation (either direction), so all recorded video should be landscape. The current code I use before starting recording, to get the video the right way round, is:

    [[self videoWriterInput] setTransform:CGAffineTransformScale(CGAffineTransformMakeRotation(M_PI), -1.0, 1.0)];

where videoWriterInput is an
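One way to reason about the transform is to derive it from the interface orientation rather than hard-coding a rotation. A sketch, assuming the sample buffers arrive in landscape-right (the back camera's native orientation) and no mirroring:

    import AVFoundation
    import UIKit

    // Map the UI orientation to a rotation for AVAssetWriterInput.transform.
    // Assumes source buffers are landscape-right, the back-camera default.
    func writerTransform(for orientation: UIInterfaceOrientation) -> CGAffineTransform {
        switch orientation {
        case .landscapeRight: return .identity
        case .landscapeLeft:  return CGAffineTransform(rotationAngle: .pi)
        case .portrait:       return CGAffineTransform(rotationAngle: .pi / 2)
        default:              return CGAffineTransform(rotationAngle: -.pi / 2)
        }
    }

    // Usage: assign before startWriting(), e.g.
    // videoWriterInput.transform = writerTransform(for: currentOrientation)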

Problem in writing metadata to image

Posted by 前提是你 on 2019-11-30 14:21:46
I am using AVFoundation to take a still image, adding GPS info to the metadata, and saving to a photo album using the Assets Library, but the GPS info is not being saved at all. Here is my code:

    [self.stillImageTaker captureStillImageAsynchronouslyFromConnection:videoConnection
        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer != NULL) {
                CFDictionaryRef exifAttachments = CMGetAttachment(imageDataSampleBuffer, kCGImagePropertyExifDictionary, NULL);
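For reference, GPS values have to be written as an ImageIO dictionary under kCGImagePropertyGPSDictionary, with unsigned coordinates and separate N/S and E/W reference keys; forgetting the refs is a common reason the data appears to vanish. A sketch of building that dictionary (the merge into the image metadata is omitted):

    import ImageIO
    import CoreLocation

    // Build the ImageIO GPS dictionary from a CLLocation.
    // Latitude/longitude are stored unsigned, with the sign carried
    // by the LatitudeRef/LongitudeRef entries.
    func gpsMetadata(for location: CLLocation) -> [CFString: Any] {
        let lat = location.coordinate.latitude
        let lon = location.coordinate.longitude
        return [
            kCGImagePropertyGPSLatitude: abs(lat),
            kCGImagePropertyGPSLatitudeRef: lat >= 0 ? "N" : "S",
            kCGImagePropertyGPSLongitude: abs(lon),
            kCGImagePropertyGPSLongitudeRef: lon >= 0 ? "E" : "W",
            kCGImagePropertyGPSAltitude: location.altitude,
        ]
    }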

AVFoundation Metadata Object Types

Posted by ╄→гoц情女王★ on 2019-11-30 14:19:15
I'm trying to use AVFoundation to read barcodes with the code below, but I keep getting the error below. Help as to why would be much appreciated. Thanks in advance!

    //Create camera view
    session = AVCaptureSession()
    var layer = self.cameraView.layer
    vidLayer = AVCaptureVideoPreviewLayer.layerWithSession(session) as AVCaptureVideoPreviewLayer
    vidLayer.frame = self.cameraView.bounds
    vidLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    var device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    var error:NSError? = nil
    var input:AVCaptureDeviceInput? =
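The usual cause of metadata-object-type errors is assigning metadataObjectTypes before the output has been added to the session; until then, availableMetadataObjectTypes is empty and barcode types are rejected. A modern-Swift sketch of the correct order (the type list is an example):

    import AVFoundation

    // metadataObjectTypes must be set only after the output joins the
    // session, otherwise the requested types are not yet available.
    func configureBarcodeScanning(session: AVCaptureSession,
                                  delegate: AVCaptureMetadataOutputObjectsDelegate) {
        let output = AVCaptureMetadataOutput()
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        output.setMetadataObjectsDelegate(delegate, queue: .main)
        output.metadataObjectTypes = [.qr, .ean13, .code128]
    }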

CGBitmapContextCreateImage error

Posted by 匆匆过客 on 2019-11-30 14:13:01
I am getting an error like this in my console:

    CGBitmapContextCreate: invalid data bytes/row: should be at least 1920 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.
    CGBitmapContextCreateImage: invalid context 0x0

I use the code below:

    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        // Get the number of bytes per
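The "invalid data bytes/row" complaint typically means the pixel buffer is not in the 32-bit BGRA layout the bitmap context is being told to expect. A sketch of forcing BGRA on the capture output and using the buffer's own bytesPerRow (function names are illustrative):

    import AVFoundation
    import CoreGraphics

    // Ask the capture output for BGRA so each pixel really is 4 bytes.
    func configureBGRA(on output: AVCaptureVideoDataOutput) {
        output.videoSettings = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ]
    }

    func cgImage(from pixelBuffer: CVPixelBuffer) -> CGImage? {
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
        // Always use the buffer's reported bytesPerRow; rows are often padded.
        let context = CGContext(data: base,
                                width: CVPixelBufferGetWidth(pixelBuffer),
                                height: CVPixelBufferGetHeight(pixelBuffer),
                                bitsPerComponent: 8,
                                bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                space: CGColorSpaceCreateDeviceRGB(),
                                bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                                    | CGBitmapInfo.byteOrder32Little.rawValue)
        return context?.makeImage()
    }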

AVCaptureSession with multiple Outputs?

Posted by 别来无恙 on 2019-11-30 13:42:56
I'm currently developing an iOS app that applies CoreImage to the camera feed in order to take photos and videos, and I've run into a bit of a snag. Up till now I've been using AVCaptureVideoDataOutput to obtain the sample buffers and manipulate them with CoreImage, displaying a simple preview as well as using it to capture photos and save them. When I tried to implement video recording by writing the sample buffers to a video as I received them from the AVCaptureVideoDataOutput, it
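A capture session can carry several outputs at once, so the data output feeding CoreImage can coexist with a dedicated photo output instead of stills depending on the buffer path. A sketch assuming iOS 10+ AVCapturePhotoOutput:

    import AVFoundation

    // One session, two outputs: sample buffers for CoreImage processing
    // plus a separate photo output for stills.
    func addOutputs(to session: AVCaptureSession,
                    bufferDelegate: AVCaptureVideoDataOutputSampleBufferDelegate) {
        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(bufferDelegate,
                                            queue: DispatchQueue(label: "video.buffers"))
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }

        let photoOutput = AVCapturePhotoOutput()
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
    }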

How can I specify the format of AVAudioEngine Mic-Input?

Posted by 橙三吉。 on 2019-11-30 13:10:43
I'd like to record some audio using AVAudioEngine and the user's microphone. I already have a working sample, but I just can't figure out how to specify the format of the output that I want... My requirement is that I get the AVAudioPCMBuffer as I speak, which it currently does... Would I need to add a separate node that does some transcoding? I can't find much documentation or samples on that problem... and I am also a noob when it comes to audio stuff. I know that I want NSData containing 16-bit PCM with a max sample rate of 16000 (8000 would be better). Here's my working sample: private
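One approach is to leave the input tap in the node's native format and convert each buffer with AVAudioConverter. A sketch, assuming a mono 16 kHz Int16 target (the helper name is made up):

    import AVFoundation

    // Taps cannot request an arbitrary hardware format, so keep the tap at
    // inputNode.outputFormat(forBus: 0) and convert each buffer afterwards.
    func makeConverter(from inputFormat: AVAudioFormat)
            -> (converter: AVAudioConverter, target: AVAudioFormat)? {
        guard let target = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                         sampleRate: 16_000,
                                         channels: 1,
                                         interleaved: true),
              let converter = AVAudioConverter(from: inputFormat, to: target)
        else { return nil }
        return (converter, target)
    }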