avfoundation

AVAudioPlayer not playing audio in Swift

旧巷老猫 submitted on 2019-11-29 02:04:56
I have this code in a very simple, single-view Swift application, in my ViewController:

    var audioPlayer = AVAudioPlayer()

    @IBAction func playMyFile(sender: AnyObject) {
        let fileString = NSBundle.mainBundle().pathForResource("audioFile", ofType: "m4a")
        let url = NSURL(fileURLWithPath: fileString)
        var error: NSError?
        audioPlayer = AVAudioPlayer(contentsOfURL: url, error: &error)
        audioPlayer.delegate = self
        audioPlayer.prepareToPlay()
        if (audioPlayer.isEqual(nil)) {
            println("There was an error: (er)")
        } else {
            audioPlayer.play()
            NSLog("working")
        }
    }

I have added import AVFoundation and audioPlayer
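The snippet above uses the pre-Swift-2 NSError out-parameter API. A minimal sketch of the same flow in current Swift follows; it also keeps a strong reference to the player (a local player is deallocated before it can produce sound, a common cause of silent playback) and fixes the broken `(er)` interpolation. "audioFile.m4a" is the asker's resource name, not a standard one.

```swift
import UIKit
import AVFoundation

final class PlayerViewController: UIViewController, AVAudioPlayerDelegate {
    // Keep a strong reference; a player stored only in a local variable
    // is deallocated before playback starts.
    var audioPlayer: AVAudioPlayer?

    @IBAction func playMyFile(_ sender: Any) {
        guard let url = Bundle.main.url(forResource: "audioFile", withExtension: "m4a") else {
            print("audioFile.m4a not found in the bundle")
            return
        }
        do {
            let player = try AVAudioPlayer(contentsOf: url)  // throws instead of an NSError out-param
            player.delegate = self
            player.prepareToPlay()
            player.play()
            audioPlayer = player
        } catch {
            print("There was an error: \(error)")             // note the backslash in \(error)
        }
    }
}
```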

Getting mp3 artwork crashes on iOS 8 but works on iOS 7

孤者浪人 submitted on 2019-11-29 02:04:07
EDIT: The culprit was iOS 8, not the simulator (which I didn't realize was already running iOS 8). I've renamed the title to reflect this. I was happily using the code from this SO question to load album artwork from mp3 files. This was on my iPhone 5 with iOS 7.1. But then I traced crashing in the iOS simulator to this code. Further investigation revealed that this code also crashed on my iPad, after upgrading it to iOS 8. It appears the dictionary containing the image is corrupted. I created a dummy iOS project that only loads album art and got the same result. Below is
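A likely explanation (stated here as an assumption, since the excerpt cuts off before the code): the type of the artwork metadata value changed between iOS versions — older systems returned a dictionary containing a "data" key, newer ones return the image data directly, so force-casting to a dictionary crashes. A sketch that tolerates both shapes:

```swift
import AVFoundation
import UIKit

// Handle both shapes of the commonKeyArtwork value: raw Data (newer iOS)
// and a dictionary wrapping the data (older iOS).
func artwork(from asset: AVAsset) -> UIImage? {
    let items = AVMetadataItem.metadataItems(from: asset.commonMetadata,
                                             withKey: AVMetadataKey.commonKeyArtwork,
                                             keySpace: .common)
    for item in items {
        if let data = item.value as? Data {                 // newer shape
            return UIImage(data: data)
        }
        if let dict = item.value as? [String: Any],         // older dictionary shape
           let data = dict["data"] as? Data {
            return UIImage(data: data)
        }
    }
    return nil
}
```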

Video Recording using AVFoundation Framework iPhone?

跟風遠走 submitted on 2019-11-29 01:55:22
Question: I'm developing an application with the help of sample code from the WWDC 2010 AVCamDemo example. In the app I need to record a video from the front camera of the iPhone, but since the new iPhone 4 is not available where I live, I am not able to test the code properly. I would be really thankful if someone could give me a heads-up on whether I'm going in the right direction or not. The limited code I could test on my iPhone 3G (upgraded to iOS 4.1) crashes when I set the AVCaptureSession, as shown in
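For reference, a minimal sketch (in current Swift rather than the 2010-era Objective-C the asker is adapting) of selecting the front camera and wiring it into a capture session — note that on devices without a front camera, like the asker's iPhone 3G, the device lookup fails and setup must bail out gracefully rather than crash:

```swift
import AVFoundation

enum CameraError: Error { case noFrontCamera }

func makeFrontCameraSession() throws -> AVCaptureSession {
    let session = AVCaptureSession()
    session.sessionPreset = .medium

    // Devices without a front camera return nil here; don't force-unwrap.
    guard let front = AVCaptureDevice.default(.builtInWideAngleCamera,
                                              for: .video,
                                              position: .front) else {
        throw CameraError.noFrontCamera
    }
    let input = try AVCaptureDeviceInput(device: front)
    if session.canAddInput(input) { session.addInput(input) }

    let output = AVCaptureMovieFileOutput()
    if session.canAddOutput(output) { session.addOutput(output) }
    return session
}
```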

Photo capture permission problems in iOS 11

倾然丶 夕夏残阳落幕 submitted on 2019-11-29 01:23:17
So here's my problem. I am trying to create a screen in which there is a UIImageView and a UIButton. When the user presses the button, the camera app opens, you take a photo and if you press "Use Photo" in the Camera app, you are returned to my app's screen and the photo is placed in the UIImageView I mentioned previously. What happens so far is that when I press the "Use Photo" button, the image is correctly placed in my UIImageView but then the app crashes with the following error: This app has crashed because it attempted to access privacy-sensitive data without a usage description. The app
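That crash message means a privacy usage-description key is missing from Info.plist. Which key depends on what the crash log names; for camera capture it is NSCameraUsageDescription, and saving to the photo library on iOS 11 additionally needs NSPhotoLibraryAddUsageDescription. A sketch of the plist entries (the description strings are placeholders):

```xml
<!-- Info.plist -->
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to take a photo.</string>
<key>NSPhotoLibraryAddUsageDescription</key>
<string>Photos you take can be saved to your library.</string>
```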

AVAssetWriterInput H.264 Passthrough to QuickTime (.mov) - Passing in SPS/PPS to create avcC atom?

旧时模样 submitted on 2019-11-29 01:16:11
Question: I have a stream of H.264/AVC NALs consisting of types 1 (P frame), 5 (I frame), 7 (SPS), and 8 (PPS). I want to write them into a .mov file without re-encoding. I'm attempting to use AVAssetWriter to do this. The documentation for AVAssetWriterInput states: "Passing nil for outputSettings instructs the input to pass through appended samples, doing no processing before they are written to the output file. This is useful if, for example, you are appending buffers that are already in a desirable
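One approach (a sketch, not the asker's confirmed solution): build a CMVideoFormatDescription from the SPS and PPS NALs (without start codes) via CMVideoFormatDescriptionCreateFromH264ParameterSets, and pass it as the sourceFormatHint when creating the AVAssetWriterInput with nil outputSettings; the muxer then has the parameter sets it needs for the avcC atom.

```swift
import CoreMedia

// sps/pps are the raw parameter-set NAL payloads, start codes stripped.
func makeFormatDescription(sps: [UInt8], pps: [UInt8]) -> CMVideoFormatDescription? {
    var formatDesc: CMVideoFormatDescription?
    sps.withUnsafeBufferPointer { spsPtr in
        pps.withUnsafeBufferPointer { ppsPtr in
            let paramSets = [spsPtr.baseAddress!, ppsPtr.baseAddress!]
            let paramSizes = [sps.count, pps.count]
            CMVideoFormatDescriptionCreateFromH264ParameterSets(
                allocator: kCFAllocatorDefault,
                parameterSetCount: 2,
                parameterSetPointers: paramSets,
                parameterSetSizes: paramSizes,
                nalUnitHeaderLength: 4,          // 4-byte AVCC length prefixes
                formatDescriptionOut: &formatDesc)
        }
    }
    return formatDesc
}

// Usage sketch: AVAssetWriterInput(mediaType: .video, outputSettings: nil,
//                                  sourceFormatHint: formatDesc)
```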

AVAudioEngine playing multi channel audio

大兔子大兔子 submitted on 2019-11-29 00:47:09
Simple question: how do I play multichannel audio files (>2 channels) using AVAudioEngine so that I can hear all channels on the default 2-channel output (headphones/speaker)? The following code (stripped of error checking for presentation) plays the file's first two channels, but I can only hear it when headphones are plugged in.

    AVAudioFile *file = [[AVAudioFile alloc] initForReading:[[NSBundle mainBundle] URLForResource:@"nums6ch" withExtension:@"wav"] error:nil];
    AVAudioEngine *engine = [[AVAudioEngine alloc] init];
    AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
    AVAudioMixerNode *mixer
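One workable approach (a sketch in Swift, under the assumption that the goal is to fold every channel into the stereo mix rather than preserve spatial layout): read the file into a buffer, copy each pair of channels into its own stereo buffer, and play each pair on its own AVAudioPlayerNode feeding the stereo main mixer.

```swift
import AVFoundation

// Assumes `engine` has already been started. Splits an n-channel float file
// into stereo pairs and plays them all simultaneously through the main mixer.
func playAllChannels(of url: URL, engine: AVAudioEngine) throws {
    let file = try AVAudioFile(forReading: url)
    let frames = AVAudioFrameCount(file.length)
    guard let source = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: frames) else { return }
    try file.read(into: source)

    let channels = Int(file.processingFormat.channelCount)
    let stereo = AVAudioFormat(standardFormatWithSampleRate:
                                   file.processingFormat.sampleRate,
                               channels: 2)!

    for pair in stride(from: 0, to: channels, by: 2) {
        guard let buf = AVAudioPCMBuffer(pcmFormat: stereo,
                                         frameCapacity: frames) else { continue }
        buf.frameLength = source.frameLength
        for ch in 0..<2 where pair + ch < channels {
            // Copy one source channel into the stereo buffer's channel slot.
            memcpy(buf.floatChannelData![ch],
                   source.floatChannelData![pair + ch],
                   Int(source.frameLength) * MemoryLayout<Float>.size)
        }
        let player = AVAudioPlayerNode()
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: stereo)
        player.scheduleBuffer(buf, completionHandler: nil)
        player.play()
    }
}
```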

HTTP Live Streaming with AVPlayer in iOS 4.0?

大憨熊 submitted on 2019-11-29 00:38:18
Is it possible to use HTTP Live Streaming with AVPlayer on iOS 4.0? This was clearly a documented feature of 4.0. However, if I run Apple's StitchedStreamPlayer sample code on my 3GS running iOS 4.0.1, clicking "Load Movie" does not play the stream, but gives an error: 2011-06-21 13:14:49.428 StitchedStreamPlayer[680:307] The asset's tracks were not loaded due to an error: Cannot Open. MPMediaPlayer is able to play the same stream on the same device. However, I need a working solution with AVPlayer. Does anyone know how to get Apple's StitchedStreamPlayer code to work on 4.0? The Runtime
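A common remedy for the "tracks were not loaded" error with live streams (a sketch in modern Swift, not the sample's 2011-era code): hand the URL straight to AVPlayerItem instead of loading an AVURLAsset's tracks first — an HLS asset's tracks are not available up front, so waiting on them fails the way the log above shows. "playlist.m3u8" is a placeholder URL.

```swift
import AVFoundation

let url = URL(string: "https://example.com/playlist.m3u8")!
let item = AVPlayerItem(url: url)           // no up-front track loading
let player = AVPlayer(playerItem: item)

// Observe the item's status and start playback once it is ready.
let observation = item.observe(\.status) { item, _ in
    if item.status == .readyToPlay {
        player.play()
    }
}
```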

Change camera capture device while recording a video

两盒软妹~` submitted on 2019-11-29 00:37:51
I am developing an iPhone app in which there is a requirement for pausing and resuming the camera, so I used AVFoundation for that instead of UIImagePickerController. My code is:

    - (void) startup:(BOOL)isFrontCamera {
        if (_session == nil) {
            NSLog(@"Starting up server");
            self.isCapturing = NO;
            self.isPaused = NO;
            _currentFile = 0;
            _discont = NO;
            // create capture device with video input
            _session = [[AVCaptureSession alloc] init];
            AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
            if (isFrontCamera) {
                NSArray *videoDevices = [AVCaptureDevice
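For the title's actual question — switching cameras on a running session — a sketch of the usual pattern: swap the device input inside a beginConfiguration/commitConfiguration pair, so the change is applied atomically without tearing the session down. (Written in Swift for brevity; the asker's code is Objective-C.)

```swift
import AVFoundation

// Swap between front and back camera on a live session.
func switchCamera(on session: AVCaptureSession,
                  replacing current: AVCaptureDeviceInput) {
    let newPosition: AVCaptureDevice.Position =
        current.device.position == .back ? .front : .back
    guard let newDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                  for: .video,
                                                  position: newPosition),
          let newInput = try? AVCaptureDeviceInput(device: newDevice) else { return }

    session.beginConfiguration()
    session.removeInput(current)
    if session.canAddInput(newInput) {
        session.addInput(newInput)
    } else {
        session.addInput(current)   // roll back if the new input is refused
    }
    session.commitConfiguration()   // applies the swap atomically
}
```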

iOS: AVPlayer - getting a snapshot of the current frame of a video

久未见 submitted on 2019-11-28 23:50:21
I have spent the whole day and went through a lot of SO answers, Apple references, documentation, etc., but no success. I want a simple thing: I am playing a video using AVPlayer and I want to pause it and get the current frame as a UIImage. That's it. My video is an m3u8 file located on the internet; it plays normally in the AVPlayerLayer without any problems. What have I tried: AVAssetImageGenerator. It is not working; the method copyCGImageAtTime:actualTime:error: returns a null image ref. According to the answer here, AVAssetImageGenerator doesn't work for streaming videos. Taking
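An approach that does work for streamed (m3u8) content, sketched below: attach an AVPlayerItemVideoOutput to the player item up front, then pull the current pixel buffer when a frame is wanted — unlike AVAssetImageGenerator, it reads from the live rendering pipeline.

```swift
import AVFoundation
import CoreImage
import UIKit

// Create the output once and add it to the item before playback.
let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes:
    [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])

func attach(to item: AVPlayerItem) {
    item.add(videoOutput)
}

// Grab the frame being displayed right now (e.g. after pausing).
func snapshot(of item: AVPlayerItem) -> UIImage? {
    let time = item.currentTime()
    guard videoOutput.hasNewPixelBuffer(forItemTime: time),
          let buffer = videoOutput.copyPixelBuffer(forItemTime: time,
                                                   itemTimeForDisplay: nil)
    else { return nil }
    let ciImage = CIImage(cvPixelBuffer: buffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent)
    else { return nil }
    return UIImage(cgImage: cgImage)
}
```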

Get Camera Preview to AVCaptureVideoPreviewLayer

我只是一个虾纸丫 submitted on 2019-11-28 23:43:58
I was trying to get the camera input to show on a preview layer view (self.cameraPreviewView is tied to a UIView in IB). Here is my current code, which I put together from the AV Foundation Programming Guide, but the preview never shows:

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetHigh;
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        NSLog(@"Couldn't create video
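The pieces that most often go missing in this setup (sketched in Swift; the asker's code is Objective-C) are creating the preview layer from the session, giving it a non-zero frame, adding it as a sublayer of the host view, and actually starting the session:

```swift
import AVFoundation
import UIKit

// Assumes `session` already has its camera input added.
func showPreview(session: AVCaptureSession, in view: UIView) {
    let previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.frame = view.bounds              // a zero frame shows nothing
    previewLayer.videoGravity = .resizeAspectFill
    view.layer.addSublayer(previewLayer)
    session.startRunning()                        // the session must be started
}
```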