AVFoundation

Can AVFoundation be coerced into playing a local .ts file?

本小妞迷上赌 · submitted on 2019-12-03 06:08:13
Question: Clearly, AVFoundation (and QuickTime X) can demux and play properly encoded .ts containers, because .ts containers underlie HTTP Live Streaming. Short of setting up a local web service to serve the .m3u8 and associated .ts files, I'd really like to be able to either: convince AVURLAsset and/or URLAssetWithURL to accept a local-file .m3u8 URI as if it were an HTTP URI, or, better yet, use AVQueuePlayer to load and play a sequence of .ts files without jumping through the live-streaming hoops. […]
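A minimal sketch of the first idea, as a hedged guess rather than a confirmed workaround (the path below is hypothetical): point AVURLAsset at a file URL and see whether AVFoundation accepts it. In practice HLS playback generally expects an http(s) URL, which is why people fall back to embedding a tiny local web server.

```swift
import AVFoundation

// Hypothetical local playlist path; AVFoundation may still refuse a
// file:// .m3u8, so treat this as an experiment, not a guarantee.
let fileURL = URL(fileURLWithPath: "/path/to/playlist.m3u8")
let asset = AVURLAsset(url: fileURL)
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
player.play()
```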

AVFoundation, how to turn off the shutter sound when captureStillImageAsynchronouslyFromConnection?

喜欢而已 · submitted on 2019-12-03 06:06:58
Question: I am trying to capture an image during a live preview from the camera with AVFoundation's captureStillImageAsynchronouslyFromConnection:. So far the program works as expected. However, how can I mute the shutter sound?

Answer 1: I used this code once to locate the iOS default shutter sound (here is a list of system sound file names: https://github.com/TUNER88/iOSSystemSoundsLibrary): NSString *path = @"/System/Library/Audio/UISounds/photoShutter.caf"; NSString *docs = [NSSearchPathForDirectoriesInDomains […]
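A hedged alternative that sidesteps the sound entirely rather than muting it: grab a frame from AVCaptureVideoDataOutput, which does not trigger the system shutter sound. The class name and properties below are illustrative; the session and preview setup from the question are assumed to already exist.

```swift
import AVFoundation
import UIKit

// Illustrative sketch: flip `wantsFrame` when the user taps the shutter
// button, and the next delivered video frame is converted to a UIImage.
final class SilentFrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    var wantsFrame = false
    var onImage: ((UIImage) -> Void)?

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard wantsFrame,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        wantsFrame = false
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        if let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent) {
            DispatchQueue.main.async { self.onImage?(UIImage(cgImage: cgImage)) }
        }
    }
}
```

The trade-off is that a video frame is typically lower resolution than a true still capture.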

AVAssetWriter unknown error

随声附和 · submitted on 2019-12-03 05:56:12
I am trying to create a video from images using AVAssetWriter. The implemented code works fine most of the time, but at random moments there is a problem with the writer: AVAssetWriter *videoWriter; ... [videoWriter finishWriting]; NSLog(@"videoWriter error %@", videoWriter.error); The received error is: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x1f839cd0 {NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x1e59efb0 "The operation couldn't be completed. (OSStatus error -12633.)", NSLocalizedFailureReason=An unknown error occurred ([…]
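A sketch, assuming the question's videoWriter plus a writer input and a known last timestamp: -11800 with underlying -12633 often traces back to invalid or non-increasing presentation timestamps, and finishing asynchronously while checking `status` surfaces the real error instead of racing a synchronous finishWriting.

```swift
import AVFoundation

// Assumed context: `videoWriter`, `writerInput`, and `lastPTS` come from
// the caller's existing setup (names are from the question / illustrative).
func finish(_ videoWriter: AVAssetWriter,
            input writerInput: AVAssetWriterInput,
            lastPTS: CMTime) {
    writerInput.markAsFinished()
    videoWriter.endSession(atSourceTime: lastPTS)
    videoWriter.finishWriting {
        if videoWriter.status == .failed {
            // Inspect the error only after the writer reports failure.
            print("videoWriter error: \(String(describing: videoWriter.error))")
        }
    }
}
```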

Swift 3 sound play

瘦欲@ · submitted on 2019-12-03 05:54:48
Question: OK, I have looked into this and have tried many different ways to play a sound when a button is clicked. How would I play a sound when a button is clicked in Swift 3? I have my sound in a folder named Sounds and the file name is ClickSound.mp3.

Answer 1: Use the function below: //MARK:- PLAY SOUND func playSound() { let url = Bundle.main.url(forResource: "ClickSound", withExtension: "mp3")! do { player = try AVAudioPlayer(contentsOf: url) guard let player = player else { return } player.prepareToPlay() […]
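A self-contained completion of that answer as a sketch (the resource name is taken from the question; the force-unwrapped URL is replaced with a guard, and the player is held in a property so ARC does not deallocate it mid-playback):

```swift
import AVFoundation

final class SoundPlayer {
    // Retained property: a local AVAudioPlayer would be deallocated
    // before the sound finishes playing.
    private var player: AVAudioPlayer?

    //MARK:- PLAY SOUND
    func playSound() {
        guard let url = Bundle.main.url(forResource: "ClickSound",
                                        withExtension: "mp3") else { return }
        do {
            player = try AVAudioPlayer(contentsOf: url)
            player?.prepareToPlay()
            player?.play()
        } catch {
            print("playback failed: \(error)")
        }
    }
}
```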

Consecutive calls to startRecordingToOutputFileURL:

∥☆過路亽.° · submitted on 2019-12-03 05:53:16
Question: The Apple docs seem to indicate that while recording video to a file, the app can change the URL on the fly with no problem. But I'm seeing a problem. When I try this, the recording delegate gets called with an error: The operation couldn't be completed. (OSStatus error -12780.) The info dictionary is: { AVErrorRecordingSuccessfullyFinishedKey = 0; } (The garbled single quote in "couldn't" comes from logging [error localizedDescription].) Here's the code, which is basically a set of tweaks to the WWDC10 AVCam […]
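A hedged sketch of the usual fix: start the next recording only from the didFinishRecording delegate callback, since -12780 commonly shows up when startRecording is called while the previous file is still being finalized. Names other than the delegate method are illustrative.

```swift
import AVFoundation

final class Recorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    let movieOutput = AVCaptureMovieFileOutput()
    var nextURL: URL?   // queued switch-over target, set by the caller

    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        // Only now is it safe to begin writing the next file.
        if let url = nextURL {
            nextURL = nil
            movieOutput.startRecording(to: url, recordingDelegate: self)
        }
    }
}
```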

Set maximum frame rate with AVFoundation in iOS 5

柔情痞子 · submitted on 2019-12-03 05:51:23
Question: I believe this used to be done with captureOutput.minFrameDuration. However, that is deprecated in iOS 5. Instead I apparently need to use AVCaptureConnection's videoMinFrameDuration. So I have my input and my output, and I add them both to the capture session. Where can I get access to the capture connection? I think it is created for me by the session, but where? I could try adding the I/O using addInputWithNoConnections: and addOutputWithNoConnections: and then maybe creating the connection […]
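A sketch using the iOS 5-era API the question names: the session creates the connection when the output is added, and afterwards it is reachable through the output, so no manual connection building is needed. (These connection-level frame-duration properties were later deprecated in favor of AVCaptureDevice.activeVideoMinFrameDuration.)

```swift
import AVFoundation

let session = AVCaptureSession()
let output = AVCaptureVideoDataOutput()
if session.canAddOutput(output) {
    session.addOutput(output)   // the session creates the connection here
}
// The auto-created connection is reachable through the output.
if let connection = output.connection(with: .video),
   connection.isVideoMinFrameDurationSupported {
    // 1/15 s minimum frame duration caps capture at 15 fps.
    connection.videoMinFrameDuration = CMTimeMake(value: 1, timescale: 15)
}
```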

Play socket-streamed h.264 movie on iOS using AVFoundation

与世无争的帅哥 · submitted on 2019-12-03 05:39:12
I'm working on a small iPhone app which streams movie content over a network connection using regular sockets. The video is in H.264 format. However, I'm having difficulties playing/decoding the data. I've considered using FFmpeg, but the license makes it unsuitable for the project. I've been looking into Apple's AVFoundation framework (AVPlayer in particular), which seems able to handle H.264 content; however, I'm only able to find methods that initiate the movie from a URL, not from a memory buffer streamed off the network. I've been doing some tests to make this […]
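One hedged route, newer than the question (iOS 8+): wrap the incoming H.264 NAL units as CMSampleBuffers and hand them to AVSampleBufferDisplayLayer, which decodes and displays without any URL. Building the sample buffer from socket bytes (format description from SPS/PPS, block buffer from NAL data) is the part elided here.

```swift
import AVFoundation

// Display layer that consumes raw sample buffers instead of a URL asset.
let displayLayer = AVSampleBufferDisplayLayer()
displayLayer.videoGravity = .resizeAspect

// Called once a CMSampleBuffer has been assembled from the socket data.
func show(_ sampleBuffer: CMSampleBuffer) {
    if displayLayer.isReadyForMoreMediaData {
        displayLayer.enqueue(sampleBuffer)
    }
}
```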

AVAssetWriterInputPixelBufferAdaptor and CMTime

陌路散爱 · submitted on 2019-12-03 05:32:21
I'm writing some frames to video with AVAssetWriterInputPixelBufferAdaptor, and the behavior with respect to time isn't what I'd expect. If I write just one frame: [videoWriter startSessionAtSourceTime:kCMTimeZero]; [adaptor appendPixelBuffer:pxBuffer withPresentationTime:kCMTimeZero]; this gets me a video of length zero, which is what I expect. But if I go on to add a second frame: // 3000/600 = 5 sec, right? CMTime nextFrame = CMTimeMake(3000, 600); [adaptor appendPixelBuffer:pxBuffer withPresentationTime:nextFrame]; I get ten seconds of video where I'm expecting five. What's going on here? Does […]
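A sketch of one likely fix, assuming the question's writer objects (`videoWriter`, `writerInput`, `adaptor`, `pxBuffer`): pin the movie's end explicitly with endSession(atSourceTime:), so the writer does not pick its own duration for the final frame. CMTimeMake(3000, 600) is indeed 3000/600 = 5 seconds.

```swift
import AVFoundation

let fiveSeconds = CMTimeMake(value: 3000, timescale: 600) // 3000/600 = 5 s

videoWriter.startSession(atSourceTime: .zero)
_ = adaptor.append(pxBuffer, withPresentationTime: .zero)
_ = adaptor.append(pxBuffer, withPresentationTime: fiveSeconds)
writerInput.markAsFinished()
// Without this, the writer decides how long the last frame is shown.
videoWriter.endSession(atSourceTime: fiveSeconds)
videoWriter.finishWriting { }
```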

AVAudioPlayer stop a sound and play it from the beginning

China☆狼群 · submitted on 2019-12-03 05:29:07
I used AVAudioPlayer to play a 10-second wav file and it works fine. Now I want to stop the wav at the 4th second and then play it again from the very first second. Here is the code I tried: NSString *ahhhPath = [[NSBundle mainBundle] pathForResource:@"Ahhh" ofType:@"wav"]; AVAudioPlayer *ahhhhhSound = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:ahhhPath] error:NULL]; [ahhhhhSound stop]; [ahhhhhSound play]; What I get is: the wav stops at the 4th second, but when I run [XXX play] again, the wav continues from the 5th second instead of playing from the beginning. How could I get […]
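A sketch of the usual fix: keep one retained AVAudioPlayer and rewind it explicitly, since per the AVAudioPlayer documentation stop does not reset the playback position. Set currentTime back to 0 before calling play on the same instance.

```swift
import AVFoundation

// Retained property, not a local: the same instance must be reused
// for stop/rewind/play to behave as expected.
var ahhhhhSound: AVAudioPlayer?

func restartFromBeginning() {
    guard let player = ahhhhhSound else { return }
    player.stop()
    player.currentTime = 0   // rewind; stop alone leaves the position at 4 s
    player.play()
}
```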

Reading video frame-by-frame under iOS

只愿长相守 · submitted on 2019-12-03 05:21:27
Question: I'm looking for a way to retrieve the individual frames of a video using the iOS API. I tried using AVAssetImageGenerator, but it seems to only provide frames to the nearest second, which is a bit too coarse for my usage. From what I understand of the documentation, with a pipeline of AVAssetReader, AVAssetReaderOutput, and CMSampleBufferGetImageBuffer I should be able to do something, but I'm stuck with a CVImageBufferRef. With this I'm looking for a way to get a CGImageRef or a UIImage, but haven't found […]
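A sketch of the pipeline the question describes, plus the missing CVImageBufferRef-to-UIImage step via Core Image (error handling kept minimal):

```swift
import AVFoundation
import UIKit

// Decode every frame of the asset's first video track into UIImages.
func frames(of asset: AVAsset) throws -> [UIImage] {
    guard let track = asset.tracks(withMediaType: .video).first else { return [] }
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                         kCVPixelFormatType_32BGRA])
    reader.add(output)
    reader.startReading()

    var images: [UIImage] = []
    let context = CIContext()
    while let sample = output.copyNextSampleBuffer(),
          let pixelBuffer = CMSampleBufferGetImageBuffer(sample) {
        // CVImageBufferRef -> CIImage -> CGImage -> UIImage
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        if let cgImage = context.createCGImage(ciImage, from: ciImage.extent) {
            images.append(UIImage(cgImage: cgImage))
        }
    }
    return images
}
```

As a lighter-weight alternative, setting AVAssetImageGenerator's requestedTimeToleranceBefore and requestedTimeToleranceAfter to .zero also yields exact-time frames, at the cost of slower seeks.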