AVFoundation

AVCaptureDevice videoZoomFactor always Out of Range

Submitted by 别说谁变了你拦得住时间么 on 2020-01-02 10:05:34
Question: I'm trying to set the zoom level of a camera with this code: AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; if ([videoDevice lockForConfiguration:nil]) { float newzoom=1.3; videoDevice.videoZoomFactor = newzoom; [videoDevice unlockForConfiguration]; } This code doesn't work in iOS 7 (it works in iOS 9); it always raises an exception: Terminating app due to uncaught exception 'NSRangeException', reason: 'videoZoomFactor out of range' I can't find
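A common cause of this NSRangeException is requesting a factor above what the device's active format supports; on some older hardware and formats the maximum is as low as 1.0. A minimal Swift sketch of clamping the request first (the function name is illustrative, not from the question):

```swift
import AVFoundation

// Sketch: clamp the requested zoom to the active format's supported range
// before assigning, so videoZoomFactor never throws NSRangeException.
func setZoom(_ requested: CGFloat, on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        let maxZoom = device.activeFormat.videoMaxZoomFactor
        // Valid factors run from 1.0 up to videoMaxZoomFactor.
        device.videoZoomFactor = max(1.0, min(requested, maxZoom))
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```

Checking `videoMaxZoomFactor` at runtime also explains the iOS 7 vs iOS 9 difference: the same request can be in range on one device/format and out of range on another.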

iPhone App - Show AVFoundation video on landscape mode

Submitted by 折月煮酒 on 2020-01-02 08:21:46
Question: I am using the AVCam example app from Apple. This example uses AVFoundation to show video on a view. I am trying to turn AVCam into a landscape app, with no luck. When the screen orientation changes, the video is shown rotated on the view. Is there a way to handle this problem? Answer 1: When you create your preview layer: captureVideoPreviewLayer.orientation = UIInterfaceOrientationLandscapeLeft; And the methods to manage rotations: -(void)willAnimateRotationToInterfaceOrientation:
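The layer-level `orientation` property in that answer was later deprecated in favor of setting `videoOrientation` on the preview layer's connection. A hedged Swift sketch of the same idea (the function name and the assumption that you hold a preview layer reference are illustrative):

```swift
import AVFoundation
import UIKit

// Sketch: keep the preview layer's video orientation in sync with the
// current interface orientation via its AVCaptureConnection.
func updatePreviewOrientation(_ previewLayer: AVCaptureVideoPreviewLayer,
                              for orientation: UIInterfaceOrientation) {
    guard let connection = previewLayer.connection,
          connection.isVideoOrientationSupported else { return }
    switch orientation {
    case .landscapeLeft:
        connection.videoOrientation = .landscapeLeft
    case .landscapeRight:
        connection.videoOrientation = .landscapeRight
    case .portraitUpsideDown:
        connection.videoOrientation = .portraitUpsideDown
    default:
        connection.videoOrientation = .portrait
    }
}
```

Call this from the rotation callback (e.g. `viewWillTransition(to:with:)` in modern code, or `willAnimateRotationToInterfaceOrientation:` as in the answer).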

how to add video on another video as overlay

Submitted by 让人想犯罪 __ on 2020-01-02 07:41:26
Question: I am stuck on overlaying one video onto another. I can successfully add a video on top of another video, but I am not able to make the scale the same for both videos; I took reference from this. Note: I have to add the overlay video with transparency so the video below is visible. Source: https://stackoverflow.com/questions/39825323/how-to-add-video-on-another-video-as-overlay
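One way to get matching scale is to transform the overlay track to the composition's render size with layer instructions. A sketch under the assumption that both tracks are already inserted into an AVMutableComposition and that the overlay's codec carries alpha (required for the transparency the question mentions):

```swift
import AVFoundation

// Sketch: scale an overlay track to the same render size as the base
// video using AVMutableVideoComposition layer instructions.
func makeVideoComposition(base: AVMutableCompositionTrack,
                          overlay: AVMutableCompositionTrack,
                          renderSize: CGSize,
                          duration: CMTime) -> AVMutableVideoComposition {
    let overlayInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: overlay)
    let size = overlay.naturalSize
    // Scale the overlay so it covers the same area as the base video.
    let scale = CGAffineTransform(scaleX: renderSize.width / size.width,
                                  y: renderSize.height / size.height)
    overlayInstruction.setTransform(scale, at: .zero)

    let baseInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: base)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: duration)
    // Order matters: instructions are composited back to front,
    // so the overlay must precede the base layer here.
    instruction.layerInstructions = [overlayInstruction, baseInstruction]

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = renderSize
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
    videoComposition.instructions = [instruction]
    return videoComposition
}
```

The resulting composition is then assigned to an AVAssetExportSession's `videoComposition` before export.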

Output Video Size Huge Using HEVC Encoder on iOS

Submitted by 大憨熊 on 2020-01-02 07:30:12
Question: I have a project that currently uses the H.264 encoder to record video on iOS. I wanted to try the new HEVC encoder in iOS 11 to reduce file sizes, but have found that using the HEVC encoder causes file sizes to balloon enormously. Here's a project on GitHub that shows the issue: it simultaneously writes frames from the camera to files using the H.264 and H.265 (HEVC) encoders, and the resulting file sizes are printed to the console. The AVFoundation classes are set up like this: class
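A frequent culprit in cases like this is leaving the encoder's bitrate unconstrained, so it targets a very high quality by default. A hedged sketch of writer-input settings that pin an average bitrate (the 3 Mbps figure is an illustrative assumption, not from the question):

```swift
import AVFoundation

// Sketch: HEVC output settings for AVAssetWriterInput with an explicit
// average bitrate, so file size stays comparable to the H.264 path.
func makeHEVCInput(width: Int, height: Int) -> AVAssetWriterInput {
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.hevc, // iOS 11+
        AVVideoWidthKey: width,
        AVVideoHeightKey: height,
        AVVideoCompressionPropertiesKey: [
            AVVideoAverageBitRateKey: 3_000_000 // ~3 Mbps; tune per use case
        ]
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    input.expectsMediaDataInRealTime = true
    return input
}
```

Comparing sizes with and without the compression-properties dictionary is a quick way to confirm whether default quality targeting is what is ballooning the files.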

How to convert Data of Int16 audio samples to array of float audio samples

Submitted by 烈酒焚心 on 2020-01-02 06:34:15
Question: I'm currently working with audio samples. I get them from AVAssetReader and have a CMSampleBuffer with something like this: guard let sampleBuffer = readerOutput.copyNextSampleBuffer() else { guard reader.status == .completed else { return nil } // Completed // samples is an array of Int16 let samples = sampleData.withUnsafeBytes { Array(UnsafeBufferPointer<Int16>( start: $0, count: sampleData.count / MemoryLayout<Int16>.size)) } // The only way I found to convert [Int16] -> [Float]... return
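The conversion step the question is reaching for can be done portably by reinterpreting the raw bytes and normalizing. A sketch (on Apple platforms, Accelerate's `vDSP_vflt16` plus a scalar multiply does the same thing faster):

```swift
import Foundation

// Sketch of the [Int16] -> [Float] step: reinterpret the Data's bytes
// as native-endian Int16 samples, then normalize to roughly [-1, 1].
func floatSamples(from sampleData: Data) -> [Float] {
    let int16Samples: [Int16] = sampleData.withUnsafeBytes { raw in
        Array(raw.bindMemory(to: Int16.self))
    }
    // Dividing by Int16.max maps full-scale samples to about +/-1.0.
    return int16Samples.map { Float($0) / Float(Int16.max) }
}
```

For example, bytes encoding `[0, 16384, 32767]` come back as approximately `[0.0, 0.5, 1.0]`.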

AVPlayer making extraneous http request prior to playback of HLS / AES-encrypted video

Submitted by 一个人想着一个人 on 2020-01-02 06:12:10
Question: We're using AVPlayer on iOS 8.4 to play HLS, AES-encrypted video. Our .m3u8 files include the URL of the license server, e.g.: EXT-X-KEY:METHOD=AES-128,URI="https://...." In our iOS application, we're using the AVAssetResourceLoaderDelegate method resourceLoader:shouldWaitForLoadingOfRequestedResource: to intercept the request that gets sent by the AVPlayer (or some object within AVFoundation) to the license server. Within that method, we add a token (required by the license server) to the
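Note that AVFoundation only routes a resource request through this delegate when the URL uses a scheme it cannot handle itself; a plain https key URI can be fetched directly by the system first, which is one explanation for an extraneous request. A hedged sketch of the interception pattern, assuming the playlist's key URI uses a custom scheme that the delegate rewrites, and with a placeholder token:

```swift
import AVFoundation

// Sketch: intercept the HLS key request, append the license server's
// required token, fetch the key manually, and hand the bytes back.
class KeyLoaderDelegate: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let url = loadingRequest.request.url,
              var components = URLComponents(url: url, resolvingAgainstBaseURL: false) else {
            return false
        }
        components.scheme = "https" // restore the real scheme
        components.queryItems = (components.queryItems ?? [])
            + [URLQueryItem(name: "token", value: "PLACEHOLDER")]
        guard let keyURL = components.url else { return false }

        URLSession.shared.dataTask(with: keyURL) { data, _, error in
            if let data = data {
                loadingRequest.dataRequest?.respond(with: data)
                loadingRequest.finishLoading()
            } else {
                loadingRequest.finishLoading(with: error)
            }
        }.resume()
        return true // we will satisfy the request asynchronously
    }
}
```

The delegate is attached via `asset.resourceLoader.setDelegate(_:queue:)` on a background queue before the player item is created.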

iOS - get programmatically queue of items currently playing

Submitted by 我与影子孤独终老i on 2020-01-02 01:11:09
Question: I want to programmatically get the queue currently playing in the native Music app. I can use MPMusicPlayerController to get the currently playing item, but I want to get not only the item but the whole playing queue. Is it possible to do this using AVFoundation or any other library? Answer 1: I'm afraid this is not possible. Apple does not give us access to this information from any libraries. Answer 2: I'm pretty sure this is not possible through any public API. The Ecoute app that @sooper mentions must be using private

Record Streaming Audio from currently playing video in AVPlayer

Submitted by 别来无恙 on 2020-01-01 18:56:04
Question: There are many similar questions, but none exactly the same. I currently have my code set up to play video via an AVPlayer. What I want to do is somehow extract the audio of the streaming video and eventually merge it with the mic input (using AVAudioMixer?), as in a karaoke app, so when the user plays the recording it will play back only the audio from the video plus the recording from the mic. I think AVAudioEngine is the way to go, but I cannot for the life of me work it out. So my question is how do I
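One achievable half of this is taping the microphone with AVAudioEngine while the AVPlayer plays. A sketch under those assumptions (capturing the player's own streamed audio is the hard part and typically needs an MTAudioProcessingTap on the player item's audio mix, which is omitted here):

```swift
import AVFoundation

// Sketch: record mic input to a file with AVAudioEngine while an
// AVPlayer is playing, for later mixing with the video's audio.
func startMicRecording(to url: URL, engine: AVAudioEngine) throws {
    let session = AVAudioSession.sharedInstance()
    // playAndRecord lets the player keep playing while we capture.
    try session.setCategory(.playAndRecord, options: [.defaultToSpeaker])
    try session.setActive(true)

    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    let file = try AVAudioFile(forWriting: url, settings: format.settings)

    // Tap delivers mic buffers on a background thread; write them out.
    input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        try? file.write(from: buffer)
    }
    try engine.start()
}
```

After recording, the mic file and the video's audio track can be combined with an AVMutableComposition for playback or export.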

Allow users background music in swift 2.0

Submitted by 喜你入骨 on 2020-01-01 09:04:03
Question: I am looking for some code to allow the user to play music from their phone while still using my app. Previously, before Swift 2.0, I would put this in the app delegate and it would work perfectly: AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient, error: nil) AVAudioSession.sharedInstance().setActive(true, error: nil) Does anyone know how to implement this in Swift 2.0? Answer 1: The following would be the syntax for Swift 2, calling setCategory and setActive on AVAudioSession:
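The answer is cut off above; in Swift 2 these methods became throwing, so the error-pointer arguments were replaced by do/try/catch. A sketch of that era's syntax:

```swift
import AVFoundation

// Swift 2: setCategory and setActive throw instead of taking an
// NSError pointer, so wrap the calls in do/try/catch.
do {
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient)
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print("Audio session error: \(error)")
}
```

The ambient category is what allows the user's own music to keep playing underneath the app's audio.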