AVFoundation

AVPlayer Dynamic Volume control

*爱你&永不变心* submitted on 2019-12-03 20:53:25
How can I change the volume of the AVPlayer dynamically? I mean, I want to mute the volume every time a button is pressed. The given code seems to set the volume only once, when the mix is built. How do I change it at runtime?

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[self myAssetURL] options:nil];
    NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
    NSMutableArray *allAudioParams = [NSMutableArray array];
    for (AVAssetTrack *track in audioTracks) {
        AVMutableAudioMixInputParameters *audioInputParams =
            [AVMutableAudioMixInputParameters audioMixInputParameters];
        [audioInputParams setVolume:0
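For runtime changes, a simpler route than rebuilding the audio mix is the player's own playback properties. A minimal sketch, assuming iOS 7 or later (where AVPlayer gained settable volume and muted properties) and a player property wired to a button action:

    // Toggle mute each time the button is pressed; no audio mix needed.
    - (IBAction)muteTapped:(id)sender {
        self.player.muted = !self.player.muted;
        // Or adjust the level instead of a hard mute:
        // self.player.volume = 0.0f;
    }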

Play background music in app?

拟墨画扇 submitted on 2019-12-03 20:50:41
Question: I want music to start playing when the user opens the app. The user should be able to go to any view controller and return to the initial one without the music stopping, and the track should loop indefinitely. I tried starting playback in the viewDidLoad method of the initial view controller. What happens is that when the user leaves the initial view controller and comes back, the music starts playing again, overlapping the original copy. To remedy this, I put an
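A common fix is to own the player in a singleton rather than in any view controller, so viewDidLoad can safely call into it any number of times. A minimal sketch under assumed names ("theme.mp3" is illustrative):

    #import <AVFoundation/AVFoundation.h>

    @interface MusicManager : NSObject
    @property (nonatomic, strong) AVAudioPlayer *player;
    + (instancetype)sharedManager;
    - (void)startBackgroundMusic;
    @end

    @implementation MusicManager
    + (instancetype)sharedManager {
        static MusicManager *shared;
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{ shared = [[MusicManager alloc] init]; });
        return shared;
    }
    - (void)startBackgroundMusic {
        if (self.player.isPlaying) return;  // already running: don't start an overlapping copy
        NSURL *url = [[NSBundle mainBundle] URLForResource:@"theme" withExtension:@"mp3"];
        self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:NULL];
        self.player.numberOfLoops = -1;     // -1 loops indefinitely
        [self.player play];
    }
    @end

Calling [[MusicManager sharedManager] startBackgroundMusic] from viewDidLoad is then idempotent: returning to the initial view controller no longer spawns a second player.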

iPhone trim audio recording

萝らか妹 submitted on 2019-12-03 20:48:32
I have a voice memo component in my app, and I want to allow the user to trim the audio, similar to how QuickTime X on Mac OS X 10.6 handles it, or like the Voice Memos app on the iPhone. Here's an example of both: Any help is appreciated.

I am not a UI programmer by any means. This was a test I wrote to see how to write custom controls. This code may or may not work; I have not touched it in some time.

header

    @interface SUIMaxSlider : UIControl {
    @private
        float_t minimumValue;
        float_t maximumValue;
        float_t value;
        CGPoint trackPoint;
    }
    @property (nonatomic, assign) float_t minimumValue,
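For the trimming itself (as opposed to the slider UI above), one common approach is exporting a time range with AVAssetExportSession. A minimal sketch, with URLs and times assumed supplied by the caller:

    // Trim [startSeconds, endSeconds] out of the recording at inputURL.
    // The preset and output type assume an M4A result; adjust as needed.
    - (void)trimAudioAtURL:(NSURL *)inputURL
                     toURL:(NSURL *)outputURL
                      from:(Float64)startSeconds
                        to:(Float64)endSeconds {
        AVAsset *asset = [AVAsset assetWithURL:inputURL];
        AVAssetExportSession *exporter =
            [[AVAssetExportSession alloc] initWithAsset:asset
                                             presetName:AVAssetExportPresetAppleM4A];
        exporter.outputURL = outputURL;
        exporter.outputFileType = AVFileTypeAppleM4A;
        CMTime start = CMTimeMakeWithSeconds(startSeconds, 600);
        CMTime duration = CMTimeMakeWithSeconds(endSeconds - startSeconds, 600);
        exporter.timeRange = CMTimeRangeMake(start, duration);
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            // Inspect exporter.status and exporter.error here.
        }];
    }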

Creating a AVAsset with a HTTP NSURL

ε祈祈猫儿з submitted on 2019-12-03 20:40:28
I'm trying to merge two NSURLs that contain video references. One of the URLs points to a video on AWS and the other points to a video that is stored locally. My exporting code works, because I've tried it with two local videos, but whenever I try to merge the HTTP URL and the local URL I get this error:

    Error Domain=NSURLErrorDomain Code=-1100 "The requested URL was not found on this server."
    UserInfo=0x155d2f20 {NSUnderlyingError=0x155b4f60 "The operation couldn’t be completed. No such file or directory",
    NSLocalizedDescription=The requested URL was not found on this server.}

This is the code to
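The code is cut off above, but this error pattern ("No such file or directory" beneath an NSURLErrorDomain failure) usually means the remote address was built as a file URL. A hedged sketch of the distinction, with the remote address purely illustrative:

    // A remote asset needs URLWithString:, not fileURLWithPath:.
    NSURL *remoteURL = [NSURL URLWithString:@"https://bucket.s3.amazonaws.com/clip.mp4"];
    NSURL *localURL  = [NSURL fileURLWithPath:localPath];  // localPath: a path on disk, assumed defined
    AVURLAsset *remoteAsset = [AVURLAsset URLAssetWithURL:remoteURL options:nil];
    AVURLAsset *localAsset  = [AVURLAsset URLAssetWithURL:localURL options:nil];

Even with a well-formed HTTP URL, AVAssetExportSession generally cannot export a composition that references a streamed remote asset, so downloading the remote file to a temporary location before merging is the usual workaround.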

iPhone 4 AVFoundation : Capture from front and rear cameras simultaneously

别说谁变了你拦得住时间么 submitted on 2019-12-03 20:05:23
Question: I was wondering whether it is possible to capture from both cameras simultaneously using the AVFoundation framework. Specifically, my question is whether both the front and rear AVCaptureDevices can be active at the same time. Currently I know that an AVCaptureSession instance can support only one input (and output). I create two AVCaptureSessions, attach the front camera device to one and the rear to the other, and then point the outputs of the sessions to different SampleBufferDelegate functions. What I see
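For reference, a sketch of the two-session setup described above (queue name illustrative; self is assumed to adopt AVCaptureVideoDataOutputSampleBufferDelegate). On iPhone 4-era hardware only one session can typically run at a time; true simultaneous capture only arrived much later, with AVCaptureMultiCamSession in iOS 13:

    - (AVCaptureSession *)sessionForPosition:(AVCaptureDevicePosition)position {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        AVCaptureDevice *camera = nil;
        for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
            if (device.position == position) { camera = device; break; }
        }
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:camera error:NULL];
        if ([session canAddInput:input]) [session addInput:input];

        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        [output setSampleBufferDelegate:self
                                  queue:dispatch_queue_create("camera.frames", NULL)];
        if ([session canAddOutput:output]) [session addOutput:output];
        return session;
    }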

Save sampleBuffer in array (AVFoundation)

感情迁移 submitted on 2019-12-03 19:40:15
Question: I am trying to save the sample buffer to an array instead of a UIImage, to convert it later on. This is to speed up image capturing and perhaps avoid memory warnings. I just can't figure out how to save it to the array and then use it again to call [self imageFromSampleBuffer:sampleBuffer]. I tried something like this, but how do I convert the data back to a CMSampleBufferRef object?

    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
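One known approach, sketched here with self.buffers as an assumed NSMutableArray property: retain each buffer before storing it, because the capture system reuses buffers once the delegate callback returns. Be aware the camera draws from a small fixed pool, so holding many buffers can stall capture.

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        CFRetain(sampleBuffer);                              // keep the buffer alive past the callback
        [self.buffers addObject:(__bridge id)sampleBuffer];  // CF objects can live in an NSArray
    }

    // Later: drain the array, balancing the earlier CFRetain for each buffer.
    - (UIImage *)dequeueOldestImage {
        CMSampleBufferRef stored = (__bridge CMSampleBufferRef)[self.buffers firstObject];
        UIImage *image = [self imageFromSampleBuffer:stored];
        [self.buffers removeObjectAtIndex:0];  // the array releases its reference
        CFRelease(stored);                     // balances the CFRetain above
        return image;
    }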

CALayer.contents not rendering correctly in AVMutableComposition

旧城冷巷雨未停 submitted on 2019-12-03 19:28:44
Question: I have a very simple method that generates a video with a static background image which covers the entire video composition, and a smaller, partially transparent image (watermark style) located at the bottom of the video. The background image renders correctly and appears exactly as it looks in an image viewer. However, the image that's supposed to be rendered at the bottom of the video is skewed/distorted. The source can be downloaded here, on GitHub. The expected output of my
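The method itself is cut off, but distortion like this often comes from a watermark layer whose frame has a different aspect ratio than the image, combined with CALayer's default contentsGravity (kCAGravityResize), which stretches contents to fill. A hedged sketch of a non-distorting setup, with renderSize, watermarkImage, and videoComposition assumed defined:

    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer  = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, renderSize.width, renderSize.height);
    videoLayer.frame  = parentLayer.frame;
    [parentLayer addSublayer:videoLayer];

    CALayer *watermarkLayer = [CALayer layer];
    watermarkLayer.contents = (__bridge id)watermarkImage.CGImage;
    // Match the frame to the image's own aspect ratio, and fit rather than stretch.
    watermarkLayer.frame = CGRectMake(0, 0, watermarkImage.size.width, watermarkImage.size.height);
    watermarkLayer.contentsGravity = kCAGravityResizeAspect;
    [parentLayer addSublayer:watermarkLayer];

    videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool
        videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                                inLayer:parentLayer];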

Combining two .caf files on iPhone

限于喜欢 submitted on 2019-12-03 18:43:56
Question: I've looked and looked for an answer, but can't seem to find one. Lots have asked, but none have gotten answers. I have an app that records audio using AVAudioRecorder. Now I just want to merge two or more recordings into one file that can be sent out via email. Does anyone have any clue as to how this can be done? (This answer suggests using something called Audio Queue Services, but I don't know anything about that.)

Answer 1: It's not quite as easy as you would think. I used the AVFoundation
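The answer is cut off, but since it names AVFoundation, here is a minimal sketch of the usual composition-based route: append each recording to a single audio track, then export. The URLs are assumed defined; exporting the .caf recordings to M4A is one common choice for emailing.

    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *track =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTime cursor = kCMTimeZero;
    for (NSURL *url in @[firstRecordingURL, secondRecordingURL]) {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
        AVAssetTrack *source = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
        [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                       ofTrack:source
                        atTime:cursor
                         error:NULL];
        cursor = CMTimeAdd(cursor, asset.duration);  // append end-to-end
    }
    AVAssetExportSession *exporter =
        [[AVAssetExportSession alloc] initWithAsset:composition
                                         presetName:AVAssetExportPresetAppleM4A];
    exporter.outputURL = mergedFileURL;              // assumed defined
    exporter.outputFileType = AVFileTypeAppleM4A;
    [exporter exportAsynchronouslyWithCompletionHandler:^{ /* check exporter.status */ }];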

How to convert AudioBufferList to CMSampleBuffer?

醉酒当歌 submitted on 2019-12-03 18:05:25
Question: I have an AudioTapProcessor attached to an AVPlayerItem, which will call

    static void tap_ProcessCallback(MTAudioProcessingTapRef tap,
                                    CMItemCount numberFrames,
                                    MTAudioProcessingTapFlags flags,
                                    AudioBufferList *bufferListInOut,
                                    CMItemCount *numberFramesOut,
                                    MTAudioProcessingTapFlags *flagsOut)

when processing. I need to convert the AudioBufferList to a CMSampleBuffer so I can use AVAssetWriterAudioInput.appendSampleBuffer to write it into a movie file. So how do I convert an AudioBufferList to
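One known approach, sketched under assumptions: create an empty CMSampleBuffer and attach the tap's buffer list to it. The format description and presentation time are assumed captured elsewhere (the tap's prepare callback is a natural place to build the CMAudioFormatDescriptionRef), and the per-frame duration below assumes 44.1 kHz audio.

    CMSampleBufferRef sampleBuffer = NULL;
    CMSampleTimingInfo timing = { CMTimeMake(1, 44100), pts, kCMTimeInvalid };  // pts: CMTime, assumed
    OSStatus err = CMSampleBufferCreate(kCFAllocatorDefault,
                                        NULL,            // no data buffer yet
                                        false,           // data attached below
                                        NULL, NULL,
                                        format,          // CMAudioFormatDescriptionRef, assumed
                                        numberFrames,
                                        1, &timing,
                                        0, NULL,
                                        &sampleBuffer);
    if (err == noErr) {
        err = CMSampleBufferSetDataBufferFromAudioBufferList(sampleBuffer,
                                                             kCFAllocatorDefault,
                                                             kCFAllocatorDefault,
                                                             0,
                                                             bufferListInOut);
    }
    // On success, the buffer can be passed to -[AVAssetWriterInput appendSampleBuffer:].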

Photo capture permission problems in iOS 11

偶尔善良 submitted on 2019-12-03 17:57:58
Question: So here's my problem. I am trying to create a screen with a UIImageView and a UIButton. When the user presses the button, the camera app opens, you take a photo, and if you press "Use Photo" in the camera app, you are returned to my app's screen and the photo is placed in the UIImageView I mentioned previously. What happens so far is that when I press the "Use Photo" button, the image is correctly placed in my UIImageView, but then the app crashes with the following error: This app
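The error text is truncated above, but it matches the usual iOS 11 privacy crash ("This app has crashed because it attempted to access privacy-sensitive data without a usage description..."). A hedged guess at the fix: declare the relevant usage strings in the app's Info.plist, for example:

    <key>NSCameraUsageDescription</key>
    <string>Used to take the photo shown on this screen.</string>
    <key>NSPhotoLibraryAddUsageDescription</key>
    <string>Used if the captured photo is saved to your library.</string>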