avfoundation

AVAssetExportSession export fails non-deterministically with error: “Operation Stopped, NSLocalizedFailureReason=The video could not be composed.”

安稳与你 submitted on 2019-11-28 23:18:22
We add subtitles to a video recorded by the user, but the export by our AVAssetExportSession object fails non-deterministically: sometimes it works, and sometimes it doesn't. It's unclear even how to reproduce the error. We noticed the asset tracks seem to get lost during export: before exporting there are two tracks (one for audio, one for video) as expected, but checking the number of tracks for the same file URL in exportDidFinish shows 0 tracks. So something seems wrong with the export process. Update: commenting out exporter.videoComposition = mutableComposition fixes the error, but of
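A common trigger for "The video could not be composed." is a video composition whose instructions do not exactly cover the composition's duration, or whose frame duration/render size is unset. A minimal sketch, assuming `composition` and `videoTrack` already exist (the names here are illustrative, not from the question):

```swift
import AVFoundation

// Sketch: build a videoComposition whose single instruction spans the whole
// composition, with no gaps or overlaps. Gaps are one known cause of the
// "could not be composed" export failure.
func makeVideoComposition(for composition: AVMutableComposition,
                          videoTrack: AVMutableCompositionTrack) -> AVMutableVideoComposition {
    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = videoTrack.naturalSize
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)  // 30 fps, an assumption

    let instruction = AVMutableVideoCompositionInstruction()
    // Must cover the exact duration of the composition.
    instruction.timeRange = CMTimeRange(start: .zero, duration: composition.duration)

    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
    instruction.layerInstructions = [layerInstruction]
    videoComposition.instructions = [instruction]
    return videoComposition
}
```

Checking that each instruction's timeRange tiles the full timeline is a cheap first diagnostic before assigning `exporter.videoComposition`.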

How to stop a video in AVPlayer?

你离开我真会死。 submitted on 2019-11-28 23:11:38
I am using this code to play a video file with AVPlayer; how do I stop it?

[videoView setHidden:NO];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *path2 = [documentsDirectory stringByAppendingPathComponent:saveFileName];
NSURL *url1 = [[NSURL alloc] initFileURLWithPath:path2];
videoPlayer = [AVPlayer playerWithURL:url1];
self.avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:videoPlayer];
//[self willAnimateRotationToInterfaceOrientation];
avPlayerLayer.frame = videoView
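AVPlayer has no stop() method; the usual pattern is to pause, rewind, and optionally release the current item. A minimal sketch, reusing the `videoPlayer` and `avPlayerLayer` names from the question:

```swift
import AVFoundation

// "Stop" an AVPlayer: pause, rewind, and tear down the item and layer.
func stopPlayback(_ videoPlayer: AVPlayer, layer avPlayerLayer: AVPlayerLayer) {
    videoPlayer.pause()                        // halt playback
    videoPlayer.seek(to: .zero)                // rewind so a later play() starts over
    videoPlayer.replaceCurrentItem(with: nil)  // fully release the item, if desired
    avPlayerLayer.removeFromSuperlayer()       // remove the video layer from the view
}
```

If you only need pause/resume, calling pause() alone is enough; the seek and item replacement are for a true stop-and-reset.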

iOS 5.0 crash when reading data from an AVAssetReaderOutput

断了今生、忘了曾经 submitted on 2019-11-28 22:09:27
I have this snippet of code used to read data from an AVAssetReaderOutput. The method works fine on iOS 4.0, but on 5.0 it crashes toward the end with a bad access error, and I'm not sure why. Does anyone have any input?

AVAssetReaderOutput *output = [myOutputs objectAtIndex:0];
int totalBuff = 0;
while (TRUE) {
    CMSampleBufferRef ref = [output copyNextSampleBuffer];
    if (ref == NULL) break;
    //copy data to file
    //read next one
    AudioBufferList audioBufferList;
    NSMutableData *data = [[NSMutableData alloc] init];
    CMBlockBufferRef blockBuffer;
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(ref, NULL, &audioBufferList
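One plausible cause, hedged: `copyNextSampleBuffer` and `CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer` both return +1 retained objects, so an Obj-C loop that never calls CFRelease on `ref` and `blockBuffer` can exhaust memory and die with EXC_BAD_ACCESS late in the loop. A Swift sketch of the same read loop, where ARC performs those releases automatically:

```swift
import AVFoundation

// Drain an AVAssetReaderOutput into a Data buffer. In Obj-C, CFRelease(ref)
// and CFRelease(blockBuffer) must be called each iteration; Swift's ARC
// handles both automatically at the end of each loop pass.
func readAll(from output: AVAssetReaderOutput) -> Data {
    var data = Data()
    while let sampleBuffer = output.copyNextSampleBuffer() {
        var blockBuffer: CMBlockBuffer?
        var audioBufferList = AudioBufferList()
        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
            sampleBuffer,
            bufferListSizeNeededOut: nil,
            bufferListOut: &audioBufferList,
            bufferListSize: MemoryLayout<AudioBufferList>.size,
            blockBufferAllocator: nil,
            blockBufferMemoryAllocator: nil,
            flags: 0,
            blockBufferOut: &blockBuffer)
        let buffers = UnsafeMutableAudioBufferListPointer(&audioBufferList)
        for buffer in buffers {
            guard let mData = buffer.mData else { continue }
            data.append(mData.assumingMemoryBound(to: UInt8.self),
                        count: Int(buffer.mDataByteSize))
        }
        // sampleBuffer and blockBuffer are released here by ARC.
    }
    return data
}
```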

iOS 10.0 - 10.1: AVPlayerLayer doesn't show video after using AVVideoCompositionCoreAnimationTool, only audio

给你一囗甜甜゛ submitted on 2019-11-28 21:52:09
Here is a complete project if you care to run this yourself: https://www.dropbox.com/s/5p384mogjzflvqk/AVPlayerLayerSoundOnlyBug_iOS10.zip?dl=0 This is a new problem on iOS 10, and it has been fixed as of iOS 10.2. After exporting a video using AVAssetExportSession and AVVideoCompositionCoreAnimationTool to composite a layer on top of the video during export, videos played in an AVPlayerLayer show no picture, only audio. This doesn't seem to be caused by hitting the AV encode/decode pipeline limit, because it often happens after a single export, which as far as I know only spins up 2 pipelines: 1 for the
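For context, a sketch of the export setup the question describes: compositing a CALayer over the video with AVVideoCompositionCoreAnimationTool. The function and layer names here are illustrative assumptions, not taken from the linked project:

```swift
import AVFoundation
import QuartzCore

// Attach an overlay layer to an export's video composition. The video frames
// render into videoLayer; overlay is composited on top during export only.
func addOverlay(to videoComposition: AVMutableVideoComposition, overlay: CALayer) {
    let frame = CGRect(origin: .zero, size: videoComposition.renderSize)
    let videoLayer = CALayer()
    let parentLayer = CALayer()
    videoLayer.frame = frame
    parentLayer.frame = frame
    overlay.frame = frame
    parentLayer.addSublayer(videoLayer)   // video frames are drawn here
    parentLayer.addSublayer(overlay)      // composited on top of the video
    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)
}
```

On the affected iOS versions (10.0-10.1), the workaround was OS-specific; on 10.2+ this standard setup plays back correctly in AVPlayerLayer per the question's own update.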

Making video from UIImage array with different transition animations

谁说胖子不能爱 submitted on 2019-11-28 21:34:48
I am following this code to create a video from a UIImage array. While transitioning from one image to another, there is no animation. I want to add photo transition effects like these: TransitionFlipFromTop, TransitionFlipFromBottom, TransitionFlipFromLeft, TransitionFlipFromRight, TransitionCurlUp, TransitionCurlDown, TransitionCrossDissolve, FadeIn, FadeOut. These animations can be done via UIView.transition() & UIView.animate(), but how do I apply these transition animations while making
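UIView animations cannot be recorded into a video directly; with AVFoundation, a transition such as a cross-dissolve is expressed as an opacity ramp over two overlapping tracks in a video composition. A hedged sketch, where `trackA`/`trackB` and the overlap range are assumptions:

```swift
import AVFoundation

// Cross-dissolve between two composition tracks over `overlap`:
// trackA fades out while trackB fades in.
func crossDissolveInstruction(trackA: AVCompositionTrack,
                              trackB: AVCompositionTrack,
                              overlap: CMTimeRange) -> AVMutableVideoCompositionInstruction {
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = overlap

    let fadeOut = AVMutableVideoCompositionLayerInstruction(assetTrack: trackA)
    fadeOut.setOpacityRamp(fromStartOpacity: 1.0, toEndOpacity: 0.0, timeRange: overlap)

    let fadeIn = AVMutableVideoCompositionLayerInstruction(assetTrack: trackB)
    fadeIn.setOpacityRamp(fromStartOpacity: 0.0, toEndOpacity: 1.0, timeRange: overlap)

    instruction.layerInstructions = [fadeOut, fadeIn]  // frontmost track first
    return instruction
}
```

Flips and curls have no direct AVFoundation equivalent; those would need per-frame transforms (setTransformRamp) or rendering frames yourself with Core Image/Core Animation before writing them.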

Using AVMutableAudioMix to adjust volumes for tracks within asset

本小妞迷上赌 submitted on 2019-11-28 20:57:13
I'm applying an AVMutableAudioMix to an asset I've created; the asset generally consists of 3-5 audio tracks (no video). The goal is to add several volume commands throughout the playback time, i.e. I'd like to set the volume to 0.1 at 1 second, 0.5 at 2 seconds, then 0.1 or whatever at 3 seconds. I'm currently trying to do this with an AVPlayer, but will also later use it when exporting the AVSession to a file. The problem is that it only seems to honor the first volume command, and ignores all later volume commands. If the first command is to set the volume to 0.1, that will be the
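A sketch of the volume schedule described above, assuming `track` is one audio track of the asset. All commands for a given track must live on a single AVMutableAudioMixInputParameters object; creating a fresh parameters object per command would explain only the first one taking effect:

```swift
import AVFoundation

// One inputParameters object per track, carrying the whole volume schedule.
func makeAudioMix(for track: AVAssetTrack) -> AVMutableAudioMix {
    let params = AVMutableAudioMixInputParameters(track: track)
    params.setVolume(0.1, at: CMTime(seconds: 1, preferredTimescale: 600))
    params.setVolume(0.5, at: CMTime(seconds: 2, preferredTimescale: 600))
    params.setVolume(0.1, at: CMTime(seconds: 3, preferredTimescale: 600))

    let mix = AVMutableAudioMix()
    mix.inputParameters = [params]  // one entry per track being adjusted
    return mix
}
```

Assign the result to `playerItem.audioMix` for playback, or to `audioMix` on the export session when writing to a file; `setVolumeRamp(fromStartVolume:toEndVolume:timeRange:)` gives smooth fades instead of steps.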

How Do I Get Audio Controls on Lock Screen/Control Center from AVAudioPlayer in Swift

↘锁芯ラ submitted on 2019-11-28 20:36:08
New to iOS development, so here goes. I have an app that plays audio; I'm using AVAudioPlayer to load single files by name from the app's assets. I don't want to query the user's library, only the files provided. It works great, but I want the user to be able to pause and adjust volume from the lock screen.

func initAudioPlayer(file: String, type: String) {
    let path = NSBundle.mainBundle().pathForResource(file, ofType: type)!
    let url = NSURL(fileURLWithPath: path)
    let audioShouldPlay = audioPlaying()
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
        try
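Lock screen and Control Center controls come from MPRemoteCommandCenter plus MPNowPlayingInfoCenter, layered on top of the playback audio session the question already configures. A hedged sketch in current Swift (the question's code is older Swift 2 style); `audioPlayer` and `title` are assumed inputs:

```swift
import AVFoundation
import MediaPlayer

// Wire up lock-screen play/pause and the now-playing metadata display.
func enableRemoteControls(for audioPlayer: AVAudioPlayer, title: String) {
    let center = MPRemoteCommandCenter.shared()
    center.playCommand.addTarget { _ in audioPlayer.play(); return .success }
    center.pauseCommand.addTarget { _ in audioPlayer.pause(); return .success }

    // Metadata shown on the lock screen / Control Center.
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyPlaybackDuration: audioPlayer.duration,
        MPNowPlayingInfoPropertyElapsedPlaybackTime: audioPlayer.currentTime
    ]
}
```

The app also needs the "Audio, AirPlay, and Picture in Picture" background mode enabled in its capabilities, and the session category set to playback as in the question's code.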

Writing video + generated audio to AVAssetWriterInput, audio stuttering

元气小坏坏 submitted on 2019-11-28 19:58:46
I'm generating a video from a Unity app on iOS. I'm using iVidCap, which uses AVFoundation to do this. That side is all working fine. Essentially the video is rendered by using a texture render target and passing the frames to an Obj-C plugin. Now I need to add audio to the video. The audio is going to be sound effects that occur at specific times and maybe some background sound. The files being used are actually assets internal to the Unity app. I could potentially write these to phone storage and then generate an AVComposition, but my plan was to avoid this and composite the audio in
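For the audio side of such a writer session, the generated sound would be appended to an AVAssetWriterInput alongside the video input. A sketch of the input configuration only; the format values are assumptions, not from iVidCap:

```swift
import AVFoundation

// An AAC audio input for an AVAssetWriter, for appending generated
// CMSampleBuffers next to the video input.
func makeAudioInput() -> AVAssetWriterInput {
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 2,
        AVEncoderBitRateKey: 128_000
    ]
    let input = AVAssetWriterInput(mediaType: .audio, outputSettings: settings)
    input.expectsMediaDataInRealTime = false  // offline generation, not live capture
    return input
}
```

Stuttering typically points at the presentation timestamps: every generated buffer must carry a timestamp contiguous with the previous one and aligned with the video timeline, or the track plays back with gaps.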

Implementing long running tasks in background IOS

谁都会走 submitted on 2019-11-28 19:44:28
I have been working on an app in which the user can record video using AVFoundation and send it to the server. Videos have a maximum size of up to 15 MB; depending on the connection speed and type, it can take roughly 1 to 5 minutes to transfer a video to the server. I transfer the recorded video to the server on a background thread so that the user can continue doing other things in the app while the video is being uploaded. While reading the Apple docs on implementing long-running tasks in the background, I see that only a few kinds of apps are allowed to execute in the background, e.g. audio—The app
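Uploading is not one of the special background modes, but it does not need to be: a background URLSession hands the transfer to the system, which finishes it even after the app is suspended. A minimal sketch; the session identifier and server URL are placeholders:

```swift
import Foundation

// Hand the upload to the OS via a background URLSession so it survives
// the app being suspended or terminated.
func uploadVideo(at fileURL: URL, to serverURL: URL) {
    let config = URLSessionConfiguration.background(withIdentifier: "com.example.videoUpload")
    let session = URLSession(configuration: config)

    var request = URLRequest(url: serverURL)
    request.httpMethod = "POST"

    // Background sessions require a file-based upload task.
    let task = session.uploadTask(with: request, fromFile: fileURL)
    task.resume()
}
```

The alternative, `beginBackgroundTask(expirationHandler:)`, only buys a limited grace period after backgrounding, which may not cover a 5-minute upload; the background session approach has no such cap.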

AVCaptureVideoPreviewLayer: taking a snapshot

喜夏-厌秋 submitted on 2019-11-28 19:40:26
I'm trying to emulate the animation seen in the default camera app, where a snapshot of the camera's viewfinder is animated into the corner of the app's display. The AVCaptureVideoPreviewLayer object that holds the key to solving this problem isn't very open to these requirements: trying to create a copy of it in a new layer with - (id)initWithLayer:(id)layer returns an empty layer, without the image snapshot, so clearly there is some deeper magic going on here. Your clues/boos are most welcome. M. Facing the same woes, from a slightly different angle. Here are possible solutions that
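One hedged workaround: the preview layer itself cannot be copied, but adding an AVCaptureVideoDataOutput to the same session lets you keep the most recent frame as a UIImage and animate that snapshot instead. The class and method names below are illustrative:

```swift
import AVFoundation
import UIKit

// Keep the latest camera frame so it can be snapshotted and animated,
// sidestepping the uncopyable AVCaptureVideoPreviewLayer.
final class FrameKeeper: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private(set) var latestImage: UIImage?

    func attach(to session: AVCaptureSession, queue: DispatchQueue) {
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Convert the frame to a UIImage; this is the "snapshot" to animate.
        latestImage = UIImage(ciImage: CIImage(cvPixelBuffer: pixelBuffer))
    }
}
```

When the shutter fires, place `latestImage` in a UIImageView over the preview and animate that view into the corner, which is effectively what the stock camera app's animation looks like.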