AVAsset

Objective-C - AVAssetReader and AVAssetWriter to overlay video

Question: I am trying to overlay a recorded video with some images using AVAssetReader and AVAssetWriter. Following this tutorial, I am able to copy a video (and audio) into a new file. Now my objective is to overlay some of the initial video frames with some images using this code:

while ([assetWriterVideoInput isReadyForMoreMediaData] && !completedOrFailed) {
    // Get the next video sample buffer, and append it to the output file.
    CMSampleBufferRef sampleBuffer = [assetReaderVideoOutput
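For reference, here is a minimal Swift sketch of the same reader/writer loop with an image drawn into each decoded frame. The asset, overlay image, and output URL are placeholders, and the 32BGRA reader settings and H.264 writer settings are assumptions rather than the asker's exact configuration:

```swift
import AVFoundation
import CoreGraphics

// Copy video frames with an AVAssetReader/AVAssetWriter pair and draw an overlay
// CGImage into each decoded pixel buffer before appending it to the writer.
func overlay(asset: AVAsset, overlayImage: CGImage, outputURL: URL,
             completion: @escaping (Error?) -> Void) throws {
    guard let track = asset.tracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)
    let readerOutput = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(readerOutput)

    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
    let writerInput = AVAssetWriterInput(
        mediaType: .video,
        outputSettings: [AVVideoCodecKey: AVVideoCodecType.h264,
                         AVVideoWidthKey: track.naturalSize.width,
                         AVVideoHeightKey: track.naturalSize.height])
    writer.add(writerInput)

    reader.startReading()
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    let queue = DispatchQueue(label: "overlay.writer")
    writerInput.requestMediaDataWhenReady(on: queue) {
        while writerInput.isReadyForMoreMediaData {
            guard let sampleBuffer = readerOutput.copyNextSampleBuffer() else {
                writerInput.markAsFinished()
                writer.finishWriting { completion(writer.error) }
                return
            }
            if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
                CVPixelBufferLockBaseAddress(pixelBuffer, [])
                // Draw the overlay directly into the decoded BGRA frame.
                if let context = CGContext(
                    data: CVPixelBufferGetBaseAddress(pixelBuffer),
                    width: CVPixelBufferGetWidth(pixelBuffer),
                    height: CVPixelBufferGetHeight(pixelBuffer),
                    bitsPerComponent: 8,
                    bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                    space: CGColorSpaceCreateDeviceRGB(),
                    bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                        | CGBitmapInfo.byteOrder32Little.rawValue) {
                    context.draw(overlayImage, in: CGRect(x: 0, y: 0,
                                                          width: overlayImage.width,
                                                          height: overlayImage.height))
                }
                CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
            }
            writerInput.append(sampleBuffer)
        }
    }
}
```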

AVAssetExportSession to run in background

Question: I am working on an application that needs to merge more than one video. I am using AVAssetExportSession to export the merged video, and I also display a progress bar for the export. It runs correctly most of the time. The issue occurs when the screen is locked or the application is put into background mode: if the export is in progress, it fails immediately after the application goes to the background. I have also tried to use a background task. Check the code below. bgTask = [
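A sketch of the background-task pattern the "bgTask" snippet refers to, in Swift; the exportSession name and completion handler are placeholders. Note that a background task only buys a limited amount of extra time, so a long export can still be interrupted when that time expires:

```swift
import UIKit
import AVFoundation

// Keep the app alive for a short while after backgrounding so the export can finish.
var bgTask: UIBackgroundTaskIdentifier = .invalid

func exportWithBackgroundTask(_ exportSession: AVAssetExportSession,
                              completion: @escaping (AVAssetExportSession.Status) -> Void) {
    bgTask = UIApplication.shared.beginBackgroundTask(withName: "VideoExport") {
        // Expiration handler: clean up if the system reclaims background time.
        exportSession.cancelExport()
        UIApplication.shared.endBackgroundTask(bgTask)
        bgTask = .invalid
    }
    exportSession.exportAsynchronously {
        completion(exportSession.status)
        UIApplication.shared.endBackgroundTask(bgTask)
        bgTask = .invalid
    }
}
```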

AVFoundation - Videos merge but only last video plays

Question: I have an array of [AVAsset](). Whenever I record different videos of different durations, the code below merges them all into one video, but it only plays the last video in a loop. For example: video1 is 1 minute and shows a dog walking, video2 is 1 minute and shows a bird flying, video3 is 1 minute and shows a horse running. The merged video plays for 3 minutes, but it only shows the horse running for 1 minute, three consecutive times. Where am I going wrong? var
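The usual cause of this symptom is inserting every clip at time zero instead of advancing the insertion time, so later clips overwrite earlier ones. A minimal Swift sketch of sequential merging with AVMutableComposition, assuming `assets` is the [AVAsset] array mentioned above:

```swift
import AVFoundation

// Append each clip at the current end of the composition rather than at .zero.
func merge(assets: [AVAsset]) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    guard let videoTrack = composition.addMutableTrack(
        withMediaType: .video,
        preferredTrackID: kCMPersistentTrackID_Invalid) else { return composition }

    var insertTime = CMTime.zero
    for asset in assets {
        guard let sourceTrack = asset.tracks(withMediaType: .video).first else { continue }
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        // Insert each clip at the running insertion time, then advance it.
        try videoTrack.insertTimeRange(range, of: sourceTrack, at: insertTime)
        insertTime = CMTimeAdd(insertTime, asset.duration)
    }
    return composition
}
```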

iOS rotate video AVAsset AVFoundation

Question: Example

Hi, I am struggling to rotate this video so it shows in the proper orientation and fills the entire screen. I am trying to use the AVAsset with a video composition but cannot get it to work correctly.

let videoAsset: AVAsset = AVAsset(URL: outputFileURL) as AVAsset
let clipVideoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo).first! as AVAssetTrack
let newHeight = CGFloat(clipVideoTrack.naturalSize.height/3*4)
let composition = AVMutableComposition()
composition.addMutableTrackWithMediaType
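For comparison, a sketch of the rotation approach in current Swift syntax (the question uses the older tracksWithMediaType/addMutableTrackWithMediaType spelling): rotate the track 90 degrees with a layer-instruction transform and render at the rotated size. The frame rate and the specific translate-then-rotate transform are assumptions for the common portrait case:

```swift
import AVFoundation
import UIKit

// Build a composition plus a video composition that rotates the clip 90° clockwise.
func portraitComposition(for asset: AVAsset) -> (AVMutableComposition, AVMutableVideoComposition)? {
    guard let sourceTrack = asset.tracks(withMediaType: .video).first else { return nil }

    let composition = AVMutableComposition()
    guard let compositionTrack = composition.addMutableTrack(
        withMediaType: .video,
        preferredTrackID: kCMPersistentTrackID_Invalid) else { return nil }
    try? compositionTrack.insertTimeRange(
        CMTimeRange(start: .zero, duration: asset.duration),
        of: sourceTrack, at: .zero)

    // Rotate 90° and translate so the rotated frame lands inside the render area.
    let size = sourceTrack.naturalSize
    let transform = CGAffineTransform(translationX: size.height, y: 0)
        .rotated(by: .pi / 2)

    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: compositionTrack)
    layerInstruction.setTransform(transform, at: .zero)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)
    instruction.layerInstructions = [layerInstruction]

    let videoComposition = AVMutableVideoComposition()
    videoComposition.instructions = [instruction]
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
    videoComposition.renderSize = CGSize(width: size.height, height: size.width)

    return (composition, videoComposition)
}
```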

Possible for AVAssetWriter to write files with transparency?

Question: Every file I write with AVAssetWriter has a black background if the images I include do not fill the entire render area. Is there any way to write with transparency? Here is the method I use to get the pixel buffer:

- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image {
    CGSize size = self.renderSize;
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
        [NSNumber numberWithBool:YES],
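A Swift sketch of the same pixelBufferFromCGImage: pattern, assuming a BGRA buffer sized to renderSize; whether alpha actually survives into the written file still depends on the codec chosen for the AVAssetWriterInput:

```swift
import AVFoundation
import CoreVideo

// Create a BGRA pixel buffer and draw a CGImage into it, clearing first so
// uncovered pixels stay fully transparent instead of holding stale data.
func pixelBuffer(from image: CGImage, size: CGSize) -> CVPixelBuffer? {
    let options: [CFString: Any] = [
        kCVPixelBufferCGImageCompatibilityKey: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true
    ]
    var buffer: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, Int(size.width), Int(size.height),
                        kCVPixelFormatType_32BGRA, options as CFDictionary, &buffer)
    guard let pixelBuffer = buffer else { return nil }

    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    let context = CGContext(
        data: CVPixelBufferGetBaseAddress(pixelBuffer),
        width: Int(size.width), height: Int(size.height),
        bitsPerComponent: 8,
        bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
            | CGBitmapInfo.byteOrder32Little.rawValue)
    // Clear to transparent, then draw the image into the buffer.
    context?.clear(CGRect(x: 0, y: 0, width: Int(size.width), height: Int(size.height)))
    context?.draw(image, in: CGRect(x: 0, y: 0, width: image.width, height: image.height))
    return pixelBuffer
}
```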

Any way to get the resource name of an existing AVAsset?

Question: I am interested in obtaining the resource name (aka filename) of an AVAsset, specifically an mp4 video. The asset was created with the resource name; I simply want to obtain that same string in the future when I have only the AVAsset instance. In this case, how can I obtain "my-video" from the asset?

AVAsset(URL: NSBundle.mainBundle().URLForResource("my-video", withExtension: "mp4")!)

I am able to obtain the asset's title, but this is not always the same as the resource name. And
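One small sketch: when the asset was created from a file URL it is an AVURLAsset, so the original resource name can be read back from its url property.

```swift
import AVFoundation

// Recover the resource name ("my-video" for .../my-video.mp4) from a URL-backed asset.
func resourceName(of asset: AVAsset) -> String? {
    guard let urlAsset = asset as? AVURLAsset else { return nil }
    return urlAsset.url.deletingPathExtension().lastPathComponent
}
```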

How to detect if a video is landscape/portrait when fetched from PHAsset?

Question: I am fetching videos from PHAsset so the user can choose a video and import it to edit. The user should only be able to select a video with landscape orientation; if the user selects a portrait video, they should get an alert saying that it is a portrait video and therefore cannot be imported for editing. One way to do this is to create an AVAsset from the URL of the PHFetchResult and then check its preferredTransform, but that would be a very costly operation, right? Is there a way to do this without creating
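A sketch of a cheap check that avoids loading the AVAsset at all: PHAsset exposes pixelWidth and pixelHeight, so comparing them gives the displayed orientation. (For edge cases such as square video or unusual rotation metadata, falling back to the track's preferredTransform remains the authoritative check.)

```swift
import Photos

// Landscape if the stored width exceeds the height.
func isLandscape(_ asset: PHAsset) -> Bool {
    return asset.pixelWidth > asset.pixelHeight
}
```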

AVFoundation decode ProRes 4444 movie with alpha channel

Question: I'm trying to decode a ProRes 4444 video with an alpha channel on iOS with Swift, to overlay it as a complex animation over a user's video and export it to their library. The AVFoundation documentation is not that great and I'm struggling to find any code examples. When I try to use the code below with AVAssetReaderTrackOutput to decode the video, I get an "AVAssetReaderOutput does not currently support compressed output" error.

let avAssetReaderVideoCompositionOutput =
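A sketch of the kind of output settings AVAssetReader accepts: reader outputs take decompressed pixel-buffer format settings, not codec settings, and requesting a format with an alpha component such as BGRA is one way to read the decoded frames of an alpha-capable source. The asset name is a placeholder for the ProRes 4444 movie mentioned above:

```swift
import AVFoundation

// Configure a track output that delivers decoded BGRA frames.
func makeVideoOutput(for asset: AVAsset) throws -> (AVAssetReader, AVAssetReaderTrackOutput)? {
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    output.alwaysCopiesSampleData = false
    reader.add(output)
    return (reader, output)
}
```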

Processing all frames in an AVAsset

Question: I am trying to go through each frame in an AVAsset and process each frame as if it were an image. I have not been able to find anything from my searches. The task I am trying to accomplish would look like this in pseudo-code:

for each frame in asset
    take the frame as an image and convert to a cvMat
    process and store data of center points
    store center points in array

The only part of that pseudo-code I do not know how to write is going through each frame and capturing it in an image. Can
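A Swift sketch of the "go through each frame" part using AVAssetReader; the per-frame processing (cvMat conversion, center-point extraction) is left to a hypothetical processFrame callback supplied by the caller:

```swift
import AVFoundation
import CoreImage

// Decode every video frame of the asset and hand it to the caller as a CGImage.
func enumerateFrames(of asset: AVAsset, processFrame: (CGImage) -> Void) throws {
    guard let track = asset.tracks(withMediaType: .video).first else { return }
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(output)
    reader.startReading()

    let ciContext = CIContext()
    while let sampleBuffer = output.copyNextSampleBuffer() {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { continue }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        if let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) {
            processFrame(cgImage)   // convert to cvMat and find center points here
        }
    }
}
```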