AVMutableComposition

Swift Merge AVasset-Videos array

Submitted by 狂风中的少年 on 2020-01-22 12:57:10
Question: I want to merge the AVAsset array arrayVideos into one single video and save it to the camera roll. Raywenderlich.com has a great tutorial where two videos are merged into one. I've written the following code, but the video I get after exporting to the camera roll contains only the first and the last video from the array (the videos in the middle of arrayVideos are missing). Am I missing something here? var arrayVideos = [AVAsset]() // videos array var atTimeM: CMTime = CMTimeMake(0, 0) …
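The excerpt above cuts off before the merge loop, so as a point of reference, here is a minimal sketch (not the poster's code; names such as arrayVideos and makeMergedComposition are placeholders) of inserting every asset into a single composition track while advancing one running cursor, which is the usual fix when only some of the clips survive the export:

import AVFoundation

// A running cursor is kept across the whole loop; inserting every asset at
// .zero (or never advancing the cursor) is what typically makes middle clips
// disappear from the exported movie.
func makeMergedComposition(from arrayVideos: [AVAsset]) -> AVMutableComposition {
    let composition = AVMutableComposition()
    let videoTrack = composition.addMutableTrack(withMediaType: .video,
                                                 preferredTrackID: kCMPersistentTrackID_Invalid)
    let audioTrack = composition.addMutableTrack(withMediaType: .audio,
                                                 preferredTrackID: kCMPersistentTrackID_Invalid)

    var cursor = CMTime.zero
    for asset in arrayVideos {
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        if let sourceVideo = asset.tracks(withMediaType: .video).first {
            try? videoTrack?.insertTimeRange(range, of: sourceVideo, at: cursor)
        }
        if let sourceAudio = asset.tracks(withMediaType: .audio).first {
            try? audioTrack?.insertTimeRange(range, of: sourceAudio, at: cursor)
        }
        cursor = cursor + asset.duration   // advance by the clip just inserted
    }
    return composition
}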

Exporting a mirrored video with AVMutableComposition causes resizing issues

Submitted by 心已入冬 on 2020-01-04 06:14:56
Question: Everything works as expected if I turn off mirroring on the front camera. However, if I turn it on, my final exported video has serious resizing problems. This is how I currently manage mirroring for my videos: if currentDevice == frontCamera { if let connection = output.connections.first { if connection.isVideoMirroringSupported { connection.automaticallyAdjustsVideoMirroring = false connection.isVideoMirrored = true // if true, this bug occurs } } } else { // disabling photo mirroring …
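For reference, a minimal sketch (assumed names, not the poster's project) of mirroring at export time with a layer instruction instead of on the capture connection, pairing the horizontal flip with a translation so the frame stays inside the render rectangle:

import AVFoundation
import CoreGraphics

// Builds a layer instruction that mirrors the clip horizontally and then
// translates it back into the render rectangle, so the mirrored frame is not
// pushed off-screen (a common source of "resizing" artefacts).
func mirroredLayerInstruction(for videoTrack: AVCompositionTrack,
                              naturalSize: CGSize) -> AVMutableVideoCompositionLayerInstruction {
    let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
    let mirror = CGAffineTransform(scaleX: -1, y: 1)
        .concatenating(CGAffineTransform(translationX: naturalSize.width, y: 0))
    instruction.setTransform(mirror, at: .zero)
    return instruction
}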

AVFoundation - Videos merge but only last video plays

Submitted by 瘦欲@ on 2020-01-03 17:28:00
Question: I have an array of [AVAsset](). Whenever I record several videos of different durations, the code below merges all the durations into one video, but it only plays the last video on a loop. For example: video1 is 1 minute and shows a dog walking, video2 is 1 minute and shows a bird flying, video3 is 1 minute and shows a horse running. The merged video plays for 3 minutes but only shows the horse running, for 1 minute, three consecutive times. Where am I going wrong? var …
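Since the excerpt ends before the merge code, here is a minimal sketch of an alternative approach that sidesteps the usual cause of this symptom (reusing one clip's time range or start time for every clip): let the composition insert each whole asset at a running start time. Function and variable names here are placeholders:

import AVFoundation

// Lets AVMutableComposition copy every track of each asset itself, advancing
// one running start time, so each clip occupies its own slice of the timeline.
func stitch(_ assets: [AVAsset]) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    var insertAt = CMTime.zero
    for asset in assets {
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        try composition.insertTimeRange(range, of: asset, at: insertAt)
        insertAt = insertAt + asset.duration
    }
    return composition
}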

CATextLayer doesn't appear in an AVMutableComposition when running from a unit test

Submitted by 微笑、不失礼 on 2020-01-03 07:14:09
Question: EDIT: The strangest thing: it seems that when running this code from a full app everything works, but I was always running the movie creation from my unit tests, and only there it didn't work. Trying to figure out why that is... I'm trying to combine video + audio + text using AVMutableComposition and export it to a new video. My code is based on the AVEditDemo from WWDC '10. I added a purple background to the CATextLayer so I can know for a fact it is exported to the movie, but no text …
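For context, a minimal sketch of the standard CATextLayer-over-video setup in the spirit of AVEditDemo (layer sizes and names are placeholders, not the poster's values); note that AVVideoCompositionCoreAnimationTool is only honoured by AVAssetExportSession, not by AVPlayer:

import AVFoundation
import UIKit

// Standard parent/video/text layer sandwich handed to the animation tool.
// The purple background makes the text layer visible even if the text itself
// fails to draw, mirroring the debugging trick described in the question.
func makeVideoComposition(for composition: AVMutableComposition,
                          renderSize: CGSize,
                          text: String) -> AVMutableVideoComposition {
    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: renderSize)

    let textLayer = CATextLayer()
    textLayer.string = text
    textLayer.backgroundColor = UIColor.purple.cgColor
    textLayer.frame = CGRect(x: 0, y: 40, width: renderSize.width, height: 100)

    let parentLayer = CALayer()
    parentLayer.frame = CGRect(origin: .zero, size: renderSize)
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(textLayer)

    let videoComposition = AVMutableVideoComposition(propertiesOf: composition)
    videoComposition.renderSize = renderSize
    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)
    return videoComposition
}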

How do I pause video recording with the iPhone SDK?

Submitted by 笑着哭i on 2020-01-01 07:02:08
Question: I see there is an app called iFile with a pause feature while recording video. How do they do this? I tried using the AVMutableComposition classes: when the user pauses I cut a new video and then merge the videos at the end, but the processing time to merge them is not desirable. Can someone give me other good ideas on how to do this? I noticed the iFile way is very seamless. Thanks. Answer 1: Here are some ideas. I have not tried either of these. If you are using an AVAssetWriter to write …
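A minimal sketch of the AVAssetWriter idea the answer hints at: while paused, drop incoming sample buffers and accumulate the paused duration, then shift the timestamps of later buffers before appending so the written movie has no gap. Class and property names here are placeholders:

import AVFoundation

// Drops buffers while paused and remembers how long the pause lasted, then
// subtracts the accumulated offset from every later presentation timestamp
// before appending to the writer input.
final class PausableRecorder {
    var isPaused = false
    private var pauseStartedAt: CMTime?
    private var totalPausedOffset = CMTime.zero

    func handle(_ sampleBuffer: CMSampleBuffer, input: AVAssetWriterInput) {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

        if isPaused {
            if pauseStartedAt == nil { pauseStartedAt = pts }   // remember when the pause began
            return                                              // drop buffers while paused
        }
        if let pausedSince = pauseStartedAt {
            totalPausedOffset = totalPausedOffset + (pts - pausedSince)
            pauseStartedAt = nil
        }

        // Re-time the buffer so the recording has no hole where the pause was.
        var timing = CMSampleTimingInfo(duration: CMSampleBufferGetDuration(sampleBuffer),
                                        presentationTimeStamp: pts - totalPausedOffset,
                                        decodeTimeStamp: .invalid)
        var adjusted: CMSampleBuffer?
        CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault,
                                              sampleBuffer: sampleBuffer,
                                              sampleTimingEntryCount: 1,
                                              sampleTimingArray: &timing,
                                              sampleBufferOut: &adjusted)
        if let adjusted = adjusted, input.isReadyForMoreMediaData {
            input.append(adjusted)
        }
    }
}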

Overlaying image on video reduces video resolution

Submitted by 有些话、适合烂在心里 on 2019-12-24 20:33:03
Question: When I overlay an image on my video, the video quality is greatly reduced. If I don't set a video composition on the export session, or if I set the export quality to passthrough, the video quality is great (but obviously I get no overlays). I'm passing in a local .mov video URL to add the overlays to. I'm using PHPhotoLibrary to save the video to the camera roll, and some other functions to transform the video and set its instructions. It all seems pretty straightforward, but something is …
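For reference, a minimal sketch (assumed names, not the poster's code) that derives the video composition's renderSize from the source track instead of hard-coding it, which is a common reason an overlaid export comes out soft, and exports with the highest-quality preset:

import AVFoundation

// Sizes the render target from the source track (respecting its preferred
// transform, so portrait clips keep their full resolution) and builds a
// highest-quality export session around the supplied video composition.
func makeExportSession(for asset: AVAsset,
                       videoComposition: AVMutableVideoComposition,
                       outputURL: URL) -> AVAssetExportSession? {
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }

    let transformed = track.naturalSize.applying(track.preferredTransform)
    videoComposition.renderSize = CGSize(width: abs(transformed.width),
                                         height: abs(transformed.height))
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)

    let session = AVAssetExportSession(asset: asset,
                                       presetName: AVAssetExportPresetHighestQuality)
    session?.videoComposition = videoComposition
    session?.outputURL = outputURL
    session?.outputFileType = .mov
    return session
}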

AVMutableComposition - Exporting wrong video transform

Submitted by 一曲冷凌霜 on 2019-12-22 05:16:30
Question: After exporting the video asset, the issues are: the video orientation does not keep the original transform, and the exported video's layer always seems to be landscape. I am trying to: transform the video layer orientation (rotate it back to the original orientation) and make the video layer full-screen size (for the original orientation). Some notes: the videoAsset's CGRect is the opposite from the beginning; after export, the video transform is wrong; I tried to rotate it, with no success, for a full-size layer. AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:url …
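A minimal sketch (in Swift rather than the poster's Objective-C, with placeholder names) of the usual fix: copy the source track's preferredTransform onto the composition track so the exported movie keeps its original orientation instead of defaulting to an identity, landscape-looking transform:

import AVFoundation

// Inserts the asset's video track into the composition and carries the
// original orientation over; without the last line the composition uses an
// identity transform and the export reads as "always landscape".
func addVideo(from asset: AVAsset, to composition: AVMutableComposition) throws {
    guard let sourceTrack = asset.tracks(withMediaType: .video).first,
          let compositionTrack = composition.addMutableTrack(withMediaType: .video,
                                                             preferredTrackID: kCMPersistentTrackID_Invalid)
    else { return }

    try compositionTrack.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                         of: sourceTrack,
                                         at: .zero)
    compositionTrack.preferredTransform = sourceTrack.preferredTransform
}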

Fixing orientation when stitching (merging) videos using AVMutableComposition

Submitted by 只愿长相守 on 2019-12-22 01:42:14
Question: TLDR - SEE EDIT. I am creating a test app in Swift where I want to stitch multiple videos together from my app's documents directory using AVMutableComposition. I have had some success doing this: all my videos are stitched together and everything shows at the correct size, portrait and landscape. My issue, however, is that all the videos show in the orientation of the last video in the composition. I know that to fix this I will need to add layer instructions for each …
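A minimal sketch of one such per-clip layer instruction (names and the track layout are assumptions, not the poster's code), applying each source track's preferredTransform at the clip's start and hiding the clip once it ends so the next one shows through; the resulting instructions would then be collected into an AVMutableVideoCompositionInstruction spanning the whole timeline:

import AVFoundation

// Applies the source clip's original orientation at its start time and makes
// the clip fully transparent when it ends, revealing the next clip's track.
func layerInstruction(for compositionTrack: AVCompositionTrack,
                      sourceTrack: AVAssetTrack,
                      startTime: CMTime,
                      duration: CMTime) -> AVMutableVideoCompositionLayerInstruction {
    let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: compositionTrack)
    instruction.setTransform(sourceTrack.preferredTransform, at: startTime)
    instruction.setOpacity(0.0, at: startTime + duration)
    return instruction
}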