We want to allow the user to place animated "stickers" over video that they record in the app and are considering different ways to composite these stickers.
1. Create a video in code from the frame-based animated stickers (which can be rotated and have translations applied to them) using AVAssetWriter. The problem is that AVAssetWriter only writes to a file and doesn't keep transparency, which would prevent us from being able to overlay it over the video using AVMutableComposition.

2. Create .mov files ahead of time for our frame-based stickers and composite them using AVMutableComposition and layer instructions with transformations. The problem with this is that there are no tools for easily converting our PNG-based frames to a .mov while maintaining an alpha channel, and we'd have to write our own.

3. Create separate CALayers for each frame in the sticker animations. This could potentially create a very large number of layers, depending on the frame rate of the video (a sketch of this approach follows the list).
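For option 3, one way to avoid a separate CALayer per frame is a single sticker layer whose contents property is stepped through the PNG frames with a CAKeyframeAnimation, composited at export time via AVVideoCompositionCoreAnimationTool. This is only a minimal sketch under assumptions: stickerFrames, frameDuration, and stickerRect are placeholder names, and the sticker's rotation/translation would be applied to the layer's transform.

    import AVFoundation
    import UIKit

    // Sketch: build a video composition that overlays an animated sticker
    // (a sequence of PNG frames) on top of an asset's video track at export time.
    func makeStickerComposition(for asset: AVAsset,
                                stickerFrames: [UIImage],
                                frameDuration: CFTimeInterval,
                                stickerRect: CGRect) -> AVMutableVideoComposition? {
        guard let track = asset.tracks(withMediaType: .video).first else { return nil }
        let renderSize = track.naturalSize

        // Parent layer hosts the video layer plus the sticker layer.
        let parentLayer = CALayer()
        let videoLayer = CALayer()
        parentLayer.frame = CGRect(origin: .zero, size: renderSize)
        videoLayer.frame = parentLayer.frame
        parentLayer.addSublayer(videoLayer)

        // One layer whose `contents` is animated through the PNG frames,
        // instead of one CALayer per frame.
        let stickerLayer = CALayer()
        stickerLayer.frame = stickerRect
        let frameImages: [Any] = stickerFrames.compactMap { $0.cgImage }
        let animation = CAKeyframeAnimation(keyPath: "contents")
        animation.values = frameImages
        animation.calculationMode = .discrete
        animation.duration = frameDuration * CFTimeInterval(stickerFrames.count)
        animation.repeatCount = .infinity
        animation.beginTime = AVCoreAnimationBeginTimeAtZero // required for video timelines
        animation.isRemovedOnCompletion = false
        stickerLayer.add(animation, forKey: "stickerFrames")
        parentLayer.addSublayer(stickerLayer)

        let composition = AVMutableVideoComposition(propertiesOf: asset)
        composition.animationTool = AVVideoCompositionCoreAnimationTool(
            postProcessingAsVideoLayer: videoLayer, in: parentLayer)
        return composition
    }

The resulting composition would then be assigned to an AVAssetExportSession's videoComposition before exporting; this path only works for export, not for live playback of the animation tool.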
Or any better ideas?
Thanks.
I would suggest that you take a look at my blog post on this specific subject. Basically, the example there shows how RGBA video data can be loaded from a file attached to the app resources. That data is imported from a .mov containing Animation-codec RGBA data on the desktop; a conversion step is required to get the data from the desktop onto iOS, since plain H.264 cannot support an alpha channel directly (as you have discovered). Note that older hardware may have issues decoding an H.264 user-recorded video and then another H.264 stream on top of that, so this approach of using the CPU instead of the H.264 hardware for the sticker is actually better.
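To illustrate the general idea of CPU-side compositing (not the specific decoder or file format from the blog post), the sketch below draws one RGBA sticker frame, with its rotation/translation, over one opaque video frame using Core Graphics; videoFrame, stickerFrame, and stickerTransform are assumed inputs produced elsewhere in the pipeline.

    import CoreGraphics

    // Illustrative only: composite one RGBA sticker frame over one opaque
    // video frame on the CPU with Core Graphics.
    func composite(videoFrame: CGImage,
                   stickerFrame: CGImage,
                   stickerTransform: CGAffineTransform) -> CGImage? {
        let width = videoFrame.width
        let height = videoFrame.height
        guard let ctx = CGContext(data: nil,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: 0,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) else {
            return nil
        }

        // Draw the opaque video frame first, then the sticker with its alpha
        // channel and any rotation/translation applied on top.
        ctx.draw(videoFrame, in: CGRect(x: 0, y: 0, width: width, height: height))
        ctx.concatenate(stickerTransform)
        ctx.draw(stickerFrame, in: CGRect(x: 0, y: 0,
                                          width: stickerFrame.width,
                                          height: stickerFrame.height))
        return ctx.makeImage()
    }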
Source: https://stackoverflow.com/questions/38131274/whats-the-best-way-to-composite-frame-based-animated-stickers-over-recorded-vid