avassetreader

How to get the timestamp of each video frame in iOS while decoding a video.mp4

孤者浪人 submitted on 2019-12-11 06:02:01
Question: Scenario: I am writing an iOS app to decode a videoFile.mp4. I am using AVAssetReaderTrackOutput with AVAssetReader to decode frames from the video file. This works very well; I get each and every frame from videoFile.mp4, basically using the following logic at the core. Code: AVAssetReader *videoFileReader; AVAssetReaderTrackOutput *assetReaderOutput = [videoFileReader.outputs objectAtIndex:0]; CMSampleBufferRef sampleBuffer = [assetReaderOutput copyNextSampleBuffer]; sampleBuffer is the
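On that note, each CMSampleBuffer returned by copyNextSampleBuffer carries its own presentation timestamp, exposed by CMSampleBufferGetPresentationTimeStamp. A minimal Swift sketch of the idea (untested here; `videoURL` is an assumed local file and error handling is elided):

```swift
import AVFoundation

// Sketch: print the presentation timestamp of every decoded frame.
func printFrameTimestamps(videoURL: URL) throws {
    let asset = AVAsset(url: videoURL)
    guard let track = asset.tracks(withMediaType: .video).first else { return }
    let reader = try AVAssetReader(asset: asset)
    // nil outputSettings would return compressed samples; request decoded pixels.
    let output = AVAssetReaderTrackOutput(track: track, outputSettings:
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(output)
    reader.startReading()
    while let sampleBuffer = output.copyNextSampleBuffer() {
        // Each sample buffer knows when it should be displayed.
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        print("frame at \(CMTimeGetSeconds(pts)) s")
    }
}
```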

AVFoundation decode prores4444 movie with alpha channel

我怕爱的太早我们不能终老 submitted on 2019-12-09 04:30:33
I'm trying to decode a ProRes 4444 video with an alpha channel on iOS with Swift, to overlay it as a complex animation over a user video and export it to their library. The AVFoundation documentation is not that great and I'm struggling to find any code examples. When I try to use the code below with AVAssetReaderTrackOutput to decode the video, I get an "AVAssetReaderOutput does not currently support compressed output" error. let avAssetReaderVideoCompositionOutput = AVAssetReaderVideoCompositionOutput(videoTracks: [videoOverlayAssetTrack], videoSettings: outputSettings: [AVVideoCodecKey:
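The "compressed output" error typically means the reader output was created without decompressed video settings; supplying a pixel-buffer format that carries alpha avoids it. A hedged sketch (untested; whether BGRA is the best alpha-capable choice for your pipeline is an assumption):

```swift
import AVFoundation

// Sketch: request decoded BGRA pixel buffers (alpha-capable) instead of
// compressed ProRes frames. Passing compressed/nil settings to the output
// is what triggers "does not currently support compressed output".
func makeAlphaOutput(for track: AVAssetTrack) -> AVAssetReaderTrackOutput {
    let settings: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    return AVAssetReaderTrackOutput(track: track, outputSettings: settings)
}
```

After reader.add(_:) and startReading(), each copyNextSampleBuffer() should yield a CVPixelBuffer whose BGRA alpha channel preserves the ProRes 4444 alpha.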

How to correctly read decoded PCM samples on iOS using AVAssetReader — currently incorrect decoding

一个人想着一个人 submitted on 2019-12-08 22:48:29
Question: I am currently working on an application as part of my Bachelor in Computer Science. The application will correlate data from the iPhone hardware (accelerometer, GPS) and music that is being played. The project is still in its infancy, having worked on it for only 2 months. Where I am right now, and where I need help, is reading PCM samples from songs in the iTunes library and playing them back using an audio unit. Currently the implementation I would like working does the
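For reference, reading decoded PCM with AVAssetReader comes down to the outputSettings dictionary handed to AVAssetReaderTrackOutput; the reader then delivers uncompressed linear PCM sample buffers. A sketch under the assumption of 16-bit interleaved integer samples (untested here):

```swift
import AVFoundation

// Sketch: ask the asset reader to decode the audio track to 16-bit,
// interleaved, native-endian linear PCM. Sample rate and channel count
// default to the source's.
func makePCMOutput(for audioTrack: AVAssetTrack) -> AVAssetReaderTrackOutput {
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsBigEndianKey: false,
        AVLinearPCMIsNonInterleaved: false
    ]
    return AVAssetReaderTrackOutput(track: audioTrack, outputSettings: settings)
}
```

Each copyNextSampleBuffer() then yields a CMSampleBuffer whose data buffer (CMSampleBufferGetDataBuffer) contains raw PCM frames in a layout an audio unit can consume.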

.m4a raw data from iPod Library not playing

倾然丶 夕夏残阳落幕 submitted on 2019-12-07 16:59:27
Question: So I am faced with a very weird and strange problem and was wondering if anyone else has come across this issue. I am grabbing the raw data of an MPMediaItem from the phone's music library and then sending it out via HTTP to be played elsewhere. Where my issue arises is when I grab the raw data from a file of type .m4a; it seems to be missing pieces. For example, if the original file that I check from iTunes is 7.4 MB, what I'll get from my code is 7.3 MB. I've done some research
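One plausible explanation for the size difference is the MPEG-4 container itself: an .m4a file is a sequence of boxes ("ftyp", "moov", "mdat", ...), and if the extraction drops container metadata the result is smaller and unplayable. A simplified, hypothetical box walker in Swift to inspect what actually arrived (real files also use 64-bit extended box sizes, which this ignores):

```swift
import Foundation

// Simplified sketch: walk top-level MP4 boxes (4-byte big-endian size followed
// by a 4-character type) and report which box types are present. A stream that
// lacks its 'moov' box cannot be played by ordinary players.
func topLevelBoxTypes(_ data: Data) -> [String] {
    var types: [String] = []
    var offset = 0
    while offset + 8 <= data.count {
        let size = data[offset..<offset+4].reduce(0) { ($0 << 8) | Int($1) }
        let type = String(bytes: data[offset+4..<offset+8], encoding: .ascii) ?? "????"
        types.append(type)
        guard size >= 8 else { break }  // malformed or extended-size box; stop
        offset += size
    }
    return types
}
```

If "moov" is absent from the streamed bytes, the receiver cannot interpret the audio; exporting with AVAssetExportSession and the passthrough preset is the usual way to obtain a complete, playable file.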

How to control video frame rate with AVAssetReader and AVAssetWriter?

一世执手 submitted on 2019-12-06 08:28:51
Question: We are trying to understand how to control/specify the frame rate for videos that we are encoding with AVAssetReader and AVAssetWriter. Specifically, we are using AVAssetReader and AVAssetWriter to transcode/encode/compress a video that we have accessed from the photo/video gallery. We are able to control things like bit rate, aspect ratio changes, etc., but cannot figure out how to control the frame rate. To be specific, we'd like to be able to take as input a 30 FPS video that's 5
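The usual approach is to keep the reader/writer pipeline but drop sample buffers before appending them to the writer input. The selection logic itself is plain arithmetic; here is a hedged, AVFoundation-free sketch of it (timestamps in seconds; the `framesToKeep` name is ours, not an API):

```swift
import Foundation

// Sketch of the retiming arithmetic: given source frame timestamps and a
// target fps, keep the first frame whose timestamp reaches each target slot.
func framesToKeep(sourceTimestamps: [Double], targetFPS: Double) -> [Int] {
    var kept: [Int] = []
    var nextSlot = 0.0
    for (i, t) in sourceTimestamps.enumerated() where t >= nextSlot - 1e-9 {
        kept.append(i)
        nextSlot += 1.0 / targetFPS  // advance to the next output-frame slot
    }
    return kept
}
```

For 30 FPS in and 15 FPS out this keeps every other frame; appending the kept buffers to the AVAssetWriterInput with their original presentation timestamps preserves the 5-minute duration at the lower rate.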

AVFoundation Reverse Video

僤鯓⒐⒋嵵緔 submitted on 2019-12-06 00:05:14
Question: I tried to make a video play in reverse. While playing the asset in AVPlayer I set rate = -1 to make it play in reverse. But how do I export that video? I looked into the docs and read about AVAssetWriter, sample buffers, and compositions, but didn't find any way to do this. The links below were referenced by me: http://www.raywenderlich.com/13418/how-to-play-record-edit-videos-in-ios Reverse video demo - https://github.com/mikaelhellqvist/ReverseClip The above example no longer works in iOS 8, and even it is not
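For what it's worth, the export side can be done by decoding every frame with AVAssetReader, then writing the pixel buffers back in reverse order while reusing the original ascending timestamps. A rough, memory-hungry sketch of just the writing step (untested; it assumes all frames were already collected in RAM, so it only suits short clips):

```swift
import AVFoundation

// Sketch: pair the i-th output slot's original timestamp with the
// (n-1-i)-th pixel buffer, so playback order is reversed but timing is kept.
func appendReversed(_ frames: [(buffer: CVPixelBuffer, pts: CMTime)],
                    adaptor: AVAssetWriterInputPixelBufferAdaptor,
                    input: AVAssetWriterInput) {
    let reversedBuffers = frames.map { $0.buffer }.reversed()
    for (original, buffer) in zip(frames, reversedBuffers) {
        while !input.isReadyForMoreMediaData { usleep(1_000) }  // crude backpressure
        adaptor.append(buffer, withPresentationTime: original.pts)
    }
    input.markAsFinished()
}
```

The surrounding AVAssetWriter setup (startWriting, startSession(atSourceTime:), finishWriting) is standard and omitted here.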

.m4a raw data from iPod Library not playing

我怕爱的太早我们不能终老 submitted on 2019-12-05 21:58:45
So I am faced with a very weird and strange problem and was wondering if anyone else has come across this issue. I am grabbing the raw data of an MPMediaItem from the phone's music library and then sending it out via HTTP to be played elsewhere. Where my issue arises is when I grab the raw data from a file of type .m4a; it seems to be missing pieces. For example, if the original file that I check from iTunes is 7.4 MB, what I'll get from my code is 7.3 MB. I've done some research and found that a .m4a file is actually an encapsulation, and I think I am not getting the encapsulation of

Get a particular frame by time value using AVAssetReader

六眼飞鱼酱① submitted on 2019-12-05 09:52:02
Question: I have checked this http://www.7twenty7.com/blog/2010/11/video-processing-with-av-foundation to get a video frame by frame. But my real requirement is to get a frame at a particular time. I know that it should be possible with AVAssetReader; I wonder whether there is any direct method for this in AVAssetReader. Please give any guidance on how I can get a frame at a particular time. I checked AVAssetImageGenerator but this is not the thing I really wanted. Finally I found the answer: you have to use
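The property the truncated answer is presumably pointing at is AVAssetReader's timeRange, which must be set before startReading(); the first copyNextSampleBuffer() then yields the frame at or just after the requested time. A hedged sketch (untested; the half-second window is an arbitrary choice):

```swift
import AVFoundation

// Sketch: seek the reader to a specific time instead of decoding from zero.
func firstFrame(of asset: AVAsset, atSeconds seconds: Double) throws -> CMSampleBuffer? {
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: track, outputSettings:
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(output)
    // Restrict decoding to a narrow window around the requested time.
    // This must be set before startReading().
    reader.timeRange = CMTimeRange(
        start: CMTime(seconds: seconds, preferredTimescale: 600),
        duration: CMTime(seconds: 0.5, preferredTimescale: 600))
    reader.startReading()
    return output.copyNextSampleBuffer()
}
```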

How to control video frame rate with AVAssetReader and AVAssetWriter?

一世执手 submitted on 2019-12-04 15:56:51
We are trying to understand how to control/specify the frame rate for videos that we are encoding with AVAssetReader and AVAssetWriter. Specifically, we are using AVAssetReader and AVAssetWriter to transcode/encode/compress a video that we have accessed from the photo/video gallery. We are able to control things like bit rate, aspect ratio changes, etc., but cannot figure out how to control the frame rate. To be specific, we'd like to be able to take as input a 30 FPS video that's 5 minutes long and emit a 5-minute video at 15 FPS. Our current loop that processes sample buffers is:

Why does CMSampleBufferGetImageBuffer return NULL

十年热恋 submitted on 2019-12-04 06:10:26
I have built some code to process video files on OS X, frame by frame. The following is an extract from the code, which builds OK, opens the file, locates the video track (the only track), and starts reading CMSampleBuffers without problem. However, each CMSampleBufferRef I obtain returns NULL when I try to extract the pixel buffer frame. There's no indication in the iOS documentation as to why I could expect a NULL return value or how I could fix the issue. It happens with all the videos on which I've tested it, regardless of capture source or codec. Any help greatly appreciated. NSString
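A common cause of this symptom is creating the AVAssetReaderTrackOutput with nil outputSettings: the reader then hands back the still-compressed samples, whose payload lives in a CMBlockBuffer, so CMSampleBufferGetImageBuffer has no CVPixelBuffer to return. Requesting decoded pixels fixes it; a sketch in Swift rather than the question's Objective-C (untested here):

```swift
import AVFoundation

// Sketch: with decompressed output settings each sample buffer carries a
// CVImageBuffer. With nil settings it carries a compressed CMBlockBuffer
// instead, and CMSampleBufferGetImageBuffer returns NULL.
func processPixelBuffers(of track: AVAssetTrack, in asset: AVAsset) throws {
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: track, outputSettings:
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_422YpCbCr8])
    reader.add(output)
    reader.startReading()
    while let sample = output.copyNextSampleBuffer() {
        if let imageBuffer = CMSampleBufferGetImageBuffer(sample) {
            _ = imageBuffer  // process the decoded frame here
        }
    }
}
```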