AVFoundation

Reverse video playback in iOS

心已入冬 Submitted on 2019-11-29 03:51:30

I want to play video backwards in AVPlayer. I have tried changing the rate property to -1.0, and although it did work, playback was not smooth. Is there any way to play videos backwards smoothly?

As stated in the comments, the problem is with keyframes and the fact that most codecs are not designed to play backwards. There are two options for re-encoding the video that don't require you to actually reverse it in an editor. Make every frame a keyframe: I've seen this work well for codecs like H.264 that rely on keyframes. Basically, if every frame is a keyframe, then each frame

AVFoundation - Play and record video (along with audio and preview) simultaneously

旧街凉风 Submitted on 2019-11-29 03:46:34

Question: I am trying to record and play video simultaneously. Is this possible with AVFoundation? Currently I am able to do it as long as I don't record audio. As soon as I add an audio input to the AVCaptureSession and restart the whole thing, I receive AVCaptureSessionWasInterruptedNotification and recording stops. This is how I play the video:

MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL fileURLWithPath:path]];
[moviePlayer.view setFrame:self.playerView

I want installTapOnBus:bufferSize:format:block: to be called 20 times per second

偶尔善良 Submitted on 2019-11-29 02:56:42

Question: I want to display a waveform in real time from microphone input. I implemented this using installTapOnBus:bufferSize:format:block:, but the block is only called three times per second. I want it to be called 20 times per second. Where can I set this?

AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError *error = nil;
if (audioSession.isInputAvailable)
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
if (error) {
    return;
}

Extracting ID3 tags from MP3 over HTTP Live Streaming

我怕爱的太早我们不能终老 Submitted on 2019-11-29 02:50:08

Question: I've been having quite a difficult time extracting ID3 information from an MP3 being streamed over HTTP Live Streaming (using the Wowza media server, if anyone is curious). I know that the tags (right now the album tag and the album-artwork tag) are properly embedded in each of the file segments, because when I download the segments listed in the .m3u index file generated by the server, I can see the tags in each one. I am using the AVFoundation classes to do this, and I have it setup

Determine the corners of a sheet of paper with iOS 5 AV Foundation and core-image in realtime

佐手、 Submitted on 2019-11-29 02:42:10

I am currently building a camera app prototype that should recognize sheets of paper lying on a table. The catch is that it should do the recognition in real time, so I capture the camera's video stream, which in iOS 5 can easily be done with AV Foundation. I looked here and here; they do some basic object recognition. I have found that the OpenCV library does not perform well in this real-time environment, so what I need is an algorithm to determine the edges of an image without OpenCV. Does anyone have some sample code snippets which lay

How to fix my orientation issue with merging videos from front and back camera

微笑、不失礼 Submitted on 2019-11-29 02:37:43

I am merging multiple videos (an implementation of a pause button), and everything works fine except when merging video from the back camera with video from the front camera: one of the videos comes out upside down in the merged video. My code:

let mixComposition = AVMutableComposition()
let videoTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())
let trackAudio = mixComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
var insertTime = kCMTimeZero
for var i = 0; i < currentAssets.count;

Preventing AVCaptureVideoPreviewLayer from rotating, but allow UI layer to rotate with orientation

对着背影说爱祢 Submitted on 2019-11-29 02:34:40

I have two view controllers. One is the root VC and contains the UI, such as the record button. On this view controller, I also display the view of another VC at index 0. That view contains an AVCaptureVideoPreviewLayer. I would like my video camera to mimic the Apple Camera app, where the interface layout adjusts with rotation but the video preview layer does not. You can see how the recording timer (a UILabel) in the stock Camera app disappears and reappears at the top depending on the orientation. Any idea how to do this? I found one suggestion that recommends adding the

iOS AVFoundation - Converting video into images at 60 fps

我的未来我决定 Submitted on 2019-11-29 02:30:26

I'm trying to convert a whole video into a sequence of images at a rate of 60 fps, which means 60 images generated per second of video. To do so, I'm using AVAssetImageGenerator and the generateCGImagesAsynchronouslyForTimes method. Things go quite well, except that I'm having serious performance issues with the batch-processing execution time (approximately 5 minutes for a 13-second video). Moreover, above the size CGSizeMake(512, 324) I experience crashes. Did anyone already have experience with this kind of processing and know how to reduce this execution time

Recording Audio and Video using AVFoundation frame by frame

女生的网名这么多〃 Submitted on 2019-11-29 02:26:19

How can I record audio and video using AVFoundation frame by frame in iOS 4? The AVCamDemo you mention is close to what you need and can serve as a reference. These are the classes (all part of AVFoundation) you need in order to achieve what you are trying to do. You need AVCaptureVideoDataOutput and AVCaptureAudioDataOutput - use these classes to get raw samples from the video camera and the microphone. Use AVAssetWriter and AVAssetWriterInput in order to encode the raw samples into a file - the following sample Mac OS X project shows

Determine Number of Frames in a Core Audio AudioBuffer

前提是你 Submitted on 2019-11-29 02:24:09

I am trying to access the raw data of an audio file on the iPhone/iPad. The following code is a basic start down the path I need; however, I am stumped at what to do once I have an AudioBuffer.

AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:urlAsset error:nil];
AVAssetReaderTrackOutput *assetReaderOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:[[urlAsset tracks] objectAtIndex:0] outputSettings:nil];
[assetReader addOutput:assetReaderOutput];
[assetReader startReading];
CMSampleBufferRef ref;
NSArray *outputs = assetReader.outputs;