avfoundation

repeat AVPlayerItem in AVQueuePlayer

ぐ巨炮叔叔 submitted on 2019-12-07 09:41:10
Question: I am using AVQueuePlayer to play my list of videos. I want one video to play continuously until I explicitly call for the second video. At the moment, video1 plays and, when it ends, it calls for video2 to play, which I don't want. Secondly, there is a delay between the two videos. Is there any way to transition smoothly from the first video to the second? Answer 1: Regarding your first question: set the AVQueuePlayer's actionAtItemEnd property to AVPlayerActionAtItemEndNone. Then register for
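The answer excerpt above can be fleshed out as a short sketch: set actionAtItemEnd, then observe the end-of-item notification and rewind instead of advancing. `self.player` is an assumed property holding your AVQueuePlayer.

```objc
// Keep the current item looping until the app explicitly advances the queue.
self.player.actionAtItemEnd = AVPlayerActionAtItemEndNone;

[[NSNotificationCenter defaultCenter] addObserverForName:AVPlayerItemDidPlayToEndTimeNotification
                                                  object:self.player.currentItem
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    AVPlayerItem *item = note.object;
    [item seekToTime:kCMTimeZero];   // rewind the same item instead of advancing
    [self.player play];
}];

// When the second video should actually start:
// [self.player advanceToNextItem];
```

Because the player never tears down the item at its end, the rewind is also gapless, which addresses the delay between loops of the same video.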

AVCaptureSession Pause and Resume recording

坚强是说给别人听的谎言 submitted on 2019-12-07 09:20:04
Question: I am making a movie app for iOS 5.0 using AVCaptureSession. I am giving the user the ability to start, pause, resume, and stop recording a movie. The three buttons I have defined are Start Recording, Stop Recording, and Pause Recording. I am able to successfully start and stop a recording. What I am unable to do is pause a recording and then resume it. I looked at this question/answer on Stack Overflow, but I have no idea how they are pausing and resuming the video. I did find some other posts here but
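One common approach (an assumption about the setup, not the asker's code): record through AVCaptureVideoDataOutput into a single AVAssetWriter, and while "paused" simply drop sample buffers, shifting timestamps on resume so the output has no gap. Properties like `self.paused` and `self.timeOffset` are illustrative.

```objc
// AVCaptureVideoDataOutput delegate callback; skip frames while paused.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (self.paused) {
        // Remember when the pause happened so the gap can be measured.
        self.lastPauseTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        return; // drop frames while paused
    }
    // On the first frame after resume, grow self.timeOffset by the
    // length of the pause, then retime the buffer by that offset
    // (CMSampleBufferCreateCopyWithNewTiming) before appending it
    // to the asset writer input.
}
```

AVCaptureMovieFileOutput on iOS 5 had no native pause, which is why this drop-and-retime pattern shows up in the posts the asker links to.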

Add chapter information to existing video with AVFoundation

左心房为你撑大大i submitted on 2019-12-07 09:15:35
I am trying to add chapter markers (text + images) to an existing video in iOS. Reading them is trivially easy with built-in functions: NSLocale *locale = [chapterLocalications lastObject]; NSLog(@"Locale: %@", [locale localeIdentifier]); NSArray *keys = @[AVMetadataCommonKeyTitle, AVMetadataCommonKeyArtwork]; NSArray *chapters = [asset chapterMetadataGroupsWithTitleLocale:locale containingItemsWithCommonKeys:keys]; for (AVTimedMetadataGroup *metadataGroup in chapters) { NSArray *items = metadataGroup.items; CMTimeRange timeRange = metadataGroup.timeRange; NSLog(@"time: %@",
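Writing chapters back is the harder half. From iOS 8 on, timed metadata can be written with AVAssetWriterInputMetadataAdaptor; the sketch below is a hedged outline of that route (the identifier, time range, and the pre-configured `assetWriterInput` of media type AVMediaTypeMetadata are assumptions, and the format-description setup is abbreviated).

```objc
// Build one chapter title as a timed metadata group and append it.
AVMutableMetadataItem *title = [AVMutableMetadataItem metadataItem];
title.identifier = AVMetadataCommonIdentifierTitle;
title.value = @"Chapter 1";
title.dataType = (NSString *)kCMMetadataBaseDataType_UTF8;

AVTimedMetadataGroup *group =
    [[AVTimedMetadataGroup alloc] initWithItems:@[title]
                                      timeRange:CMTimeRangeMake(kCMTimeZero,
                                                                CMTimeMake(30, 1))];

// assetWriterInput must have been created with AVMediaTypeMetadata.
AVAssetWriterInputMetadataAdaptor *adaptor =
    [AVAssetWriterInputMetadataAdaptor assetWriterInputMetadataAdaptorWithAssetWriterInput:assetWriterInput];
[adaptor appendTimedMetadataGroup:group];
```

On iOS 7 and earlier there is no AVFoundation API for authoring chapter tracks, which is the wall this question usually runs into.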

iOS AVFoundation Video Capture Orientation Options

三世轮回 submitted on 2019-12-07 09:01:30
Question: I have an app in which I would like video capture from the front-facing camera only. That's no problem. But I would like the video capture to always be in landscape, even when the phone is held in portrait. I have a working implementation based on the AVCamDemo code that Apple published, and borrowing from the information in this tech note, I am able to specify the orientation. There's just one trick: while the video frame is oriented correctly, the contents still appear as though
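The orientation the question refers to is set on the output's capture connection; a minimal sketch, assuming `movieOutput` is an AVCaptureMovieFileOutput already attached to the session:

```objc
// Lock recorded video to landscape regardless of how the phone is held.
AVCaptureConnection *connection =
    [movieOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.isVideoOrientationSupported) {
    connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
}
```

Note this only rotates the frame's orientation metadata and buffer layout; it does not re-compose what the sensor sees, which is exactly the "contents still appear as though..." symptom the excerpt breaks off on.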

Memory Management issue with AVAssetWriter in iPhone?

北城以北 submitted on 2019-12-07 08:08:36
Question: I have successfully created a video from UIImages using AVAssetWriter, but as soon as the writer starts writing the video there is a sudden rise in memory allocation in Instruments. The spike in memory allocation goes from 3-4 MB to 120 MB and then cools off. I have used the following code for this: -(void)writeImageAsMovie:(NSArray *)array toPath:(NSString *)path size:(CGSize)size { NSMutableDictionary *attributes = [[NSMutableDictionary alloc] init]; [attributes setObject:[NSNumber
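A common mitigation for this kind of spike (an assumption, not the asker's confirmed fix): drain autoreleased Core Graphics objects once per frame and reuse buffers from the adaptor's pixel buffer pool rather than allocating a fresh CVPixelBuffer per image. `adaptor` and `presentTime` are assumed to come from the surrounding writer setup.

```objc
// Convert each UIImage inside its own autorelease pool, reusing pooled buffers.
for (UIImage *image in array) {
    @autoreleasepool {
        CVPixelBufferRef buffer = NULL;
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                           adaptor.pixelBufferPool,
                                           &buffer);
        // ... render image.CGImage into buffer with a CGBitmapContext ...
        [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
        CVPixelBufferRelease(buffer);
    }
}
```

Without the per-frame pool, every intermediate CGImage and buffer lives until the enclosing pool drains, which is what a one-shot 120 MB spike that later "cools off" typically indicates.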

playing video in uitableview cell

☆樱花仙子☆ submitted on 2019-12-07 07:49:32
Question: I am trying to set up a UITableView that can play videos. Many of the previous SO questions on this used MPMoviePlayer (Playing Video into UITableView, Playing video in UITableView in SWIFT, Playing Video From UITableView), which is now deprecated in iOS 9. One of the few that used AVFoundation (what I'm using) is this one: Play video on UITableViewCell when it is completely visible, and it is where I'm getting most of my code from. Here is my code, inside cellForRowAtIndexPath: VideoCell
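A minimal sketch of an AVFoundation-backed cell (the class name VideoCell comes from the question; the method and layer handling are illustrative). The player layer is created once and reused across dequeues, which is the usual way to keep scrolling smooth:

```objc
@implementation VideoCell {
    AVPlayer      *_player;
    AVPlayerLayer *_playerLayer;
}

// Called from cellForRowAtIndexPath: with the row's video URL.
- (void)configureWithURL:(NSURL *)url {
    if (!_playerLayer) {
        _player = [AVPlayer playerWithURL:url];
        _playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
        _playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        [self.contentView.layer addSublayer:_playerLayer];
    } else {
        // Reused cell: swap the item instead of rebuilding the layer.
        [_player replaceCurrentItemWithPlayerItem:
            [AVPlayerItem playerItemWithURL:url]];
    }
    _playerLayer.frame = self.contentView.bounds;
}
@end
```

Playback start/stop would then be driven from scrollViewDidScroll: when the cell becomes fully visible, as in the linked answer.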

Ios rotate, filter video stream in ios

亡梦爱人 submitted on 2019-12-07 07:20:48
Question: Hello there. I am rotating and applying image filters with GPUImage on a live video stream. The task is consuming more time than expected, resulting in overheating of the iPhone. Can anybody help me optimise my code? The following is the code I am using: - (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer { //return if invalid sample buffer if (!CMSampleBufferIsValid(sampleBuffer)) { return; } //Get CGImage from sample buffer CGImageRef cgImageFromBuffer = [self cgImageFromSampleBuffer:sampleBuffer]
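The snippet round-trips every frame through CGImage on the CPU, which is a likely source of the heat. One optimisation worth trying (an assumption about the setup, using GPUImage's own pipeline): let GPUImage do both rotation and filtering on the GPU and never touch CGImage per frame. The preset, filter, and `filteredVideoView` below are illustrative.

```objc
// Keep the whole rotate-and-filter pipeline on the GPU.
GPUImageVideoCamera *camera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                        cameraPosition:AVCaptureDevicePositionBack];
camera.outputImageOrientation = UIInterfaceOrientationPortrait; // GPU-side rotation

GPUImageSepiaFilter *filter = [[GPUImageSepiaFilter alloc] init]; // any filter
[camera addTarget:filter];
[filter addTarget:filteredVideoView]; // a GPUImageView in the UI
[camera startCameraCapture];
```

With this arrangement frames stay in OpenGL textures end to end, so the per-frame CPU cost drops to almost nothing.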

Take a screenshot of an UIView where its subviews are camera sessions

梦想的初衷 submitted on 2019-12-07 06:46:15
Question: I'm building an app where I need to take a screenshot of a view whose subviews are camera sessions (AVFoundation sessions). I've tried this code: CGRect rect = [self.containerView bounds]; UIGraphicsBeginImageContextWithOptions(rect.size, YES, 0.0f); CGContextRef context = UIGraphicsGetCurrentContext(); [self.containerView.layer renderInContext:context]; UIImage *capturedImage = UIGraphicsGetImageFromCurrentImageContext(); UIGraphicsEndImageContext(); which effectively gets me a UIImage with
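renderInContext: cannot read the contents of an AVCaptureVideoPreviewLayer, which is why the snapshot comes back without the camera feed. On iOS 7+ the snapshot API is usually the first thing to try (a sketch; it can still return black for some capture layers, in which case the fallback is to grab a still from the capture output and composite it manually):

```objc
// Snapshot the view hierarchy rather than rendering its layer tree.
UIGraphicsBeginImageContextWithOptions(self.containerView.bounds.size, YES, 0.0f);
[self.containerView drawViewHierarchyInRect:self.containerView.bounds
                         afterScreenUpdates:YES];
UIImage *capturedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```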

Can’t observe AVPlayerItem for @“status” key

倖福魔咒の submitted on 2019-12-07 06:19:50
Question: I'm trying to play a stream from a URL, but here is the issue: observeValueForKeyPath:ofObject:change:context: just doesn't execute. As I understand it, this does not depend on whether streamURL is correct or not: the status must change from AVPlayerItemStatusUnknown to AVPlayerItemStatusReadyToPlay or AVPlayerItemStatusFailed either way. Nonetheless, I checked that URL in a browser, and it works fine. So, what's the problem? I watched a WWDC video about this, and it says that if you have an audio stream, use
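For the callback to fire, the observer must be added to the AVPlayerItem itself (not the AVPlayer), and the item must be kept alive by a strong reference. A hedged sketch, with `self.playerItem`/`self.player` as assumed properties:

```objc
- (void)setUpPlayerWithURL:(NSURL *)streamURL {
    // Hold the item strongly and observe it BEFORE attaching it to a player.
    self.playerItem = [AVPlayerItem playerItemWithURL:streamURL];
    [self.playerItem addObserver:self
                      forKeyPath:@"status"
                         options:NSKeyValueObservingOptionNew
                         context:nil];
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"status"] &&
        self.playerItem.status == AVPlayerItemStatusReadyToPlay) {
        [self.player play];
    }
}
```

If the item is a local variable it is deallocated before the status ever changes, which produces exactly the "observer never executes" symptom described here.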

iOS- how to get the duration of .mp4 file by using AVAsset or AVURLAsset

試著忘記壹切 submitted on 2019-12-07 06:14:01
Question: I know duration-of-video questions have been answered before, but I am facing real trouble getting the duration of an .mp4 file using AVAsset and AVURLAsset. I am using the following code: NSString *itemPathString = [[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0] stringByAppendingPathComponent:obmodel.actualname]; NSURL *itemPathURL = [NSURL URLWithString:itemPathString]; if([[NSFileManager defaultManager] fileExistsAtPath:itemPathString]
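The snippet builds a URL from a Documents-directory path with URLWithString:, which yields an invalid URL for a local file; the likely fix (an assumption based on the visible code) is fileURLWithPath:. A sketch, reusing the question's variable names:

```objc
// File paths need a file URL, not a string URL.
NSURL *itemPathURL = [NSURL fileURLWithPath:itemPathString];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:itemPathURL options:nil];
CMTime duration = asset.duration;           // loads synchronously when accessed
Float64 seconds = CMTimeGetSeconds(duration);
NSLog(@"duration: %.2f s", seconds);
```

For large files, loadValuesAsynchronouslyForKeys:@[@"duration"] avoids blocking the main thread while the duration is read.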