avfoundation

How to mirror iOS screen via USB?

Submitted by 我的梦境 on 2019-12-03 14:10:40
Question: I'm trying to mirror an iOS device's screen to OS X over a USB connection. QuickTime does this fine, and I read this article with a code example: https://nadavrub.wordpress.com/2015/07/06/macos-media-capture-using-coremediaio/ However, the callback of CMIOStreamCopyBufferQueue is never called, and I'm wondering what I am doing wrong. Has anyone faced this issue and can provide a working example? Thanks.

Answer 1: Well, eventually I did what Nadav told me in his blog - discover DAL devices and capture…
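The approach in the linked blog post requires opting in to screen-capture DAL devices first; without this, the iOS device never appears and no stream callback fires. A minimal sketch of that opt-in, using the property names from CoreMediaIO's CMIOHardware.h (timing of device discovery after the call is an assumption):

```objc
#import <CoreMediaIO/CMIOHardware.h>

// Allow CoreMediaIO to expose screen-capture DAL devices (e.g. a
// USB-connected iOS device) to the capture stack.
static void EnableScreenCaptureDevices(void) {
    CMIOObjectPropertyAddress prop = {
        kCMIOHardwarePropertyAllowScreenCaptureDevices,
        kCMIOObjectPropertyScopeGlobal,
        kCMIOObjectPropertyElementMaster
    };
    UInt32 allow = 1;
    CMIOObjectSetPropertyData(kCMIOObjectSystemObject, &prop,
                              0, NULL, sizeof(allow), &allow);
}
// After this call (and a short delay while the device is discovered),
// the iOS screen should show up as a regular muxed AVCaptureDevice.
```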

Fast switching between videos using AVFoundation

Submitted by 你说的曾经没有我的故事 on 2019-12-03 13:36:29
Question: I'm writing an application where the user can record up to 6 video clips, each with a duration of 2 seconds. Once the clips are recorded, the user can play with them using 6 buttons - one for each clip. The user can then record a movie by switching between the 6 clips. The problem is that I need near-instantaneous switching between the 6 clips when the user presses a button - otherwise the illusion of playing with the clips is lost - the functionality is somewhat similar to the app called…
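One common way to get near-instant switching is to keep a separate, pre-rolled AVPlayer per clip, so a button press only pauses one player and starts another. A sketch under that assumption (`clipURLs` and the layer-swapping step are illustrative):

```objc
#import <AVFoundation/AVFoundation.h>

// Pre-create and pre-roll one player per 2-second clip.
NSMutableArray<AVPlayer *> *players = [NSMutableArray array];
for (NSURL *url in clipURLs) {
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:url];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
    [player prerollAtRate:1.0 completionHandler:nil]; // decode ahead of time
    [players addObject:player];
}
// On button i: pause the active player, point the on-screen
// AVPlayerLayer at players[i], seek to zero, and call -play.
```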

iOS 5: Error merging 3 videos with AVAssetExportSession

Submitted by 依然范特西╮ on 2019-12-03 13:33:18
I'm trying to merge (append) 3 videos using AVAssetExportSession, but I keep getting this error. Weirdly, for 1 or 2 videos it worked.

Error Domain=AVFoundationErrorDomain Code=-11820 "Cannot Complete Export" UserInfo=0x458120 {NSLocalizedRecoverySuggestion=Try exporting again., NSLocalizedDescription=Cannot Complete Export}

I even tried to redo the function in case of error, but all I got was an endless stream of error messages. This is a snippet of my code:

AVMutableComposition *mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionTrack = [mixComposition…
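A frequent cause of export failures when appending several clips is overlapping insertions, i.e. not advancing the insertion time as each clip is added. A hedged sketch of sequential appending with a running cursor (`assets` is assumed to hold the three source AVAssets):

```objc
#import <AVFoundation/AVFoundation.h>

AVMutableComposition *mix = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTrack =
    [mix addMutableTrackWithMediaType:AVMediaTypeVideo
                     preferredTrackID:kCMPersistentTrackID_Invalid];

CMTime cursor = kCMTimeZero; // where the next clip starts
for (AVAsset *asset in assets) {
    AVAssetTrack *src =
        [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    NSError *error = nil;
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                        ofTrack:src
                         atTime:cursor
                          error:&error];
    cursor = CMTimeAdd(cursor, asset.duration); // never overlap ranges
}
```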

Get PTS from raw H264 mdat generated by iOS AVAssetWriter

Submitted by 一世执手 on 2019-12-03 13:24:56
Question: I'm trying to simultaneously read and write an H.264 mov file written by AVAssetWriter. I managed to extract individual NAL units, pack them into ffmpeg's AVPackets, and write them into another video format using ffmpeg. It works, and the resulting file plays well, except that the playback speed is wrong. How do I calculate the correct PTS/DTS values from raw H.264 data? Or maybe there exists some other way to get them? Here's what I've tried: limit the capture min/max frame rate to 30 and assume that…
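Rather than deriving timestamps from the raw NAL stream under a fixed frame-rate assumption, the timestamps can be read from the mov itself with AVAssetReader, since each CMSampleBuffer carries the real PTS/DTS. A sketch under the assumption that `readerOutput` is an AVAssetReaderTrackOutput already attached to a started AVAssetReader:

```objc
#import <AVFoundation/AVFoundation.h>

CMSampleBufferRef sample = [readerOutput copyNextSampleBuffer];
if (sample) {
    // The container's real timestamps, no frame-rate guessing needed.
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sample);
    CMTime dts = CMSampleBufferGetDecodeTimeStamp(sample);
    // On the ffmpeg side these would be rescaled, e.g.:
    // pkt.pts = av_rescale_q(pts.value,
    //     (AVRational){1, (int)pts.timescale}, stream->time_base);
    CFRelease(sample);
}
```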

How to set timestamp of CMSampleBuffer for AVWriter writing

Submitted by 一笑奈何 on 2019-12-03 13:04:33
Question: I'm working with AVFoundation to capture and record audio. There are some issues I don't quite understand. Basically, I want to capture audio from AVCaptureSession and write it using AVAssetWriter, but I need to shift the timestamp of the CMSampleBuffer I get from AVCaptureSession. Reading the CMSampleBuffer documentation, I see two different timestamp terms: 'presentation timestamp' and 'output presentation timestamp'. What is the difference between the two? Let's say I get a CMSampleBuffer…
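The usual way to shift a buffer's timestamps before handing it to AVAssetWriter is to copy it with new timing via `CMSampleBufferCreateCopyWithNewTiming`. A sketch of such a helper (the fixed `offset` parameter is an assumption about what "shifting" means here):

```objc
#import <CoreMedia/CoreMedia.h>

// Returns a copy of `buf` with every presentation timestamp moved by
// `offset`. The caller owns (and must release) the returned buffer.
CMSampleBufferRef ShiftedBuffer(CMSampleBufferRef buf, CMTime offset) {
    CMItemCount count = 0;
    CMSampleBufferGetSampleTimingInfoArray(buf, 0, NULL, &count);
    CMSampleTimingInfo *timing = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(buf, count, timing, &count);
    for (CMItemCount i = 0; i < count; i++) {
        timing[i].presentationTimeStamp =
            CMTimeAdd(timing[i].presentationTimeStamp, offset);
    }
    CMSampleBufferRef shifted = NULL;
    CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, buf,
                                          count, timing, &shifted);
    free(timing);
    return shifted;
}
```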

Photos framework: Connection to assetsd was interrupted or assetsd died

Submitted by 心不动则不痛 on 2019-12-03 12:55:28
I am getting this error when I try to play multiple videos using this Swift library ( https://github.com/piemonte/player ). I'm not sure whether it's related to that player, to the Photos framework, or to something else. I have a view that displays either a photo or a video. Everything works fine a few times, but after a few videos have played this message pops up; after that, none of the videos can play - in their place I just see a black screen - and then I get a memory usage error. I am using a library called SwipeView, and here is some relevant code which…
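The black screens plus the climbing memory use suggest player instances are accumulating as pages are recycled. A hedged sketch of a cleanup hook for a recycling container like SwipeView (the `PlayerView` subclass and the hook's name are illustrative, not SwipeView API):

```objc
#import <AVFoundation/AVFoundation.h>

// Call this whenever a page is about to be reused for new content,
// so the old AVPlayerItem is released before a new player is created.
- (void)tearDownPlayerOnPage:(PlayerView *)page {
    [page.player pause];
    [page.player replaceCurrentItemWithPlayerItem:nil]; // drop the asset
    page.player = nil;
}
```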

Composing video and audio using AVMutableComposition

Submitted by 倖福魔咒の on 2019-12-03 12:53:33
I have a weird problem. In my app I am combining multiple audio and video files using the code below. The resulting video seems to work fine once I download it from the device to the computer and play it with QuickTime, but whenever I try to play the newly composed video using either UIWebView or AVPlayer, I can only see the first part of the merged video files. Furthermore, when I tried to use MPMoviePlayerController to play it, it hangs on "Loading". I can hear the audio for the whole composition. To make it clear, I have two arrays: 1. audioPieces with paths to audio files [song1, song2, song3]; 2. moviePieces…
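When only the first segment plays back, it helps to separate the composition from the exported file: the composition itself can be handed straight to AVPlayer, so if playback already stops after the first piece here, the time-range insertions are at fault rather than the export. A small diagnostic sketch, assuming `mixComposition` is the AVMutableComposition from the code below:

```objc
#import <AVFoundation/AVFoundation.h>

// Play the composition directly and check its total duration first.
NSLog(@"composition duration: %.2fs",
      CMTimeGetSeconds(mixComposition.duration));
AVPlayerItem *item =
    [AVPlayerItem playerItemWithAsset:[mixComposition copy]];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];
// If the duration is shorter than the sum of the pieces, the inserts
// overlap; each insert's atTime: must advance by the previous durations.
```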

AVPlayer currentTime update for a UISlider when ViewController load

Submitted by 。_饼干妹妹 on 2019-12-03 12:50:41
I'm playing songs in AVPlayer. I have created a separate view controller for my media player and its initialization, and all the methods I use for the player (play, pause, repeat, shuffle) are in that same view controller. I update a slider like this:

[NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(sliderUpdate:) userInfo:nil repeats:YES];

- (void)sliderUpdate:(id)sender {
    int currentTime = (int)((song.player.currentTime.value) / song.player.currentTime.timescale);
    slider.value = currentTime;
    NSLog(@"%i", currentTime);
    song.currentTime = currentTime;
    int…
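A periodic time observer is the usual replacement for an NSTimer here: it fires on the player's own clock, so the slider cannot drift from the audio, and `CMTimeGetSeconds` avoids the manual value/timescale division. A sketch under the same `song.player`/`slider` names used above (the `timeObserver` property is an assumption):

```objc
#import <AVFoundation/AVFoundation.h>

__weak typeof(self) weakSelf = self;
self.timeObserver = [song.player
    addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1.0, NSEC_PER_SEC)
                                 queue:dispatch_get_main_queue()
                            usingBlock:^(CMTime time) {
        weakSelf.slider.value = (float)CMTimeGetSeconds(time);
    }];
// Balance with [song.player removeTimeObserver:self.timeObserver]
// when the view controller goes away.
```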

How to get video frame of the AVPlayer?

Submitted by 落花浮王杯 on 2019-12-03 12:35:32
I have a PlayerView class for displaying an AVPlayer's playback. The code is from the documentation.

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface PlayerView : UIView
@property (nonatomic) AVPlayer *player;
@end

@implementation PlayerView
+ (Class)layerClass {
    return [AVPlayerLayer class];
}
- (AVPlayer *)player {
    return [(AVPlayerLayer *)[self layer] player];
}
- (void)setPlayer:(AVPlayer *)player {
    [(AVPlayerLayer *)[self layer] setPlayer:player];
}
@end

I set up my AVPlayer (containing a video asset with size 320x240) in this PlayerView (with frame.size.width = 100, frame.size.height…
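To get at the actual video frames during playback (rather than the on-screen layer), AVPlayerItemVideoOutput can be attached to the item and polled for pixel buffers. A sketch, assuming `player` is the AVPlayer shown above and polling happens from something like a CADisplayLink callback:

```objc
#import <AVFoundation/AVFoundation.h>

AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc]
    initWithPixelBufferAttributes:@{
        (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
    }];
[player.currentItem addOutput:output];

// Later, e.g. once per display refresh:
CMTime t = player.currentItem.currentTime;
if ([output hasNewPixelBufferForItemTime:t]) {
    CVPixelBufferRef frame =
        [output copyPixelBufferForItemTime:t itemTimeForDisplay:NULL];
    // ... use the frame, then release it:
    CVBufferRelease(frame);
}
```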

AVAssetExportSession fails every time (error -12780)

Submitted by 孤人 on 2019-12-03 12:22:36
I'm trying to merge some audio files (picked via MPMediaPickerController), but the export always fails with error code -12780. When I try to play my composition with an AVPlayer object, it plays correctly; only the export fails. What am I doing wrong? This is my code:

AVAssetExportSession *exportSession;
AVPlayer *player;

- (void)mergeAudiofiles {
    // self.mediaItems is an NSArray of MPMediaItems
    if (self.mediaItems.count == 0) {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error"
                                                        message:@"No Tracks selected."
                                                       delegate:nil
                                              cancelButtonTitle:@"OK"
                                              otherButtonTitles:nil];
        [alert…
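Since the composition plays fine, the -12780 failure often comes from the export destination rather than the composition: a file already existing at `outputURL`, or an `outputFileType` that doesn't match the path's extension. A hedged sketch of a consistent export setup for an audio-only composition (`mixComposition` and the temp-file path are assumptions):

```objc
#import <AVFoundation/AVFoundation.h>

NSURL *outputURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"merged.m4a"]];
// AVAssetExportSession refuses to overwrite; clear the slot first.
[[NSFileManager defaultManager] removeItemAtURL:outputURL error:NULL];

AVAssetExportSession *session = [[AVAssetExportSession alloc]
    initWithAsset:mixComposition presetName:AVAssetExportPresetAppleM4A];
session.outputURL = outputURL;                // extension matches the type
session.outputFileType = AVFileTypeAppleM4A;  // audio-only container
[session exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"status: %ld error: %@", (long)session.status, session.error);
}];
```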