avfoundation

iOS 11 AVPlayer crash when using KVO

Submitted by 元气小坏坏 on 2019-12-03 12:12:30
I got a weird crash when using AVPlayer to play a remote video. According to the crash log on Fabric, the app crashes on a system thread (com.apple.avfoundation.playerlayer.configuration). The crash log is below:

Crashed: com.apple.avfoundation.playerlayer.configuration
0 libsystem_kernel.dylib 0x1839ac2e8 __pthread_kill + 8
1 libsystem_pthread.dylib 0x183ac12f8 pthread_kill$VARIANT$mp + 396
2 libsystem_c.dylib 0x18391afbc abort + 140
3 libsystem_malloc.dylib 0x1839e3ce4 szone_size + 634
4 QuartzCore 0x187ed75e8 -[CALayer dealloc] + 72
5 QuartzCore 0x187e75d90 CA::Transaction::commit() + 1052
6
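Crashes in this thread often trace back to unbalanced KVO registration, or to a player being torn down while observers are still attached. As a minimal sketch (not taken from the question), the block-based KVO API sidesteps the manual addObserver/removeObserver bookkeeping; the PlayerController class name and the URL are hypothetical:

```swift
import AVFoundation

/// Minimal sketch of balanced KVO on an AVPlayerItem; class name is illustrative.
final class PlayerController: NSObject {
    private let player: AVPlayer
    private var statusObservation: NSKeyValueObservation?

    init(url: URL) {
        player = AVPlayer(url: url)
        super.init()
        // Block-based KVO removes the observer when the token is deallocated,
        // which avoids the classic unbalanced-removal / over-release crashes.
        statusObservation = player.currentItem?.observe(\.status, options: [.new]) { item, _ in
            switch item.status {
            case .readyToPlay: print("ready to play")
            case .failed:      print("failed: \(String(describing: item.error))")
            default:           break
            }
        }
    }

    func play() { player.play() }

    deinit {
        // Invalidate before the player goes away; with the token-based API this is
        // optional, but it makes the teardown order explicit.
        statusObservation?.invalidate()
    }
}
```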

AV Foundation camera preview layer gets zoomed in, how to zoom out?

Submitted by 一个人想着一个人 on 2019-12-03 12:05:13
The application I am currently working on has, as its main functionality, continuous scanning of QR/bar codes using the ZXing library (http://code.google.com/p/zxing/). For continuous frame capture I initialize AVCaptureSession, AVCaptureVideoOutput, and AVCaptureVideoPreviewLayer as described in Apple's Q&A http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html. My problem is that when I run the camera preview, the image I see through the video device is much larger (1.5x) than the image seen through the iPhone's still camera. Our customer needs to hold the iPhone
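The apparent 1.5x zoom usually comes from the combination of session preset and preview-layer gravity: video presets use a narrower 16:9 sensor crop than the 4:3 still camera, and an aspect-fill gravity crops further to fill the view. A hedged Swift sketch of the relevant knobs (the function name and view are placeholders, not the asker's code):

```swift
import AVFoundation
import UIKit

/// Sketch of a preview setup showing the preset and gravity that affect apparent zoom.
func configurePreview(on view: UIView) -> AVCaptureSession? {
    let session = AVCaptureSession()
    // A photo-style preset keeps the full 4:3 sensor field of view;
    // .high / .hd1280x720 use a narrower 16:9 crop that can look "zoomed in".
    session.sessionPreset = .photo

    guard let device = AVCaptureDevice.default(for: .video),
          let input = try? AVCaptureDeviceInput(device: device),
          session.canAddInput(input) else { return nil }
    session.addInput(input)

    let previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.frame = view.bounds
    // .resizeAspect letterboxes instead of cropping, so nothing extra is cut off.
    previewLayer.videoGravity = .resizeAspect
    view.layer.addSublayer(previewLayer)

    session.startRunning()
    return session
}
```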

AVCaptureStillImageOutput vs AVCapturePhotoOutput in Swift 3

Submitted by ╄→尐↘猪︶ㄣ on 2019-12-03 11:40:02
I am trying to simply put a camera view in my view controller. I imported AVFoundation at the top, as well as the UIImagePickerControllerDelegate and UINavigationControllerDelegate protocols. However, whenever I try to use AVCaptureStillImageOutput, Xcode tells me that it was deprecated in iOS 10 and that I should use AVCapturePhotoOutput. That is completely fine; however, as soon as I want to call stillImageOutput.outputSettings, .outputSettings itself is not available. Thus, I have to use
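With AVCapturePhotoOutput the per-shot configuration moves out of outputSettings and into an AVCapturePhotoSettings object passed to capturePhoto(with:delegate:). A minimal sketch of that flow, assuming iOS 11+ for the delegate callback shown (the class name is illustrative):

```swift
import AVFoundation

/// Sketch of the AVCapturePhotoOutput flow that replaces stillImageOutput.outputSettings.
final class PhotoCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() {
        session.beginConfiguration()
        if let device = AVCaptureDevice.default(for: .video),
           let input = try? AVCaptureDeviceInput(device: device),
           session.canAddInput(input) {
            session.addInput(input)
        }
        if session.canAddOutput(photoOutput) {
            session.addOutput(photoOutput)
        }
        session.commitConfiguration()
        session.startRunning()
    }

    func takePhoto() {
        // Settings are created per capture instead of set once via outputSettings.
        let settings = AVCapturePhotoSettings()
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // iOS 11+ callback; earlier systems use the sample-buffer-based variant instead.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let data = photo.fileDataRepresentation() else { return }
        print("captured \(data.count) bytes")
    }
}
```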

AVSpeechSynthesizer error AudioSession

Submitted by …衆ロ難τιáo~ on 2019-12-03 11:14:40
I'm playing around with AVSpeechSynthesizer and always get these errors:

ERROR: >aqsrv> 65: Exception caught in (null) - error -66634
ERROR: AVAudioSessionUtilities.h:88: GetProperty_DefaultToZero: AudioSessionGetProperty ('disa') failed with error: '?ytp'

My code is:

AVSpeechSynthesizer *synthesizer = [[AVSpeechSynthesizer alloc] init];
[synthesizer setDelegate:self];
speechSpeed = AVSpeechUtteranceMinimumSpeechRate;
AVSpeechUtterance *synUtt = [[AVSpeechUtterance alloc] initWithString:[
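Those aqsrv/AudioSession messages are often harmless, but configuring and activating an AVAudioSession for playback before speaking is the usual first step. A Swift sketch equivalent to the Objective-C snippet above (the Speaker class and the voice/language choice are illustrative):

```swift
import AVFoundation

/// Sketch of speech synthesis with the audio session activated up front.
final class Speaker: NSObject, AVSpeechSynthesizerDelegate {
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        // Activating a playback session before speaking avoids some AudioSession warnings.
        try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .spokenAudio, options: [])
        try? AVAudioSession.sharedInstance().setActive(true)

        synthesizer.delegate = self
        let utterance = AVSpeechUtterance(string: text)
        utterance.rate = AVSpeechUtteranceMinimumSpeechRate   // same rate the question uses
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        synthesizer.speak(utterance)
    }
}
```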

iOS alternative to QTMovieLayer that has non-nil `contents`?

Submitted by 别等时光非礼了梦想. on 2019-12-03 10:48:47
Background: QTKit (QuickTime Kit) is a Mac framework from the 10.3 days that got some layer additions in 10.5, for example QTMovieLayer. One of the nice things about QTMovieLayer is that you can access the movie content using the regular contents property on the layer and get a CAImageQueue object back. The nice thing about this is that you can create a bunch of regular CALayers, set the image queue as their contents, and give each layer its own part of the movie by setting the
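iOS has no CAImageQueue-backed movie layer, but AVPlayerItemVideoOutput can play a similar role: it hands back CVPixelBuffers that you convert and assign to ordinary CALayer contents yourself. A rough sketch under that assumption (the class name, pixel format, and the omitted display-link plumbing are illustrative):

```swift
import AVFoundation
import QuartzCore
import VideoToolbox

/// Sketch: pull frames from an AVPlayerItemVideoOutput and push them into plain CALayers.
final class FramePusher {
    private let output = AVPlayerItemVideoOutput(pixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ])
    private let player: AVPlayer
    let targetLayers: [CALayer]

    init(url: URL, targetLayers: [CALayer]) {
        let item = AVPlayerItem(url: url)
        item.add(output)
        player = AVPlayer(playerItem: item)
        self.targetLayers = targetLayers
        player.play()
    }

    /// Call from a CADisplayLink callback to keep the layers up to date.
    func pushLatestFrame() {
        let time = output.itemTime(forHostTime: CACurrentMediaTime())
        guard output.hasNewPixelBuffer(forItemTime: time),
              let pixelBuffer = output.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil) else { return }
        var image: CGImage?
        _ = VTCreateCGImageFromCVPixelBuffer(pixelBuffer, options: nil, imageOut: &image)
        guard let cgImage = image else { return }
        // Unlike QTMovieLayer's image queue, the same frame goes to every layer;
        // per-layer cropping would be done with contentsRect on each layer.
        for layer in targetLayers { layer.contents = cgImage }
    }
}
```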

AVCaptureMovieFileOutput - no active/enabled connections

Submitted by 泪湿孤枕 on 2019-12-03 10:36:28
I am trying to record video in my iPhone app using AVFoundation, but whenever I tap the Record button the app crashes with this message: -[AVCaptureMovieFileOutput startRecordingToOutputFileURL:recordingDelegate:] - no active/enabled connections. I know the same question has been asked on SO, but none of its answers helped me. My problem is that the same code works perfectly in another application, and when I use exactly the same code in this app it crashes. Still-photo capture works fine, though. Adding my code here; please help me. Thanks in advance.

-(void)viewDidLoad {
    [super viewDidLoad];
    self
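The message means the movie file output had no active video connection at the moment startRecording was called, typically because inputs/outputs were added in the wrong order or the session preset does not support movie capture. A small defensive sketch (function and parameter names are illustrative) that checks the connection first:

```swift
import AVFoundation

/// Sketch: verify the movie output actually has an active video connection before recording.
func startRecording(with session: AVCaptureSession,
                    movieOutput: AVCaptureMovieFileOutput,
                    to url: URL,
                    delegate: AVCaptureFileOutputRecordingDelegate) {
    // The "no active/enabled connections" crash fires when there is no usable
    // video connection, so check before calling startRecording.
    guard let connection = movieOutput.connection(with: .video),
          connection.isActive, connection.isEnabled else {
        print("Movie output has no active video connection; check inputs/outputs and sessionPreset.")
        return
    }
    guard session.isRunning else {
        print("Session must be running before recording starts.")
        return
    }
    movieOutput.startRecording(to: url, recordingDelegate: delegate)
}
```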

m3u8 file AVAssetImageGenerator error

Submitted by 痞子三分冷 on 2019-12-03 10:35:06
I am using AVPlayer to play an .m3u8 file, and AVAssetImageGenerator to extract an image from it with the following code:

AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:mp.contentURL options:nil];
AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generate1.appliesPreferredTrackTransform = YES;
NSError *err = NULL;
CMTime time = CMTimeMake(1, 2);
CGImageRef oneRef = [generate1 copyCGImageAtTime:time actualTime:NULL error:&err];
img = [[UIImage alloc] initWithCGImage:oneRef];

It always gives me this error: Error Domain=AVFoundationErrorDomain Code=-11800 "The
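AVAssetImageGenerator only works on file-based assets; for an HLS (.m3u8) stream it fails with -11800 because there are no local tracks to read frames from. A common workaround is to grab a frame from the playing item via AVPlayerItemVideoOutput, sketched below (the function name is illustrative, and the output must already have been attached to the player's current item):

```swift
import AVFoundation
import QuartzCore
import UIKit
import VideoToolbox

/// Sketch: snapshot the current HLS frame from an AVPlayerItemVideoOutput.
/// Requires `player.currentItem?.add(output)` to have been called beforehand.
func snapshotCurrentFrame(from player: AVPlayer, output: AVPlayerItemVideoOutput) -> UIImage? {
    let itemTime = output.itemTime(forHostTime: CACurrentMediaTime())
    guard output.hasNewPixelBuffer(forItemTime: itemTime),
          let buffer = output.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) else {
        return nil
    }
    var cgImage: CGImage?
    _ = VTCreateCGImageFromCVPixelBuffer(buffer, options: nil, imageOut: &cgImage)
    return cgImage.map { UIImage(cgImage: $0) }
}
```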

Adding text subtitles to video track (in Swift) fails with error code -11841

Submitted by 空扰寡人 on 2019-12-03 10:20:29
I have been struggling with adding text subtitles to videos for a while. I have added some links that I referred to in detail, but they are not helping. In the code below, I am trying to add a subtitle to a video. The output file path is: file:///var/mobile/Applications/03E49B29-1070-4541-B7CB-B1366732C179/Documents/output_movie.mov In addition, the input file was recorded with a call to UIPickerView in the same application, at the temporary path below: file:///private/var/mobile/Applications
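Error -11841 (AVErrorInvalidVideoComposition) usually means the video composition's instructions or render size don't line up with the composition, and the standard route for burned-in text is AVVideoCompositionCoreAnimationTool with a CATextLayer. A sketch under those assumptions (the function name, font size, and layout values are illustrative):

```swift
import AVFoundation
import QuartzCore
import UIKit

/// Sketch: attach a CATextLayer as a burned-in subtitle via the Core Animation tool.
/// `composition` is an already-built AVMutableComposition with one video track.
func makeSubtitledVideoComposition(for composition: AVMutableComposition,
                                   text: String,
                                   renderSize: CGSize) -> AVMutableVideoComposition {
    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = renderSize
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)

    // -11841 is often triggered when the instruction time ranges do not cover the
    // full duration, so build one instruction spanning the whole composition.
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: composition.duration)
    if let videoTrack = composition.tracks(withMediaType: .video).first {
        instruction.layerInstructions = [AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)]
    }
    videoComposition.instructions = [instruction]

    // Text layer overlaid on the video layer inside a shared parent layer.
    let textLayer = CATextLayer()
    textLayer.string = text
    textLayer.fontSize = 32
    textLayer.alignmentMode = .center
    textLayer.frame = CGRect(x: 0, y: 20, width: renderSize.width, height: 50)

    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: renderSize)
    let parentLayer = CALayer()
    parentLayer.frame = videoLayer.frame
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(textLayer)

    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)
    return videoComposition
}
```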

Cross-fade within AVMutableVideoComposition

Submitted by 依然范特西╮ on 2019-12-03 10:17:29
I have successfully composed an AVMutableComposition with multiple video clips and can view it and export it. I would like to be able to transition between them using a cross-fade, so I want to use AVMutableVideoComposition. I can't find any examples of how to even arrange and play a couple of AVAsset videos in succession. Does anyone have an example of how to add tracks to an AVMutableVideoComposition with the equivalent of AVMutableComposition's insertTimeRange, or how to set up a cross
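A cross-fade needs the two clips on separate composition tracks, overlapping in time, with opacity ramps in the layer instructions across the overlap and instructions covering the whole timeline. A sketch of that arrangement (the asset parameters and 1-second overlap are illustrative; audio tracks are omitted):

```swift
import AVFoundation

/// Sketch: overlap two clips on separate tracks and ramp opacity across the overlap.
func makeCrossFade(assetA: AVAsset, assetB: AVAsset,
                   overlap: CMTime = CMTime(value: 1, timescale: 1)) throws -> (AVMutableComposition, AVMutableVideoComposition) {
    let composition = AVMutableComposition()
    let trackA = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)!
    let trackB = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)!
    let sourceA = assetA.tracks(withMediaType: .video)[0]
    let sourceB = assetB.tracks(withMediaType: .video)[0]

    // Clip B starts `overlap` before clip A ends, so the two tracks overlap in time.
    try trackA.insertTimeRange(CMTimeRange(start: .zero, duration: assetA.duration), of: sourceA, at: .zero)
    let bStart = CMTimeSubtract(assetA.duration, overlap)
    try trackB.insertTimeRange(CMTimeRange(start: .zero, duration: assetB.duration), of: sourceB, at: bStart)

    // Pass-through for A before the fade.
    let passA = AVMutableVideoCompositionInstruction()
    passA.timeRange = CMTimeRange(start: .zero, duration: bStart)
    passA.layerInstructions = [AVMutableVideoCompositionLayerInstruction(assetTrack: trackA)]

    // Overlap region where A fades out over B.
    let fade = AVMutableVideoCompositionInstruction()
    fade.timeRange = CMTimeRange(start: bStart, duration: overlap)
    let fadeOutA = AVMutableVideoCompositionLayerInstruction(assetTrack: trackA)
    fadeOutA.setOpacityRamp(fromStartOpacity: 1.0, toEndOpacity: 0.0, timeRange: fade.timeRange)
    let showB = AVMutableVideoCompositionLayerInstruction(assetTrack: trackB)
    fade.layerInstructions = [fadeOutA, showB]

    // Pass-through for B after the fade; together the three instructions cover the full timeline.
    let passB = AVMutableVideoCompositionInstruction()
    passB.timeRange = CMTimeRange(start: CMTimeAdd(bStart, overlap), duration: CMTimeSubtract(assetB.duration, overlap))
    passB.layerInstructions = [AVMutableVideoCompositionLayerInstruction(assetTrack: trackB)]

    let videoComposition = AVMutableVideoComposition()
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
    videoComposition.renderSize = sourceA.naturalSize
    videoComposition.instructions = [passA, fade, passB]
    return (composition, videoComposition)
}
```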

How to crop video into square iOS with AVAssetWriter

Submitted by 可紊 on 2019-12-03 10:17:14
I'm using AVAssetWriter to record a video and I want to be able to crop the video into a square with an offset from the top. Here is my code:

NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    @320, AVVideoCleanApertureWidthKey,
    @320, AVVideoCleanApertureHeightKey,
    @10, AVVideoCleanApertureHorizontalOffsetKey,
    @10, AVVideoCleanApertureVerticalOffsetKey,
    nil];
NSDictionary *videoAspectRatioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    @3,
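The clean-aperture keys only annotate the video; to actually produce a square file, the writer input needs square dimensions plus a scaling mode, or per-buffer cropping for a custom top offset. A sketch of square output settings (the 320-point side and codec choice are illustrative):

```swift
import AVFoundation

/// Sketch: AVAssetWriter video settings that produce a square output.
func makeSquareWriterInput(side: Int = 320) -> AVAssetWriterInput {
    let outputSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: side,
        AVVideoHeightKey: side,
        // Scale to fill the square and crop the excess, instead of letterboxing.
        AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: outputSettings)
    input.expectsMediaDataInRealTime = true
    // Note: AVVideoScalingModeResizeAspectFill center-crops; a crop with a custom offset
    // from the top requires transforming/cropping the pixel buffers before appending.
    return input
}
```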