AVFoundation

Inserting an HTTP stream into a AVMutableComposition

不问归期 submitted on 2019-11-29 06:49:24
Question: I am trying to insert the AVURLAsset of an AVPlayerItem that reports AVPlayerItemStatusReadyToPlay into an AVMutableComposition, like this: composition_ = [[AVMutableComposition alloc] init]; insertionPoint_ = kCMTimeZero; item_ = [[AVPlayerItem playerItemWithURL:[NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"]] retain]; [item_ addObserver:self forKeyPath:@"status" options:0 context:nil]; player_ = [[AVPlayer playerWithPlayerItem:item_] retain]; [player_
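
The step the excerpt cuts off at is typically a KVO handler that does the insertion once the item is ready. A sketch of that handler, assuming the instance variables named in the excerpt; note that insertTimeRange:ofAsset:atTime:error: is expected to fail for HTTP Live Streaming assets, which expose no local tracks to copy:

```objectivec
// Once the item reports ready-to-play, try to splice its asset into
// the composition at the current insertion point.
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"status"] &&
        item_.status == AVPlayerItemStatusReadyToPlay) {
        AVAsset *asset = item_.asset;
        CMTimeRange range = CMTimeRangeMake(kCMTimeZero, asset.duration);
        NSError *error = nil;
        if ([composition_ insertTimeRange:range
                                  ofAsset:asset
                                   atTime:insertionPoint_
                                    error:&error]) {
            insertionPoint_ = CMTimeAdd(insertionPoint_, asset.duration);
        } else {
            // For an .m3u8 stream this branch is the likely outcome.
            NSLog(@"Insertion failed: %@", error);
        }
    }
}
```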

How to detect fullscreen mode of AVPlayerViewController

喜你入骨 submitted on 2019-11-29 06:29:33
How can I detect when the user presses the expand icon of the AVPlayerViewController? I want to know when the movie being played enters fullscreen mode. It is also possible to observe the bounds of playerViewController.contentOverlayView and compare them to [UIScreen mainScreen].bounds , e.g.: self.playerViewController = [AVPlayerViewController new]; // do this after adding player VC as a child VC or in completion block of -presentViewController:animated:completion: [self.playerViewController.contentOverlayView addObserver:self forKeyPath:@"bounds" options:NSKeyValueObservingOptionNew |
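
The KVO registration above needs a matching observer method. A sketch of one, comparing the overlay's new bounds against the screen to infer fullscreen (this is a heuristic, not a documented API):

```objectivec
// Fired whenever contentOverlayView's bounds change; equality with the
// screen bounds is taken as "fullscreen".
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if (object == self.playerViewController.contentOverlayView &&
        [keyPath isEqualToString:@"bounds"]) {
        CGRect newBounds = [change[NSKeyValueChangeNewKey] CGRectValue];
        BOOL isFullscreen =
            CGRectEqualToRect(newBounds, [UIScreen mainScreen].bounds);
        NSLog(@"Player is %@fullscreen", isFullscreen ? @"" : @"not ");
    }
}
```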

Cropping image captured by AVCaptureSession

半世苍凉 submitted on 2019-11-29 06:24:32
I'm writing an iPhone app which uses AVFoundation to take a photo and crop it. The app is similar to a QR code reader: it uses an AVCaptureVideoPreviewLayer with an overlay. The overlay has a square. I want to crop the image so the cropped image is exactly what the user has placed inside the square. The preview layer has gravity AVLayerVideoGravityResizeAspectFill. It looks like what the camera actually captures is not exactly what the user sees in the preview layer. This means that I need to map from the preview coordinate system to the captured image's coordinate system so I can crop the image
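
One way to do that mapping (a sketch, assuming previewLayer is the AVCaptureVideoPreviewLayer, overlaySquare is the square's rect in the layer's coordinates, and image is the captured UIImage) is to let the preview layer convert the on-screen rect to a unit rect and then scale by the image size; portrait captures may still need the rect rotated to match the image orientation:

```objectivec
// Convert the overlay square to a 0..1 rect in capture coordinates,
// then scale it up to pixel coordinates in the captured image.
CGRect unitRect =
    [previewLayer metadataOutputRectOfInterestForRect:overlaySquare];
CGRect cropRect = CGRectMake(unitRect.origin.x * image.size.width,
                             unitRect.origin.y * image.size.height,
                             unitRect.size.width * image.size.width,
                             unitRect.size.height * image.size.height);
CGImageRef croppedRef =
    CGImageCreateWithImageInRect(image.CGImage, cropRect);
UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                       scale:image.scale
                                 orientation:image.imageOrientation];
CGImageRelease(croppedRef);
```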

App crashes when playing audio on iOS13.1

六眼飞鱼酱① submitted on 2019-11-29 06:20:44
Question: I am building an app that plays sound files from the main bundle via a URL. When I tested this on iOS 13, everything was fine. But with the new 13.1 update I am getting an error on the line backgroundMusicPlayer = try AVAudioPlayer(contentsOf: URL(fileURLWithPath: sound!)) that says: Thread 1: EXC_BAD_ACCESS (code=1, address=0x48 Here is the code that I am using in a custom class that runs background music when the app launches: import Foundation import AVFoundation var
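
The force-unwrap of sound! on the failing line is a plausible culprit if the bundle lookup returns nil. A defensive version of that line, sketched in Objective-C with a hypothetical resource name, so a missing file surfaces as a log message rather than a bad access:

```objectivec
// Verify the bundle path exists before constructing the player.
// "background"/"mp3" are hypothetical; substitute the real resource.
NSString *path = [[NSBundle mainBundle] pathForResource:@"background"
                                                 ofType:@"mp3"];
if (path != nil) {
    NSError *error = nil;
    AVAudioPlayer *player =
        [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path]
                                               error:&error];
    if (player == nil) {
        NSLog(@"Player creation failed: %@", error);
    }
} else {
    NSLog(@"Sound file missing from bundle");
}
```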

iOS AVAudioSession interruption notification not working as expected

这一生的挚爱 submitted on 2019-11-29 05:35:34
I want to know when my AVAudioRecorder becomes inaccessible (e.g. when music starts playing). As audioRecorderEndInterruption will be deprecated in iOS 9, I am focusing on AVAudioSession's interruption notification (but neither works as expected). The issue is that the interruption notification is never delivered if the app was and remains in the foreground when the interruption occurs, e.g. the user starts and stops playing music without moving the application into the background. To detect any interruptions I am using: [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector
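
For reference, the usual shape of that registration and its handler (a sketch; the selector name is an assumption):

```objectivec
// Register for interruption notifications on the shared session.
[[NSNotificationCenter defaultCenter]
    addObserver:self
       selector:@selector(handleInterruption:)
           name:AVAudioSessionInterruptionNotification
         object:[AVAudioSession sharedInstance]];

// Handler: the userInfo dictionary says whether the interruption is
// beginning or ending.
- (void)handleInterruption:(NSNotification *)note
{
    NSUInteger type =
        [note.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeBegan) {
        // The recorder has lost the audio hardware; stop recording.
    } else if (type == AVAudioSessionInterruptionTypeEnded) {
        // Optionally resume, checking
        // AVAudioSessionInterruptionOptionShouldResume in the userInfo.
    }
}
```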

How can I capture an image from iOS camera without user interaction?

岁酱吖の submitted on 2019-11-29 05:16:58
I needed to capture a still image from the front-facing camera and store it in the Documents directory. I found bits and pieces of code in other posts, but wanted to share this in case others have a similar need. Define a UIImage property and make sure your class implements the AVCaptureVideoDataOutputSampleBufferDelegate protocol: @interface ViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate> @property (nonatomic, strong) UIImage *theImage; @end In viewDidLoad or somewhere appropriate, add this: [self captureImage]; Implement the following methods: - (void
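
A sketch of what the -captureImage method referenced above typically contains (the queue name is an assumption): wire the front camera into a session whose video data output calls back into this class.

```objectivec
- (void)captureImage
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Find the front-facing camera.
    AVCaptureDevice *front = nil;
    for (AVCaptureDevice *device in
         [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == AVCaptureDevicePositionFront) {
            front = device;
        }
    }

    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:front error:&error];
    if (!input) { NSLog(@"%@", error); return; }
    [session addInput:input];

    // Frames are delivered to the delegate methods of this class.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("capture", NULL)];
    [session addOutput:output];
    [session startRunning];
    // -captureOutput:didOutputSampleBuffer:fromConnection: then converts
    // one CMSampleBufferRef into self.theImage and stops the session.
}
```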

AVAssetWriter How to create mov video with no compression?

痞子三分冷 submitted on 2019-11-29 05:12:55
I'm creating a video from an array of images. The purpose of my work is to create a .mov video with no compression. I have seen in the developer library that there is a key, AVVideoCompressionPropertiesKey, but I don't know how to specify no compression with this key. Could you help me please? Here is my sample code: NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:320], AVVideoCleanApertureWidthKey, [NSNumber numberWithInt:480], AVVideoCleanApertureHeightKey, [NSNumber numberWithInt:10], AVVideoCleanApertureHorizontalOffsetKey,
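
Worth noting: AVVideoCompressionPropertiesKey tunes an encoder rather than turning it off. One commonly suggested alternative (an assumption, not from the question) is to pass nil output settings to the writer input, which tells AVAssetWriter to write the appended samples through unmodified:

```objectivec
// nil outputSettings = passthrough: the writer does not re-encode,
// so the appended sample buffers land in the .mov as-is.
AVAssetWriterInput *writerInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:nil];
```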

Captured photo is stretched with AVCaptureSession sessionPreset = AVCaptureSessionPresetPhoto

大憨熊 submitted on 2019-11-29 04:35:45
IMPORTANT: if I use session.sessionPreset = AVCaptureSessionPresetHigh; my preview image is not stretched! If I save the photo to the device with UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil); the image is normal; it is stretched only in the preview. I'm using AVFoundation to capture the photo: session = [[AVCaptureSession alloc] init]; session.sessionPreset = AVCaptureSessionPresetHigh; CALayer *viewLayer = vImagePreview.layer; NSLog(@"viewLayer = %@", viewLayer); AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
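
Since the saved photo is fine and only the preview is distorted, a common fix (an assumption based on the symptoms described) is the preview layer's gravity: the photo preset's 4:3 frames get stretched to fill a mismatched layer unless the gravity preserves aspect.

```objectivec
// Letterbox the 4:3 photo-preset frames instead of stretching them
// to fill the preview view.
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
captureVideoPreviewLayer.frame = vImagePreview.bounds;
[vImagePreview.layer addSublayer:captureVideoPreviewLayer];
```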

How to watermark your video with different images and different CMTimes using AVFoundation

谁都会走 submitted on 2019-11-29 04:10:51
Question: I'm using AVFoundation to put a watermark in my movies. This works well with the code that has been circulating on the internet and from Apple. But I don't want to show the watermark the whole time, and I want to show different watermarks in the same movie. I have an AVAsset: NSString *path = [[NSBundle mainBundle] pathForResource:@"test" ofType:@"MOV"]; NSURL *url = [[NSURL alloc] initFileURLWithPath: path]; avasset_camera = [AVAsset assetWithURL:url]; An AVMutableComposition: AVMutableComposition
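
One way to time-limit a watermark in the standard AVVideoCompositionCoreAnimationTool recipe (a sketch with hypothetical times and image name): give each watermark layer an opacity animation that keeps it visible only during its own window, and add one such layer per watermark to the parent layer handed to the animation tool.

```objectivec
// A watermark layer that is invisible by default...
CALayer *watermark = [CALayer layer];
watermark.contents = (id)[UIImage imageNamed:@"logo1"].CGImage; // hypothetical
watermark.frame = CGRectMake(10, 10, 100, 40);
watermark.opacity = 0.0;

// ...and an opacity animation that holds it at 1.0 from 2 s to 5 s of
// movie time. Use AVCoreAnimationBeginTimeAtZero for a 0-second start.
CABasicAnimation *show = [CABasicAnimation animationWithKeyPath:@"opacity"];
show.fromValue = @1.0;
show.toValue = @1.0;
show.beginTime = 2.0;
show.duration = 3.0;
show.removedOnCompletion = NO;
[watermark addAnimation:show forKey:@"visible"];
```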

How do I control AVAssetWriter to write at the correct FPS

此生再无相见时 submitted on 2019-11-29 04:08:54
Let me see if I understood this correctly. On the most advanced current hardware, iOS allows me to record at the following fps: 30, 60, 120 and 240. But these fps behave differently. If I shoot at 30 or 60 fps, I expect the video files created from shooting at these fps to play at 30 and 60 fps respectively. But if I shoot at 120 or 240 fps, I expect the video files created from shooting at these fps to play at 30 fps, or I will not see the slow motion. A few questions: am I right? Is there a way to shoot at 120 or 240 fps and play at 120 and 240 fps respectively? I mean, play at the fps the
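
With AVAssetWriter, the played-back rate comes from the presentation timestamps written into the file rather than from a separate fps switch. A sketch (adaptor, pixelBuffers and frameCount are hypothetical names): appending frames whose timestamps are 1/120 s apart yields a file that plays back in real time, while retiming the same frames to 1/30 s apart yields 4x slow motion.

```objectivec
// Choose how far apart consecutive frames sit on the timeline.
CMTime frameDuration = CMTimeMake(1, 120);   // real-time playback
// CMTime frameDuration = CMTimeMake(1, 30); // 4x slow motion instead

CMTime presentationTime = kCMTimeZero;
for (NSUInteger i = 0; i < frameCount; i++) {
    // adaptor is an AVAssetWriterInputPixelBufferAdaptor attached to
    // the writer input; pixelBuffers holds the captured frames.
    [adaptor appendPixelBuffer:pixelBuffers[i]
          withPresentationTime:presentationTime];
    presentationTime = CMTimeAdd(presentationTime, frameDuration);
}
```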