AVFoundation

Why does the position of @autoreleasepool matter?

回眸只為那壹抹淺笑 submitted on 2019-12-24 06:39:17

Question: I'm having trouble understanding how @autoreleasepool works. Consider the following example, in which I create an AVAssetReader for an audio file. To make the memory impact visible, I repeat this step 1000 times.

    #import <Foundation/Foundation.h>
    #import <AVFoundation/AVFoundation.h>

    void memoryTest() {
        NSURL *url = [[NSURL alloc] initWithString:@"path-to-mp3-file"];
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
        AVAssetReader *reader = [[AVAssetReader alloc] …
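The usual resolution, as a minimal Swift sketch (the path and loop count are stand-ins for the question's setup): a pool placed inside the loop drains autoreleased temporaries every iteration, while a pool outside the loop lets all 1000 iterations' worth of objects accumulate before anything is freed.

    import AVFoundation

    // Hedged sketch: each pass gets its own pool, so autoreleased objects
    // created inside AVFoundation calls are released at the end of every
    // iteration, not when some outer pool (e.g. the run loop's) drains.
    let url = URL(fileURLWithPath: "path-to-mp3-file") // placeholder path
    for _ in 0..<1000 {
        autoreleasepool {
            let asset = AVURLAsset(url: url)
            _ = try? AVAssetReader(asset: asset)
        } // pool drains here
    }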

Get RTSP stream from live555 and decode with AVFoundation

纵然是瞬间 submitted on 2019-12-24 06:37:45

Question: I need to get video frames from an IP camera using RTSP. To receive the RTSP stream I use live555. The problem is that I can't find a way to decode the incoming video frames with AVFoundation (I can't use ffmpeg). Is there a way to use AVFoundation for video decoding? If so, how?

Source: https://stackoverflow.com/questions/17585369/get-rtsp-stream-from-live555-and-decode-with-avfoundation
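AVFoundation has no RTSP client, but hardware decoding of the H.264 frames live555 delivers is still reachable. A hedged Swift sketch of one common approach (the sps/pps parameter sets, parsed out of the stream, are assumptions): build a CMVideoFormatDescription from the parameter sets, then wrap each NAL unit in a CMSampleBuffer carrying that description and enqueue it on an AVSampleBufferDisplayLayer via enqueue(_:), which decodes and displays it.

    import AVFoundation
    import CoreMedia

    // Sketch: create the format description that CMSampleBuffers destined
    // for AVSampleBufferDisplayLayer need. sps/pps come from the stream.
    func makeFormatDescription(sps: [UInt8], pps: [UInt8]) -> CMVideoFormatDescription? {
        var formatDesc: CMVideoFormatDescription?
        sps.withUnsafeBufferPointer { spsBuf in
            pps.withUnsafeBufferPointer { ppsBuf in
                let pointers = [spsBuf.baseAddress!, ppsBuf.baseAddress!]
                let sizes = [sps.count, pps.count]
                _ = CMVideoFormatDescriptionCreateFromH264ParameterSets(
                    allocator: kCFAllocatorDefault,
                    parameterSetCount: 2,
                    parameterSetPointers: pointers,
                    parameterSetSizes: sizes,
                    nalUnitHeaderLength: 4, // 4-byte AVCC length prefixes
                    formatDescriptionOut: &formatDesc)
            }
        }
        return formatDesc
    }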

Why is my video played in the simulator but not on the device (iPad) using an AVFoundation video player?

℡╲_俬逩灬. submitted on 2019-12-24 06:37:15

Question: I built my own custom video player (edit: find example code here) with AVMoviePlayerView.h:

    #import <UIKit/UIKit.h>
    #import <AVFoundation/AVFoundation.h>

    @class AVPlayer;

    @interface AVMoviePlayerView : UIView
    @property (nonatomic) AVPlayer *player;
    - (void)setPlayer:(AVPlayer*)player;
    - (void)setVideoFillMode:(NSString *)fillMode;
    @end

and AVMoviePlayerView.m:

    #import "AVMoviePlayerView.h"
    #import <CoreMedia/CoreMedia.h>

    @implementation AVMoviePlayerView
    + (Class)layerClass {
        return …
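The crux of this pattern is that layerClass must return AVPlayerLayer so the view renders through the player's own layer. A minimal Swift rendering of the same idea (class and property names are mine, not the question's):

    import UIKit
    import AVFoundation

    // Sketch: a UIView backed by AVPlayerLayer instead of a plain CALayer.
    class MoviePlayerView: UIView {
        override class var layerClass: AnyClass { AVPlayerLayer.self }

        private var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }

        var player: AVPlayer? {
            get { playerLayer.player }
            set { playerLayer.player = newValue }
        }

        func setVideoFillMode(_ fillMode: AVLayerVideoGravity) {
            playerLayer.videoGravity = fillMode
        }
    }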

How to disconnect AVPlayer from AVPlayerItem?

白昼怎懂夜的黑 submitted on 2019-12-24 06:24:07

Question: I want to reuse an AVPlayerItem but keep getting this error:

    An AVPlayerItem cannot be associated with more than one instance of AVPlayer

Before trying to reuse it, I destroy the previous AVPlayer like this:

    [self.player pause];
    [self.player replaceCurrentItemWithPlayerItem:nil];
    self.player = nil;

Why is the AVPlayerItem still associated, and how can I disconnect it? Here is a Gist with a full reproduction of the problem (only 50 lines, btw): https://gist.github.com/sbiermanlytle
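In practice the association cannot be undone, so the workaround is not to detach the item but to keep the AVAsset and create a fresh AVPlayerItem per player. A hedged Swift sketch (videoURL is a placeholder):

    import AVFoundation

    // Sketch: the asset (and its loaded media data) is shareable; only the
    // lightweight AVPlayerItem wrapper is recreated for each AVPlayer.
    let videoURL = URL(fileURLWithPath: "/path/to/video.mp4") // placeholder
    let asset = AVURLAsset(url: videoURL)

    let firstPlayer = AVPlayer(playerItem: AVPlayerItem(asset: asset))
    // ...later, instead of reusing firstPlayer's item:
    let secondPlayer = AVPlayer(playerItem: AVPlayerItem(asset: asset))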

iPhone 5: AVCaptureExposureModeAutoFocus not supported in iOS 7

心不动则不痛 submitted on 2019-12-24 06:16:10

Question: I am trying to implement a camera application using AVFoundation. I want to use AVCaptureExposureModeAutoExpose to set the exposurePointOfInterest at a point, and then lock the exposure, as explained in Apple's documentation:

    AVCaptureExposureModeAutoExpose: the device automatically adjusts the exposure once and then changes the exposure mode to AVCaptureExposureModeLocked.

This is the function that I used:

    -(void)autoExposeAtPoint:(CGPoint)point {
        AVCaptureDevice *device = [videoInput …
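For reference, a hedged Swift sketch of the intended flow; the device must support both the point of interest and the one-shot mode, which is exactly what older hardware such as the iPhone 5 may not:

    import AVFoundation

    // Sketch: set the exposure point, trigger a single auto-exposure pass;
    // the device then flips itself to .locked per the documentation.
    func autoExpose(at point: CGPoint, on device: AVCaptureDevice) {
        guard device.isExposurePointOfInterestSupported,
              device.isExposureModeSupported(.autoExpose) else {
            return // unsupported on this device (the question's symptom)
        }
        do {
            try device.lockForConfiguration()
            device.exposurePointOfInterest = point // normalized (0,0)-(1,1)
            device.exposureMode = .autoExpose      // adjusts once, then locks
            device.unlockForConfiguration()
        } catch {
            print("lockForConfiguration failed: \(error)")
        }
    }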

Get iOS camera to output Floating-Point Values in Swift 2

蹲街弑〆低调 submitted on 2019-12-24 05:11:11

Question: I'm trying to implement the example on this site to get float images back from the camera, but the output image I get still has UInt32 pixels (each = 4 x UInt8 color values). I can't figure out why the conversion to Float32 pixel color components doesn't happen. All I've tweaked is updates for Swift 2.0. Example code collected together:

    let output = AVCaptureVideoDataOutput()
    var settings = [kCVPixelBufferPixelFormatTypeKey as NSString: kCVPixelFormatType_32BGRA]
    output.videoSettings = settings …
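Worth noting: requesting kCVPixelFormatType_32BGRA asks the capture pipeline for 8-bit interleaved output, so it never produces floats by itself; the conversion has to happen after capture. A hedged sketch of one way to do that with Accelerate (modern Swift shown rather than Swift 2; buffer-layout caveats are in the comments):

    import Accelerate
    import CoreVideo

    // Sketch: convert a BGRA pixel buffer's 8-bit samples to Float32 with
    // vDSP. The count here includes any row-padding bytes; real code would
    // walk row by row using bytesPerRow if exact pixel counts matter.
    func floatSamples(from pixelBuffer: CVPixelBuffer) -> [Float] {
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        let count = CVPixelBufferGetBytesPerRow(pixelBuffer) *
                    CVPixelBufferGetHeight(pixelBuffer)
        let src = CVPixelBufferGetBaseAddress(pixelBuffer)!
                    .assumingMemoryBound(to: UInt8.self)

        var floats = [Float](repeating: 0, count: count)
        vDSP_vfltu8(src, 1, &floats, 1, vDSP_Length(count)) // UInt8 -> Float
        return floats
    }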

AVAssetReader, how to use with a stream rather than a file?

旧街凉风 submitted on 2019-12-24 04:29:15

Question: AVAssetReader is fantastic, but I can only see how to use it with a local asset, a file, or I guess a composition. So:

    assetReader = try AVAssetReader(asset: self.asset)
    ...
    assetReader.addOutput(readerOutput)

and so on. Say you have an arriving stream (perhaps Apple's examples of .M3U8 files, https://developer.apple.com/streaming/examples/). Can AVAssetReader, in fact, be used for streams, or only local files? I just plain cannot find this explained anywhere. (Maybe it's obvious if you're …
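AVAssetReader is documented as not intended for real-time sources, and it won't open HLS content. A hedged Swift sketch of the usual alternative for pulling decoded frames out of a stream (the URL is a placeholder, not one of Apple's real example streams): attach an AVPlayerItemVideoOutput to a playing AVPlayerItem and copy pixel buffers from it.

    import AVFoundation

    // Sketch: AVPlayerItemVideoOutput vends decoded CVPixelBuffers from a
    // streaming item that AVAssetReader cannot open.
    let url = URL(string: "https://example.com/stream/master.m3u8")! // placeholder
    let item = AVPlayerItem(url: url)
    let output = AVPlayerItemVideoOutput(pixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ])
    item.add(output)

    let player = AVPlayer(playerItem: item)
    player.play()

    // Typically driven by a CADisplayLink; hostTime is the link's timestamp.
    func grabFrame(hostTime: CFTimeInterval) -> CVPixelBuffer? {
        let itemTime = output.itemTime(forHostTime: hostTime)
        guard output.hasNewPixelBuffer(forItemTime: itemTime) else { return nil }
        return output.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil)
    }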

Offline audio render with AudioKit for iOS < 11

。_饼干妹妹 submitted on 2019-12-24 04:03:33

Question: I have 4 AKPlayer nodes; each one is connected to some effects, and finally they are all mixed together. I want to render the output offline for iOS > 9.0, but I can't figure out how.

Edit: I have implemented the render and separated the code paths for iOS 11 and below. On iOS 11, renderToFile seems to work well, but on iOS < 11 the rendered file has some lags, jumps forward at certain seconds, and ends in silence. Here is my render function:

    do {
        if #available(iOS 11, *) {
            let outputFile = try AKAudioFile …
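For the iOS 11+ branch, this is roughly how AudioKit's renderToFile is driven; a hedged sketch (players stands in for the question's 4 AKPlayer nodes, and the exact renderToFile signature varies across AudioKit 4.x releases):

    import AudioKit

    // Sketch: offline render on iOS 11+, where AVAudioEngine's manual
    // rendering mode (which renderToFile wraps) is available. Playback is
    // started inside the prerender closure so the players begin at sample
    // 0 of the offline timeline instead of partway through.
    @available(iOS 11, *)
    func renderOffline(to url: URL, duration: Double, players: [AKPlayer]) throws {
        let outputFile = try AKAudioFile(forWriting: url,
                                         settings: AudioKit.format.settings)
        try AudioKit.renderToFile(outputFile, duration: duration) {
            players.forEach { $0.play() }
        }
    }

Pre-iOS 11 there is no manual rendering mode to wrap, which is why the question's fallback path misbehaves; AKOfflineRenderNode was the workaround AudioKit offered for that era.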

Video not always exported to Camera Roll: NSFileManager's removeItemAtPath non-blocking?

孤街浪徒 submitted on 2019-12-24 03:44:14

Question: After reading several tutorials like this one and looking at other video-export code, we still can't resolve an issue: sometimes a new video gets exported to the Camera Roll, and sometimes it doesn't. We can't even reproduce the problem consistently. The only cause we can imagine is that NSFileManager.defaultManager().removeItemAtPath is not a blocking call, but no documentation suggests it's asynchronous, so we assume that's not the case. Each time, the "Saved video" println inside the …
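removeItemAtPath is indeed synchronous; the asynchronous steps are the export itself and the save into the photo library. A hedged Swift sketch of an ordering that avoids the race (session and tempURL are stand-ins, not the question's code; the Photos framework is used here in place of the older ALAssetsLibrary):

    import AVFoundation
    import Photos

    // Sketch: delete the temporary file only inside the Photos completion
    // handler, so it is guaranteed to exist until the library has copied it.
    func exportAndSave(session: AVAssetExportSession, tempURL: URL) {
        session.outputURL = tempURL
        session.outputFileType = .mov
        session.exportAsynchronously {
            guard session.status == .completed else { return }
            PHPhotoLibrary.shared().performChanges({
                PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: tempURL)
            }) { saved, _ in
                if saved {
                    print("Saved video")
                    try? FileManager.default.removeItem(at: tempURL) // safe now
                }
            }
        }
    }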