AVFoundation

AVCaptureTorchModeAuto does not continuously update torch mode

Submitted by 和自甴很熟 on 2019-12-05 12:36:24
I am writing an app that automatically turns on the torch on the back of an iOS device depending on lighting conditions. The app renders a live camera view and does not record video. I have tried using AVCaptureTorchModeAuto, but it only seems to measure the brightness of the image at the start of the capture session and set the torch accordingly. The setting then does not change afterwards, regardless of the brightness of the camera image. Is it possible to have the system adjust the torch continuously, as stated in the documentation? "The capture device continuously monitors light
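One workaround sometimes suggested for this is to periodically re-apply the auto torch mode so the device re-evaluates scene brightness. This is a hedged sketch, not a documented behavior; the helper name and the timer-driven approach are assumptions:

```objc
#import <AVFoundation/AVFoundation.h>

// Hypothetical helper: re-apply auto torch mode so the device re-measures
// the scene. Intended to be called from a repeating timer while the
// preview is running.
static void ReapplyAutoTorch(AVCaptureDevice *device) {
    if (![device hasTorch] ||
        ![device isTorchModeSupported:AVCaptureTorchModeAuto]) {
        return;
    }
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        // Toggling off and back to auto may force a fresh brightness
        // evaluation (observed behavior, not guaranteed by the docs).
        device.torchMode = AVCaptureTorchModeOff;
        device.torchMode = AVCaptureTorchModeAuto;
        [device unlockForConfiguration];
    }
}
```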

Open camera roll on an exact photo

Submitted by 旧巷老猫 on 2019-12-05 12:27:18
I'm developing a camera application where I'd like to include some functions that are already present in the stock app. My problem is replicating the little square in the bottom-left corner (in portrait mode) that shows the micro-thumbnail of the photo the user has just taken; then, when the user taps it, the Photos app should open on the last photo saved in the camera roll. I can load the thumbnail of the newest photo in the camera roll with ALAssetsLibrary - I access it with [UIImage imageWithCGImage:[result thumbnail]] - but even though I have its ALAsset URL, I can't open the photo
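The thumbnail half of this is straightforward with ALAssetsLibrary, as the question notes. A minimal sketch of fetching the newest camera-roll thumbnail follows; there is no public API to open the Photos app on a specific asset, so apps typically present their own full-screen viewer instead (that fallback is an assumption on my part, not something the question confirms):

```objc
#import <AssetsLibrary/AssetsLibrary.h>
#import <UIKit/UIKit.h>

// Sketch: load the newest saved-photos thumbnail.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                       usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    if (!group) return;
    [group setAssetsFilter:[ALAssetsFilter allPhotos]];
    NSInteger count = group.numberOfAssets;
    if (count == 0) return;
    // The newest asset is the last one in the group.
    [group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:count - 1]
                            options:0
                         usingBlock:^(ALAsset *result, NSUInteger idx, BOOL *innerStop) {
        if (result) {
            UIImage *thumb = [UIImage imageWithCGImage:[result thumbnail]];
            // update the corner thumbnail button with `thumb` here
        }
    }];
} failureBlock:^(NSError *error) {
    NSLog(@"enumeration failed: %@", error);
}];
```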

repeat AVPlayerItem in AVQueuePlayer

Submitted by 半世苍凉 on 2019-12-05 11:51:05
I am using AVQueuePlayer to play my list of videos. I want one video to play continuously until I ask for the second video to play. Currently, video1 plays and, when it ends, video2 starts automatically, which is not what I want. Secondly, there is a delay between the two videos. Is there any way to transition smoothly from the first video to the second? - tephe: Regarding your first question: set the AVQueuePlayer's actionAtItemEnd property to AVPlayerActionAtItemEndNone. Then register for AVPlayerItemDidPlayToEndTimeNotification, and inside the handler for this notification call [player seekToTime:kCMTimeZero],
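A sketch of the steps the answer describes, assuming an AVQueuePlayer ivar named player (the handler name is hypothetical):

```objc
#import <AVFoundation/AVFoundation.h>

// Keep the player from advancing when the current item finishes.
self.player.actionAtItemEnd = AVPlayerActionAtItemEndNone;

// Get notified when the current item reaches its end.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(itemDidFinish:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:self.player.currentItem];

// Elsewhere in the same class:
- (void)itemDidFinish:(NSNotification *)notification {
    AVPlayerItem *item = notification.object;
    [item seekToTime:kCMTimeZero]; // rewind, so the same item loops
}
```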

MPMediaItem not playing in AVAudioPlayer using MPMediaItemPropertyAssetURL

Submitted by 痴心易碎 on 2019-12-05 11:47:35
I have this code to find and play an MPMediaItem:

MPMediaPropertyPredicate *predicate =
    [MPMediaPropertyPredicate predicateWithValue:self.persistentIDOfSongToPlay
                                     forProperty:MPMediaItemPropertyPersistentID
                                  comparisonType:MPMediaPredicateComparisonContains];
NSSet *predicateSet = [NSSet setWithObject:predicate];
MPMediaQuery *searchQuery = [[MPMediaQuery alloc] initWithFilterPredicates:predicateSet];
NSArray *queryResults = [searchQuery items];
NSLog(@"count: %i", queryResults.count);
MPMediaItem *item = [queryResults objectAtIndex:0];
NSLog(@"item: %@", item);
NSURL *itemURL = [item
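The snippet is cut off, but given the MPMediaItemPropertyAssetURL in the title, the usual continuation looks like the sketch below. Note that MPMediaItemPropertyAssetURL returns nil for DRM-protected or non-downloaded items, which is a common reason AVAudioPlayer playback silently fails here:

```objc
#import <MediaPlayer/MediaPlayer.h>
#import <AVFoundation/AVFoundation.h>

// Resolve the on-device asset URL and hand it to AVAudioPlayer.
NSURL *itemURL = [item valueForProperty:MPMediaItemPropertyAssetURL];
if (itemURL) {
    NSError *error = nil;
    AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:itemURL
                                                                   error:&error];
    [player prepareToPlay];
    [player play];
} else {
    // nil asset URL: item is likely DRM-protected or not stored locally.
    NSLog(@"No asset URL for item - cannot play with AVAudioPlayer");
}
```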

AVAudioSession alternative on OSX to get audio driver sample rate

Submitted by 僤鯓⒐⒋嵵緔 on 2019-12-05 11:38:23
On iOS you can use [[AVAudioSession sharedInstance] sampleRate]; to retrieve the current sample rate used by the audio driver. AVAudioSession does not exist on OS X, so I am wondering how to achieve the same thing there, as I could not find much on the topic. Thanks. - moka: Okay, after some more in-depth research, Audio Hardware Services seems to do the trick on OS X. Here is some example code:

// get the default output device
AudioObjectPropertyAddress addr;
UInt32 size;
AudioDeviceID deviceID = 0;
addr.mSelector = kAudioHardwarePropertyDefaultOutputDevice;
addr.mScope =
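The answer's code is truncated, but the standard Core Audio pattern it starts completes along these lines (a sketch; error handling of the OSStatus results is elided):

```objc
#include <CoreAudio/CoreAudio.h>

// Get the default output device from the system audio object.
AudioObjectPropertyAddress addr;
AudioDeviceID deviceID = 0;
UInt32 size = sizeof(deviceID);
addr.mSelector = kAudioHardwarePropertyDefaultOutputDevice;
addr.mScope    = kAudioObjectPropertyScopeGlobal;
addr.mElement  = kAudioObjectPropertyElementMaster;
AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr,
                           0, NULL, &size, &deviceID);

// Ask that device for its nominal sample rate.
Float64 sampleRate = 0;
size = sizeof(sampleRate);
addr.mSelector = kAudioDevicePropertyNominalSampleRate;
AudioObjectGetPropertyData(deviceID, &addr,
                           0, NULL, &size, &sampleRate);
NSLog(@"driver sample rate: %f", sampleRate);
```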

iOS - CoreImage - Add an effect to partial of image

Submitted by 梦想的初衷 on 2019-12-05 11:32:01
I just had a look at the Core Image framework in iOS 5 and found that it's easy to apply an effect to a whole image. I wonder if it is possible to apply an effect to only a specific part of an image (a rectangle), for example a grayscale effect on part of the image. I look forward to your help. Thanks, Huy. - Felix: Watch session 510 from the WWDC 2012 videos. They present a technique for applying a mask to a CIImage. You need to learn how to chain the filters together. In particular, take a look at: CICrop, CILinearGradient, CIRadialGradient (could be used to create the mask), CISourceOverCompositing (put mask images
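For the simple rectangle case, the filters the answer names can be chained without a gradient mask at all: desaturate the whole image, crop the desaturated version to the rectangle, then composite it back over the original. A sketch, assuming input is the source CIImage and rect the target rectangle:

```objc
#import <CoreImage/CoreImage.h>

// 1. Desaturate the whole image.
CIFilter *mono = [CIFilter filterWithName:@"CIColorControls"];
[mono setValue:input forKey:kCIInputImageKey];
[mono setValue:@0.0 forKey:kCIInputSaturationKey];
CIImage *gray = [mono outputImage];

// 2. Crop the grayscale result to the rectangle of interest.
CIFilter *crop = [CIFilter filterWithName:@"CICrop"];
[crop setValue:gray forKey:kCIInputImageKey];
[crop setValue:[CIVector vectorWithCGRect:rect] forKey:@"inputRectangle"];

// 3. Composite the cropped grayscale patch over the original.
CIFilter *comp = [CIFilter filterWithName:@"CISourceOverCompositing"];
[comp setValue:[crop outputImage] forKey:kCIInputImageKey];
[comp setValue:input forKey:kCIInputBackgroundImageKey];
CIImage *result = [comp outputImage];
```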

Ios rotate, filter video stream in ios

Submitted by 若如初见. on 2019-12-05 10:53:10
Hello there. I am rotating and applying image filters with GPUImage on a live video stream. The task is consuming more time than expected, resulting in overheating of the iPhone. Can anybody help me optimize my code? The following is the code I use:

- (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // return if invalid sample buffer
    if (!CMSampleBufferIsValid(sampleBuffer)) {
        return;
    }
    // get a CGImage from the sample buffer
    CGImageRef cgImageFromBuffer = [self cgImageFromSampleBuffer:sampleBuffer];
    if (!cgImageFromBuffer || (cgImageFromBuffer == NULL)) {
        return;
    }
    // We need rotation to perform
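The expensive part in the code above is converting every sample buffer into a CGImage on the CPU. One way to avoid that with GPUImage is to let GPUImageVideoCamera feed the filter chain directly, with rotation handled on the GPU via outputImageOrientation. A sketch (the sepia filter and the gpuImageView name are placeholders, not from the question):

```objc
#import "GPUImage.h"

// Keep frames on the GPU: no per-frame CGImage round-trip.
GPUImageVideoCamera *camera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                        cameraPosition:AVCaptureDevicePositionBack];
// Rotation is applied on the GPU.
camera.outputImageOrientation = UIInterfaceOrientationPortrait;

// Any GPUImageFilter subclass can go here.
GPUImageSepiaFilter *filter = [[GPUImageSepiaFilter alloc] init];
[camera addTarget:filter];
[filter addTarget:gpuImageView]; // a GPUImageView in the view hierarchy

[camera startCameraCapture];
```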

How do I use AVAssetWriter?

Submitted by ╄→尐↘猪︶ㄣ on 2019-12-05 10:52:27
Question: I’d like to take some video frames and encode them into a video. It looks like that’s exactly what AVAssetWriter was meant for, but no matter how I eyeball the docs and Google, I can’t find any way to actually use it. From the docs it looks like I need an input (AVAssetWriterInput) to feed the writer from. Fine. But the AVAssetWriterInput class is abstract, and the only subclass that I know of in 4.1 is AVAssetWriterInputPixelBufferAdaptor, which requires an AVAssetWriterInput in its initializer
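For what it's worth, AVAssetWriterInput is created with a factory method rather than subclassed, and the pixel-buffer adaptor wraps an input rather than being one. A minimal sketch of the usual wiring (output URL, codec settings, and dimensions are assumptions):

```objc
#import <AVFoundation/AVFoundation.h>

NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];

NSDictionary *settings = @{ AVVideoCodecKey:  AVVideoCodecH264,
                            AVVideoWidthKey:  @640,
                            AVVideoHeightKey: @480 };
AVAssetWriterInput *input =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:settings];
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                   sourcePixelBufferAttributes:nil];

[writer addInput:input];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];

// For each frame (CVPixelBufferRef pixelBuffer, presentation time t):
//     if (input.readyForMoreMediaData)
//         [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:t];

// When done:
[input markAsFinished];
[writer finishWriting];
```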

iOS- how to get the duration of .mp4 file by using AVAsset or AVURLAsset

Submitted by 左心房为你撑大大i on 2019-12-05 10:49:53
I know that video-duration questions have been answered before, but I am facing real trouble getting the duration of an .mp4 file using AVAsset and AVURLAsset. I am using the following code:

NSString *itemPathString =
    [[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0]
        stringByAppendingPathComponent:obmodel.actualname];
NSURL *itemPathURL = [NSURL URLWithString:itemPathString];
if ([[NSFileManager defaultManager] fileExistsAtPath:itemPathString]) {
    NSLog(@"File Exists");
}
AVAsset *videoAsset = (AVAsset *)[AVAsset assetWithURL:itemPathURL];
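One likely culprit in the snippet above is building the URL with URLWithString: from a bare filesystem path, which produces an invalid (often nil) URL; file paths should go through fileURLWithPath:. A hedged sketch of reading the duration once the URL is a proper file URL:

```objc
#import <AVFoundation/AVFoundation.h>

// Build a *file* URL from the path, then read the asset's duration.
NSURL *itemPathURL = [NSURL fileURLWithPath:itemPathString];
AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:itemPathURL options:nil];

CMTime durationTime = videoAsset.duration;
Float64 seconds = CMTimeGetSeconds(durationTime);
NSLog(@"duration: %.2f seconds", seconds);
```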

Get a particular frame by time value using AVAssetReader

Submitted by 六眼飞鱼酱① on 2019-12-05 09:52:02
Question: I have checked this http://www.7twenty7.com/blog/2010/11/video-processing-with-av-foundation to get a video frame by frame. But my real requirement is to get a frame at a particular time. I know that it should be possible with AVAssetReader; I wonder whether there is any direct method for this in AVAssetReader. Please give me some guidance on how to get a frame at a particular time. I checked AVAssetImageGenerator, but this is not the thing I really wanted. Finally I found the answer: you have to use
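The answer is cut off, but the AVAssetReader property that makes this possible is timeRange: restrict the reader to a small window starting at the target time and take the first sample buffer it vends. A sketch, assuming asset and targetTime already exist and the video runs at roughly 30 fps:

```objc
#import <AVFoundation/AVFoundation.h>

NSError *error = nil;
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
AVAssetTrack *track =
    [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

// Decode to BGRA so the buffer is easy to turn into an image.
NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey:
                                @(kCVPixelFormatType_32BGRA) };
AVAssetReaderTrackOutput *output =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:track
                                               outputSettings:settings];
[reader addOutput:output];

// Only read a one-frame window around the time we care about.
reader.timeRange = CMTimeRangeMake(targetTime, CMTimeMake(1, 30));

[reader startReading];
CMSampleBufferRef frame = [output copyNextSampleBuffer];
// `frame` is the sample nearest targetTime; remember to CFRelease it.
```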