AVFoundation

How to record video of screen like Talking Tom Cat on iPhone?

人盡茶涼 submitted on 2019-12-18 11:11:13
Question: I want to know if there is any public API in AVFoundation, or any other framework, that can be used to record the screen the way Talking Tom Cat does. I looked into the AVFoundation and CoreVideo frameworks but could not find anything in the header files. If anyone knows how to record screen video programmatically using the iPhone SDK, let me know.

Answer 1: You can do it in the following steps:
1. Capture the screen.
2. Put that frame into a queue.
3. Write it out with AVAssetWriter and export the video.
There is a sample…
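
A Swift sketch of the pipeline the answer describes (capture frames, queue them, write them with AVAssetWriter). Rendering the view into a CVPixelBuffer, for example from a CADisplayLink callback, is assumed and not shown, and names such as ViewRecorder are illustrative rather than anything from the answer.

    import AVFoundation
    import UIKit

    // Writes pixel buffers captured from a view into a QuickTime movie.
    final class ViewRecorder {
        private let writer: AVAssetWriter
        private let input: AVAssetWriterInput
        private let adaptor: AVAssetWriterInputPixelBufferAdaptor
        private var frameCount: Int64 = 0

        init(outputURL: URL, size: CGSize) throws {
            writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
            let settings: [String: Any] = [
                AVVideoCodecKey: AVVideoCodecType.h264,
                AVVideoWidthKey: Int(size.width),
                AVVideoHeightKey: Int(size.height)
            ]
            input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
            input.expectsMediaDataInRealTime = true
            adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                           sourcePixelBufferAttributes: nil)
            writer.add(input)
            if writer.startWriting() {
                writer.startSession(atSourceTime: .zero)
            }
        }

        // Call once per captured frame with a pixel buffer rendered from the view.
        func append(_ pixelBuffer: CVPixelBuffer, fps: Int32 = 30) {
            guard input.isReadyForMoreMediaData else { return }   // drop the frame if the writer is busy
            let time = CMTime(value: frameCount, timescale: fps)
            if adaptor.append(pixelBuffer, withPresentationTime: time) {
                frameCount += 1
            }
        }

        func finish(completion: @escaping () -> Void) {
            input.markAsFinished()
            writer.finishWriting(completionHandler: completion)
        }
    }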

Detecting edges of a card with rounded corners

大城市里の小女人 submitted on 2019-12-18 10:55:33
Question: Hi, I am currently working on an OCR reading app where I have successfully captured the card image using the AVFoundation framework. As the next step, I need to find the edges of the card so that I can crop the card out of the main captured image and later send it to the OCR engine for processing. The main problem now is finding the edges of the card, and I am using the code below (taken from another open source project), which uses OpenCV for this purpose. It works fine if the card is pure…
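
The question's OpenCV routine is cut off above. As an alternative illustration (a different technique from the OpenCV code the question refers to), Apple's Vision framework (iOS 11+) can detect a rectangle such as a card and return its four corner points, even when the physical corners are rounded:

    import UIKit
    import Vision

    // Finds the most prominent rectangle in a CGImage and hands back its
    // normalized (0...1) corner points; the aspect-ratio threshold is a guess
    // at card-like proportions.
    func detectCard(in image: CGImage,
                    completion: @escaping (VNRectangleObservation?) -> Void) {
        let request = VNDetectRectanglesRequest { request, _ in
            completion(request.results?.first as? VNRectangleObservation)
        }
        request.minimumAspectRatio = 0.5
        request.maximumObservations = 1

        let handler = VNImageRequestHandler(cgImage: image, options: [:])
        try? handler.perform([request])   // errors ignored in this sketch
    }

The observation's topLeft, topRight, bottomLeft and bottomRight points can then be scaled to the image size to crop the card before sending it to the OCR engine.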

Capturing zoomed preview view in AVFoundation

随声附和 submitted on 2019-12-18 10:32:10
Question: I am working with zoom functionality in an AVFoundation camera. I have implemented zoom by scaling the view that holds the AVCaptureVideoPreviewLayer, and now I want to capture the zoomed image. Here is my code for adding the AVCaptureVideoPreviewLayer to the view:

    // create a uiview subclass for showing the camera feed
    UIView *previewView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 430)];
    [[self view] addSubview:previewView];
    CGRect layerRect = CGRectMake(0, 0, 320, 430);
    [[self …
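
One way to make the captured photo match what the zoomed preview shows is to zoom the capture device itself instead of scaling the preview view, so every output is zoomed consistently. This is a different approach from the one in the excerpt and assumes a device that supports videoZoomFactor (iOS 7+); a minimal Swift sketch:

    import AVFoundation
    import CoreGraphics

    // Zooms the capture device rather than the preview view, clamped to the
    // device's supported range; `device` is the active AVCaptureDevice.
    func setZoom(_ factor: CGFloat, on device: AVCaptureDevice) {
        do {
            try device.lockForConfiguration()
            device.videoZoomFactor = min(max(factor, 1.0),
                                         device.activeFormat.videoMaxZoomFactor)
            device.unlockForConfiguration()
        } catch {
            print("Could not lock device for configuration: \(error)")
        }
    }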

HTTP live streaming server on iPhone

北战南征 submitted on 2019-12-18 10:30:16
Question: I am trying to run an HTTP Live Streaming server on the iPhone which captures the video stream from the camera and feeds it to an HTML5 client (one that supports HTTP Live Streaming). So far, I've got the following working:
- An HTTP Live Streaming server on iOS (written in Node.js), which dynamically updates the index file from the list of Transport Stream (video/MP2T) files generated by the video capture module.
- A video capture module, which uses AVCaptureMovieFileOutput to produce a series of 10-second…
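
A rough Swift sketch of the capture module described in the second item above; the function name and parameters are illustrative, and remuxing the resulting QuickTime files into the MPEG-TS segments the playlist needs is not shown:

    import AVFoundation

    // Adds a movie output that stops after roughly ten seconds per file; the
    // delegate's fileOutput(_:didFinishRecordingTo:from:error:) callback fires
    // when the limit is hit, at which point the next segment can be started.
    func addSegmentedOutput(to session: AVCaptureSession,
                            firstSegmentURL: URL,
                            delegate: AVCaptureFileOutputRecordingDelegate) -> AVCaptureMovieFileOutput {
        let movieOutput = AVCaptureMovieFileOutput()
        movieOutput.maxRecordedDuration = CMTime(seconds: 10, preferredTimescale: 600)
        if session.canAddOutput(movieOutput) {
            session.addOutput(movieOutput)
        }
        movieOutput.startRecording(to: firstSegmentURL, recordingDelegate: delegate)
        return movieOutput
    }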

AVCaptureVideoPreviewLayer doesn't fill up whole iPhone 4S Screen

青春壹個敷衍的年華 submitted on 2019-12-18 10:10:42
Question:

    AVCaptureVideoPreviewLayer *avLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    avLayer.frame = self.view.frame;
    [self.view.layer addSublayer:avLayer];

I use AVCaptureVideoPreviewLayer to display video in the view, but the video does not fill the whole iPhone 4S screen (there are two grey bars at the left and right sides). I want the video to fill the full screen. How can I deal with it? Thank you very much!

Answer 1: Maybe this solves it?

    CGRect bounds = view.layer.bounds;
    avLayer.videoGravity = …
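
The answer's code is cut off above; presumably it sets the layer's frame to the view's bounds and switches the video gravity to aspect-fill. A Swift sketch of that presumed fix:

    import AVFoundation
    import UIKit

    // Size the preview layer to the view's bounds and let it crop rather than
    // letterbox, so no grey bars appear at the sides.
    func installPreviewLayer(for session: AVCaptureSession, in view: UIView) {
        let avLayer = AVCaptureVideoPreviewLayer(session: session)
        avLayer.frame = view.layer.bounds
        avLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(avLayer)
    }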

Updating AVPlayerItem video composition

倾然丶 夕夏残阳落幕 submitted on 2019-12-18 09:09:07
Question: My team and I have been stuck on an issue for a few days now. We have an AVPlayer that plays an AVPlayerItem with a custom AVVideoComposition:

    player.currentItem?.videoComposition = getUpdatedVideoComisition()

The user can decide whether to remove or keep the composition. I am trying to remove the composition by calling:

    player.currentItem?.videoComposition = nil

But unexpectedly, even when assigning nil to the videoComposition property, it keeps playing with the same composition. How can we…
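
The excerpt ends before any answer. One workaround sometimes suggested for this kind of problem (an assumption, not something stated in the excerpt) is to replace the current item with a fresh AVPlayerItem built from the same asset, so that no composition is attached at all:

    import AVFoundation

    // Hypothetical helper: swap in a clean AVPlayerItem instead of relying on
    // videoComposition = nil, preserving the playback position.
    func removeVideoComposition(from player: AVPlayer) {
        guard let asset = player.currentItem?.asset else { return }
        let resumeTime = player.currentTime()
        let cleanItem = AVPlayerItem(asset: asset)   // no videoComposition set
        player.replaceCurrentItem(with: cleanItem)
        player.seek(to: resumeTime)
        player.play()
    }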

How to get file size and current file size from NSURL for AVPlayer iOS4.0

*爱你&永不变心* submitted on 2019-12-18 07:12:43
Question:

    self.player = [[AVPlayer playerWithURL:[NSURL URLWithString:@"http://myurl.com/track.mp3"]] retain];

I am trying to make a UIProgressView for the above track. How do I obtain the total file size and how much of it has loaded so far from that URL? Please help, thanks!

Answer 1: You need to start observing the loadedTimeRanges property of the current item, like this:

    AVPlayerItem *playerItem = self.player.currentItem;
    [playerItem addObserver:self forKeyPath:kLoadedTimeRanges options:NSKeyValueObservingOptionNew context …
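
The same idea expressed with Swift's block-based KVO, as an illustrative sketch rather than the answer's exact code (the answer's kLoadedTimeRanges constant presumably expands to the "loadedTimeRanges" key path):

    import AVFoundation
    import UIKit

    // Keeps a UIProgressView in sync with how much of the item has buffered.
    // Retain the returned observation token for as long as updates are needed.
    func observeBuffering(of player: AVPlayer,
                          progressView: UIProgressView) -> NSKeyValueObservation? {
        guard let item = player.currentItem else { return nil }
        return item.observe(\.loadedTimeRanges, options: [.new]) { item, _ in
            guard let range = item.loadedTimeRanges.first?.timeRangeValue else { return }
            let buffered = CMTimeGetSeconds(CMTimeAdd(range.start, range.duration))
            let total = CMTimeGetSeconds(item.duration)
            guard total.isFinite, total > 0 else { return }
            DispatchQueue.main.async {
                progressView.progress = Float(buffered / total)
            }
        }
    }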

AVAudioSession manipulate sound output

空扰寡人 submitted on 2019-12-18 07:08:47
Question: I'm using AVAudioSession to configure audio and AVAudioPlayer to play different sounds. I searched a lot and couldn't find anything: how can I manipulate the output routes? I need a method in my SoundManager where I can switch the output between the phone (earpiece) speaker and the loudspeaker.

    success = [session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&error];

Using this I can route sound to the loudspeaker, but there is no corresponding method to move it back to the phone speaker. Can anybody help me with it?

Answer 1: …
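
A Swift sketch of one way to expose both directions from a sound manager (the original code is Objective-C; the function name here is illustrative). Under the .playAndRecord category the default output is the receiver, so clearing the speaker override with .none routes audio back to the earpiece:

    import AVFoundation

    // Routes playback either to the loudspeaker or back to the earpiece.
    func routeToLoudspeaker(_ useLoudspeaker: Bool) throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .default, options: [])
        try session.overrideOutputAudioPort(useLoudspeaker ? .speaker : .none)
        try session.setActive(true)
    }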

Swift iOS - AVPlayer Video Freezes/Pauses When App Comes Back from Background

怎甘沉沦 submitted on 2019-12-18 06:19:10
Question: I have a video playing in a loop on the login page of my app. I followed this YouTube tutorial to get it working: loop video in view controller. The problem is that when the app goes to the background and I don't come back right away, the video is frozen when I return. According to the Apple docs, that is expected behaviour. I tried using NotificationCenter's Notification.Name.UIApplicationWillResignActive, but that didn't work. How do I get the video to keep playing once the app returns…
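
The excerpt cuts off before an answer. A common approach (an assumption, not taken from the excerpt) is to observe the app becoming active again and resume the player there; UIApplicationWillResignActive fires when the app is leaving the foreground, which is too early to restart playback:

    import AVFoundation
    import UIKit

    // Resume the looping login video whenever the app returns to the foreground.
    // `player` is assumed to be the AVPlayer from the view controller in question;
    // keep the returned token so the observer can be removed later.
    func resumeOnForeground(_ player: AVPlayer) -> NSObjectProtocol {
        return NotificationCenter.default.addObserver(
            forName: UIApplication.didBecomeActiveNotification,
            object: nil,
            queue: .main
        ) { [weak player] _ in
            player?.play()
        }
    }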