AVCaptureSession

Turn on torch/flash on iPhone

若如初见 · Posted on 2019-12-16 22:35:59
Question: I know that the only way to turn the flash on and keep it on on the iPhone 4 is by turning the video camera on. I'm not too sure of the code, though. Here is what I am trying: -(IBAction)turnTorchOn { AVCaptureSession *captureSession = [[AVCaptureSession alloc] init]; AVCaptureDevice *videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; NSError *error = nil; AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&error
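
A minimal Objective-C sketch of the approach that is usually enough on later iOS versions: lock the capture device and set its torch mode directly, with no capture session at all. The method name is only illustrative.

#import <AVFoundation/AVFoundation.h>

// Illustrative helper: turn the torch on by configuring the device directly.
- (void)turnTorchOn {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (![device hasTorch]) {
        return; // no torch hardware (e.g. front camera, older devices)
    }
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        device.torchMode = AVCaptureTorchModeOn;   // AVCaptureTorchModeOff turns it back off
        [device unlockForConfiguration];
    } else {
        NSLog(@"Could not lock device for configuration: %@", error);
    }
}

As the question suggests, very early iOS releases reportedly only kept the torch lit while a session with a video output was running, which is why the original code builds a full AVCaptureSession.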

How to save a movie from AVCapture

丶灬走出姿态 · Posted on 2019-12-13 04:34:04
Question: I've been trying to figure out AVCapture over the last couple of days and am struggling to save a video. My understanding is that you call [movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self]; and then at a later time you can call [movieFileOutput stopRecording]; and it should then call the delegate method -(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:
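
A hedged Objective-C sketch of that start/stop/delegate flow; self.movieFileOutput is assumed to be an AVCaptureMovieFileOutput already added to a running session, and the temporary output path is just an example.

// Illustrative sketch of the start/stop/delegate flow described above.
- (void)startRecording {
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"movie.mov"];
    NSURL *outputURL = [NSURL fileURLWithPath:path];
    [[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil]; // harmless if the file does not exist
    [self.movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
}

- (void)stopRecording {
    [self.movieFileOutput stopRecording];
}

// AVCaptureFileOutputRecordingDelegate
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error {
    if (error) {
        NSLog(@"Recording failed: %@", error);
        return;
    }
    // The movie now exists at outputFileURL; move it, upload it,
    // or save it to the photo library from here.
    NSLog(@"Movie saved to %@", outputFileURL);
}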

capture image is stretched using avcapture session

寵の児 · Posted on 2019-12-13 00:48:23
Question: I am taking a picture using AVCaptureSession. The images are right below. Am I using the right approach, or is there something wrong? I also changed the preset but had no success. Here is the image before taking the picture; the output is like that (stretched). My code is: AVCaptureDeviceInput* input1 = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil]; AVCaptureVideoDataOutput* output1 = [[AVCaptureVideoDataOutput alloc] init]; output1.alwaysDiscardsLateVideoFrames = YES; dispatch_queue_t queue; queue =
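
For what it's worth, stretched output is often a mismatch between the session preset, the preview layer's video gravity, and the image view's content mode rather than the data output itself. A minimal Objective-C sketch of the aspect-preserving settings, assuming it runs inside a view controller's setup code and that self.imageView displays the captured frame:

// A common cause of "stretched" output is aspect-ratio handling, not the capture itself.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto; // or a preset matching your target size

AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // crop instead of stretch
previewLayer.frame = self.view.bounds;
[self.view.layer addSublayer:previewLayer];

// When displaying the captured frame, keep the aspect ratio as well:
self.imageView.contentMode = UIViewContentModeScaleAspectFill;
self.imageView.clipsToBounds = YES;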

Switch between front and back camera

社会主义新天地 · Posted on 2019-12-12 19:24:19
Question: I'm trying to make a custom cameraView, which works so far. However, I've reached an issue with switching between the front and back camera. I've tried to handle it through a custom enum, but when the switchCamera method is called, it just seems to freeze the camera. How come? Camera variable: var camera = CameraType.Back viewDidLoad: switchButton = UIButton(frame: CGRectMake(rightButtonXPoint, 35, 30, 30)) switchButton.setImage(UIImage(named: "Switch"), forState: UIControlState.Normal)
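
One common way to switch cameras without the preview freezing is to swap the device input inside a beginConfiguration/commitConfiguration pair instead of rebuilding the session. A hedged Objective-C sketch, where self.captureSession and self.currentVideoInput are assumed properties:

// Swap the video input in place; the session keeps running throughout.
- (void)switchCamera {
    AVCaptureDevicePosition newPosition =
        (self.currentVideoInput.device.position == AVCaptureDevicePositionBack)
            ? AVCaptureDevicePositionFront
            : AVCaptureDevicePositionBack;

    AVCaptureDevice *newDevice = nil;
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == newPosition) { newDevice = device; break; }
    }
    if (!newDevice) return;

    NSError *error = nil;
    AVCaptureDeviceInput *newInput =
        [AVCaptureDeviceInput deviceInputWithDevice:newDevice error:&error];
    if (!newInput) { NSLog(@"Could not create input: %@", error); return; }

    [self.captureSession beginConfiguration];
    [self.captureSession removeInput:self.currentVideoInput];
    if ([self.captureSession canAddInput:newInput]) {
        [self.captureSession addInput:newInput];
        self.currentVideoInput = newInput;
    } else {
        [self.captureSession addInput:self.currentVideoInput]; // roll back
    }
    [self.captureSession commitConfiguration];
}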

AVCaptureStillImageOutput area selection

拥有回忆 · Posted on 2019-12-12 19:01:57
Question: Here is a challenge I am facing when saving an image taken from the camera in an iOS app written in Swift. I am not saving exactly what I want; I am saving more than necessary, and that is not good. These are the two relevant chunks of code for this issue. First, where the session starts; as one can see, I am only looking at an ellipse-shaped area: previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) previewLayer?.frame = self.view.layer.frame self.view.layer.addSublayer
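
A capture output always delivers the full sensor frame, so limiting the saved image to the ellipse has to happen after capture. A hedged Objective-C sketch that crops to the ellipse's bounding rectangle and clips to an oval path; converting the on-screen rect into image coordinates (for example via the preview layer's metadataOutputRectOfInterestForRect:) is assumed to be done already, and orientation handling is omitted.

// Crop a captured image to an elliptical region given in image coordinates.
- (UIImage *)imageByCroppingImage:(UIImage *)image toEllipseInRect:(CGRect)rectInImage {
    CGImageRef cropped = CGImageCreateWithImageInRect(image.CGImage, rectInImage);
    CGRect bounds = CGRectMake(0, 0, rectInImage.size.width, rectInImage.size.height);

    UIGraphicsBeginImageContextWithOptions(bounds.size, NO, image.scale);
    [[UIBezierPath bezierPathWithOvalInRect:bounds] addClip];
    [[UIImage imageWithCGImage:cropped] drawInRect:bounds];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    CGImageRelease(cropped);
    return result; // transparent outside the ellipse
}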

Music App (playing in background) breaks while recording video using AVCaptureSession

醉酒当歌 · Posted on 2019-12-12 17:32:19
Question: I am working on Snapchat-like app functionality. In my application, a music app is playing in the background while my app is in the foreground, and I have to capture video while that music keeps playing. Problem: when my app comes to the foreground, the music app's audio breaks for a second and then continues. The same problem arises when the user taps the HOME button: the app goes to the background and the music audio breaks for a second. I have downloaded the Apple sample code for
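
The usual mitigation is to keep AVCaptureSession from reconfiguring the shared audio session and to mark your own audio session as mixable. A hedged Objective-C sketch (the relevant capture-session properties exist from iOS 7 on; the method name is illustrative):

- (AVCaptureSession *)makeCaptureSessionPreservingBackgroundAudio {
    NSError *error = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    // Record while mixing with other apps' audio instead of interrupting it.
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord
                  withOptions:AVAudioSessionCategoryOptionMixWithOthers
                        error:&error];
    [audioSession setActive:YES error:&error];

    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
    // Keep AVCaptureSession from reconfiguring the shared audio session itself.
    captureSession.automaticallyConfiguresApplicationAudioSession = NO;
    captureSession.usesApplicationAudioSession = YES;
    return captureSession;
}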

SwiftyCam capture session is not running

北城余情 · Posted on 2019-12-12 11:08:01
Question: I have created my camera view following the demo project on GitHub from SwiftyCam. Everything lays out correctly and is going well; however, when the camera button is pushed I get a message in the console saying "[SwiftyCam]: Cannot take photo. Capture session is not running". Other people have had this problem with Swift 4, and you can find that here. I have gone through the whole framework line by line, but for some reason I can't figure it out. I would really appreciate it if

AVCaptureSession only getting one frame for iPhone 3gs

荒凉一梦 · Posted on 2019-12-12 10:45:16
Question: I have a piece of code that sets up a capture session from the camera to process the frames using OpenCV and then sets the image property of a UIImageView with a UIImage generated from the frame. When the app starts, the image view's image is nil and no frames show up until I push another view controller onto the stack and then pop it off. Then the image stays the same until I do it again. NSLog statements show that the callback is called at approximately the correct frame rate. Any ideas why it
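
A symptom like this, where the delegate fires at the right rate but the view only refreshes after a push/pop, often means UIKit is being updated from the capture queue. A hedged Objective-C sketch of the sample-buffer callback with the UI update dispatched to the main queue; imageFromSampleBuffer: stands in for the OpenCV processing and is hypothetical:

// AVCaptureVideoDataOutputSampleBufferDelegate callback, running on the capture queue.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    UIImage *processed = [self imageFromSampleBuffer:sampleBuffer]; // hypothetical OpenCV helper
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = processed; // UIKit work belongs on the main thread
    });
}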

How to record timelapse video from iPhone using AVCaptureDeviceFormat?

南笙酒味 · Posted on 2019-12-12 03:27:24
Question: I am trying to record a timelapse video on an iPhone. I already have it working for slow motion (by capturing the maximum number of frames possible) at 120 fps. Now I am trying to reverse the logic and capture the fewest frames possible to achieve the timelapse functionality. The code flow is like this: query all the supported frame-rate ranges from the available device formats of the AVCaptureDevice, then check whether the frame rate is below or equal to the desired 20 fps and the dimensions are equal to or greater than 1920×1080.
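
A hedged Objective-C sketch of that flow: walk the device formats, pick one whose frame-rate range covers 20 fps at 1920×1080 or larger, then lock the frame duration. The method name is illustrative, and the 20 fps / 1920×1080 values are the ones from the question.

- (void)configureDeviceForTimelapse {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDeviceFormat *format in device.formats) {
        CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
        if (dims.width < 1920 || dims.height < 1080) continue;

        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
            if (range.minFrameRate <= 20.0 && 20.0 <= range.maxFrameRate) {
                NSError *error = nil;
                if ([device lockForConfiguration:&error]) {
                    // If the device is attached to a session, set the session preset to
                    // AVCaptureSessionPresetInputPriority so the chosen format is not overridden.
                    device.activeFormat = format;
                    device.activeVideoMinFrameDuration = CMTimeMake(1, 20); // cap at 20 fps
                    device.activeVideoMaxFrameDuration = CMTimeMake(1, 20);
                    [device unlockForConfiguration];
                }
                return; // stop at the first matching format
            }
        }
    }
}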

The iPhone 3GS and kCVPixelFormatType_420YpCbCr8BiPlanarFullRange format?

删除回忆录丶 · Posted on 2019-12-12 01:01:04
Question: I'm developing an iOS application with the latest SDK and testing it on an iPhone 3GS. I'm doing this in my init method: CFDictionaryRef formatDictionary = CVPixelFormatDescriptionCreateWithPixelFormatType(kCFAllocatorDefault, kCVPixelFormatType_420YpCbCr8BiPlanarFullRange); CFNumberRef val = (CFNumberRef) CFDictionaryGetValue(formatDictionary, kCVPixelFormatBitsPerBlock); if (val != nil) { CFNumberGetValue(val, kCFNumberSInt8Type, &_bytesPerPixel); _bytesPerPixel /= 8; } else _bytesPerPixel = 4; But
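
For a bi-planar format the per-plane layout is what matters, so one hedged alternative is to skip the format-description dictionary and query the delivered pixel buffer itself, inside the captureOutput:didOutputSampleBuffer:fromConnection: callback:

// Inspect the actual buffer layout instead of deriving a single bytes-per-pixel value.
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

// Plane 0: Y (1 byte per pixel); plane 1: interleaved CbCr (2 bytes per 2x2 pixel block).
size_t yBytesPerRow  = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
size_t uvBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);
size_t width         = CVPixelBufferGetWidth(pixelBuffer);
NSLog(@"Y bytes/row: %zu, CbCr bytes/row: %zu, width: %zu",
      yBytesPerRow, uvBytesPerRow, width);

CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);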