How to get a real-time video stream from the iPhone camera and send it to a server?

Backend · unresolved · 3 answers · 1710 views
Asked by 情歌与酒, 2020-12-12 21:11

I am using AVCaptureSession to capture video and get real-time frames from the iPhone camera, but how can I send them to a server with the frames and sound multiplexed together?

3 Answers
  •  再見小時候
    2020-12-12 21:46

    The way I'm doing it is to implement an AVCaptureSession, which has a delegate with a callback that's run on every frame. That callback sends each frame over the network to the server, which has a custom setup to receive it.

    Here's the flow:

    http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/03_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW2

    And here's some code:

    // make input device
    NSError *deviceError;
    AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *inputDevice = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:&deviceError];
    if (!inputDevice) {
        NSLog(@"Could not create capture input: %@", deviceError);
        return;
    }

    // make output device; each frame is delivered to the sample buffer delegate
    AVCaptureVideoDataOutput *outputDevice = [[AVCaptureVideoDataOutput alloc] init];
    [outputDevice setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

    // initialize capture session (autorelease: this is pre-ARC code)
    AVCaptureSession *captureSession = [[[AVCaptureSession alloc] init] autorelease];
    [captureSession addInput:inputDevice];
    [captureSession addOutput:outputDevice];

    // make preview layer and add it so the camera's view is displayed on screen
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
    previewLayer.frame = view.bounds;
    [view.layer addSublayer:previewLayer];

    // go!
    [captureSession startRunning];
    

    Then the output device's delegate (here, self) has to implement the callback:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CGSize imageSize = CVImageBufferGetEncodedSize(imageBuffer);
        // the size is also in the 'mediaSpecific' dict of the sampleBuffer
        NSLog(@"frame captured at %.fx%.f", imageSize.width, imageSize.height);
    }
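
    The stub above only logs the frame size. As a minimal sketch of the "send each frame over the network" idea, the callback body could instead serialize the frame to JPEG and hand it off (sendFrameData: is a hypothetical placeholder for your own networking code, and for the bandwidth reasons discussed below you would normally hardware-encode rather than ship JPEGs):

    // Sketch only (requires CoreImage/UIKit): convert each captured frame
    // to JPEG and pass it to a hypothetical sender method.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef cgImage = [context createCGImage:ciImage fromRect:[ciImage extent]];
        CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

        UIImage *image = [UIImage imageWithCGImage:cgImage];
        NSData *jpegData = UIImageJPEGRepresentation(image, 0.5); // 50% quality
        CGImageRelease(cgImage);

        [self sendFrameData:jpegData]; // placeholder: your networking code
    }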
    

    Sending raw frames or individual images will never work well enough for you: the data volume and frame rate are simply too high. Nor can you reasonably serve anything from the phone itself (WWAN networks sit behind all sorts of firewalls). You'll need to encode the video and stream it to a server, most likely over a standard streaming protocol (RTSP, RTMP). There is an H.264 encoder chip on the iPhone 3GS and later, but it is not stream-oriented: it outputs the metadata required to parse the video last. This leaves you with a few options.

    1) Get the raw data and use FFmpeg to encode on the phone (will use a ton of CPU and battery).

    2) Write your own parser for the H.264/AAC output (very hard).

    3) Record and process in chunks (will add latency equal to the length of the chunks, and drop around 1/4 second of video between each chunk as you start and stop the sessions).
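
    Option 3 can be sketched with AVCaptureMovieFileOutput, recording fixed-length chunks and uploading each finished file; the captureSession/movieOutput properties and the uploadChunk: helper here are hypothetical placeholders:

    // Sketch of option 3: 5-second chunks via AVCaptureMovieFileOutput.
    // Expect a small recording gap each time a chunk rolls over.
    - (void)startChunkedRecording
    {
        self.movieOutput = [[AVCaptureMovieFileOutput alloc] init];
        self.movieOutput.maxRecordedDuration = CMTimeMakeWithSeconds(5, 600);
        [self.captureSession addOutput:self.movieOutput];

        NSURL *url = [NSURL fileURLWithPath:
            [NSTemporaryDirectory() stringByAppendingPathComponent:@"chunk.mov"]];
        [self.movieOutput startRecordingToOutputFileURL:url recordingDelegate:self];
    }

    // AVCaptureFileOutputRecordingDelegate: fires when maxRecordedDuration is hit.
    - (void)captureOutput:(AVCaptureFileOutput *)output
    didFinishRecordingToOutputFileURL:(NSURL *)outputFileURL
          fromConnections:(NSArray *)connections
                    error:(NSError *)error
    {
        [self uploadChunk:outputFileURL]; // placeholder: your upload code
        // start recording the next chunk (to a fresh URL) here
    }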
