How can live video be streamed from an iOS device to a server? [closed]

隐身守侯 submitted on 2019-12-11 06:18:36

Question


I want to be able to stream live video from an iOS device to a server. I tried to use an AVCaptureOutput that captures each frame as a CMSampleBuffer and appends it using an AVAssetWriter, but I don't know when or how to take the input from the file and send it to the server. How should it be formatted? How do I know when to send it?
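To make the setup concrete, the capture side described above might be wired up roughly like this (a sketch, not a tested implementation; the class name `FrameCapturer` and the omitted AVAssetWriter plumbing are illustrative, while the AVFoundation types and delegate method are real API):

```swift
import AVFoundation

// Sketch of the capture pipeline: an AVCaptureVideoDataOutput delivers each
// frame as a CMSampleBuffer through its delegate callback.
final class FrameCapturer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let captureQueue = DispatchQueue(label: "video.capture.queue")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: captureQueue)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(output) { session.addOutput(output) }
        session.startRunning()
    }

    // Called once per captured frame on captureQueue.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each CMSampleBuffer would be appended to an AVAssetWriterInput here;
        // the open question is when and how to ship the written data onward.
    }
}
```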


Answer 1:


Though I'm not sharing any code, I can share the approach I used in one of my apps.

First way (the easy one): there are plenty of low-cost third-party streaming libraries available for this.

Second way (the hard one): record small chunks of video, for example two seconds or less, keep them in a queue, and upload them to the server. Don't use AFNetworking or plain HTTP requests, as that will slow the process down; use a persistent socket connection instead, for example a Node.js server. Keep a text file or database entry that tracks each chunk file and its sequence number. Once the first chunk is uploaded, use ffmpeg on the server to build a video from it, and as more chunks arrive, append them to the main video file. If you then play that video on a device, no further modification is needed: it will automatically pick up the new part once the file changes on the server.
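The bookkeeping in that second approach can be sketched as follows (all type and property names here are illustrative, not from any library; the recorder and the upload transport are not shown):

```swift
import Foundation

// Minimal sketch of the chunk tracking described above: each ~2 s recording
// gets a sequence number and is queued FIFO, so the server can concatenate
// the chunks with ffmpeg in the right order.
struct Chunk {
    let sequence: Int
    let fileName: String
}

final class ChunkManifest {
    private(set) var pending: [Chunk] = []   // waiting to upload, oldest first
    private(set) var uploaded: [Chunk] = []  // confirmed on the server, in order
    private var nextSequence = 0

    // Called whenever the recorder finishes writing a chunk file.
    func enqueue(fileName: String) -> Chunk {
        let chunk = Chunk(sequence: nextSequence, fileName: fileName)
        nextSequence += 1
        pending.append(chunk)
        return chunk
    }

    // Called when the uploader confirms the oldest pending chunk arrived.
    func markOldestUploaded() -> Chunk? {
        guard !pending.isEmpty else { return nil }
        let chunk = pending.removeFirst()
        uploaded.append(chunk)
        return chunk
    }
}
```

The point of the sequence numbers is that uploads may complete out of order over a flaky connection; the server should only append a chunk to the main file once every earlier sequence number has arrived.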

Thank you. Hope it helps.



Source: https://stackoverflow.com/questions/37960571/how-can-live-video-be-streamed-from-an-ios-device-to-a-server
