I am using AVCaptureSession to capture video and get real-time frames from the iPhone camera, but how can I send them to a server, multiplexing the frames with the sound?
There is a long and a short story to it.
This is the short one: go look at https://github.com/OpenWatch/H264-RTSP-Server-iOS as a starting point.
It is a small and simple project; you can clone it and see how the author extracts each frame.
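The frame-extraction part boils down to attaching a sample-buffer delegate to the capture session. Here is a minimal, hedged sketch (class and queue names are my own, not from the project above) of how raw frames arrive from an AVCaptureSession:

```swift
import AVFoundation

// Illustrative sketch: an AVCaptureSession whose video output delivers
// each raw frame to a delegate callback on a background queue.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "video.frames") // name is arbitrary

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(output) { session.addOutput(output) }
        session.startRunning()
    }

    // Called once per captured frame; the CMSampleBuffer holds the raw pixels.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Encode here (e.g. H.264 via VideoToolbox) before handing off to the network.
    }
}
```

From that callback you would feed the buffer into an encoder rather than send raw pixels, since uncompressed frames are far too large to stream.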
Then you can look at Kickflip, which has a specific callback, "encodedFrame": it is called once an encoded frame arrives, and from that point you can do what you want with it, e.g. send it via a WebSocket. There is also a bunch of very hard code available for reading MPEG atoms.
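Once an encoder callback like Kickflip's "encodedFrame" hands you a Data blob, pushing it over a WebSocket is straightforward. A hedged sketch using Foundation's URLSessionWebSocketTask (the URL and framing are placeholders; a real stream needs its own protocol, e.g. length-prefixed NALUs):

```swift
import Foundation

// Illustrative sketch: forward each encoded frame to a server over a WebSocket.
final class FrameUploader {
    private let task: URLSessionWebSocketTask

    init(url: URL) {
        task = URLSession.shared.webSocketTask(with: url)
        task.resume() // opens the connection
    }

    // Call this from the encoder's callback for each encoded frame.
    func send(encodedFrame: Data) {
        task.send(.data(encodedFrame)) { error in
            if let error = error {
                print("send failed: \(error)")
            }
        }
    }
}
```

Note that WebSockets give you ordered delivery but no built-in audio/video interleaving, so you still need to tag or mux frames and audio packets yourself before sending.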