I am working on live device-to-server streaming in Android. I am able to send the data in bytes to the server, but when I play that file during recording on the server, VLC …
The MP4 format needs the moov atom information to play the video, and to generate the moov atom the recording must be finished. You can't play an MP4 file while it is still being recorded because you don't yet have all the information needed to write the moov atom.
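You can see this for yourself by listing the top-level boxes (atoms) of the file. A minimal, untested sketch with plain Java I/O (only assuming the standard ISO BMFF box layout): on a file that is still being written you will typically see "ftyp" and a growing "mdat" but no "moov" yet, which is why VLC can't start playback.

```java
import java.io.IOException;
import java.io.RandomAccessFile;

public class ListTopLevelAtoms {
    public static void main(String[] args) throws IOException {
        try (RandomAccessFile f = new RandomAccessFile(args[0], "r")) {
            long pos = 0, len = f.length();
            while (pos + 8 <= len) {
                f.seek(pos);
                long size = f.readInt() & 0xFFFFFFFFL;   // 32-bit box size
                byte[] type = new byte[4];
                f.readFully(type);                        // 4-character box type
                if (size == 1) size = f.readLong();       // 64-bit extended size
                else if (size == 0) size = len - pos;     // box runs to end of file
                System.out.println(new String(type, "ISO-8859-1")
                        + " @" + pos + " (" + size + " bytes)");
                if (size < 8) break;                      // malformed / truncated box
                pos += size;
            }
        }
    }
}
```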
What you want to do is some kind of real-time streaming (playing while it is recording), so you need to use other formats. HLS and MPEG-DASH store the video in tiny chunks (2 to 10 seconds) and send them to the users; this way the users play many small finished files one after the other (see the sketch below).
As @Sebastian Annies suggested, creating many tiny MP4 files and concatenating them is the same approach: have tiny finished files and play them as a list. You can get more information here: What exactly is Fragmented mp4(fMP4)? How is it different from normal mp4?
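On the server side you can join the received segments back into one playable MP4 with the mp4parser library (Sebastian Annies' project). This is only an outline from memory; class names and signatures vary a bit between mp4parser versions, so check the version you depend on.

```java
import com.coremedia.iso.boxes.Container;
import com.googlecode.mp4parser.authoring.Movie;
import com.googlecode.mp4parser.authoring.Track;
import com.googlecode.mp4parser.authoring.builder.DefaultMp4Builder;
import com.googlecode.mp4parser.authoring.container.mp4.MovieCreator;
import com.googlecode.mp4parser.authoring.tracks.AppendTrack;

import java.io.FileOutputStream;
import java.util.ArrayList;
import java.util.List;

public class SegmentConcat {
    public static void concat(List<String> segmentPaths, String outPath) throws Exception {
        List<Track> videoTracks = new ArrayList<>();
        List<Track> audioTracks = new ArrayList<>();

        // Collect the tracks of every finished segment.
        for (String path : segmentPaths) {
            Movie m = MovieCreator.build(path);
            for (Track t : m.getTracks()) {
                if ("vide".equals(t.getHandler())) videoTracks.add(t);
                if ("soun".equals(t.getHandler())) audioTracks.add(t);
            }
        }

        // Append the tracks back to back into one movie.
        Movie result = new Movie();
        if (!videoTracks.isEmpty()) result.addTrack(new AppendTrack(videoTracks.toArray(new Track[0])));
        if (!audioTracks.isEmpty()) result.addTrack(new AppendTrack(audioTracks.toArray(new Track[0])));

        // Build the output MP4 -- this is the point where the moov atom is finally written.
        Container out = new DefaultMp4Builder().build(result);
        try (FileOutputStream fos = new FileOutputStream(outPath)) {
            out.writeContainer(fos.getChannel());
        }
    }
}
```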