Decode Android's hardware-encoded H264 camera feed using FFmpeg in real time
Question: I'm trying to use the hardware H264 encoder on Android to create video from the camera, and use FFmpeg to mux in audio (all on the Android phone itself).

What I've accomplished so far is packetizing the H264 video into RTSP packets and decoding it with VLC (over UDP), so I know the video is at least correctly formatted. However, I'm having trouble getting the video data to FFmpeg in a format it can understand. I've tried sending the same RTSP packets to port 5006 on localhost (over UDP),
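As an aside on the setup described above: when FFmpeg receives raw RTP packets over UDP, it has no out-of-band session description, so it typically cannot interpret the stream without an SDP file describing the payload. A minimal sketch of one common approach (the file name `stream.sdp`, port 5006, and payload type 96 here are illustrative assumptions, not values from the question):

```shell
# stream.sdp -- hypothetical session description for the incoming H264/RTP stream
cat > stream.sdp <<'EOF'
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Android H264 stream
c=IN IP4 127.0.0.1
t=0 0
m=video 5006 RTP/AVP 96
a=rtpmap:96 H264/90000
EOF

# Tell ffmpeg to read the RTP stream as described by the SDP file.
# -protocol_whitelist is required in recent FFmpeg builds to allow
# reading a local file that in turn opens UDP/RTP sockets.
ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp -c copy out.mp4
```

With a description like this in place, FFmpeg can demux the RTP payload and, for example, mux an audio input alongside it in the same invocation.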