rtsp

GStreamer RTSP server linking in Qt Creator

Submitted by 别等时光非礼了梦想 on 2019-12-25 08:48:38
Question: I've installed the GStreamer SDK and am trying to compile this code:

    #include <gst/gst.h>
    #include <gst/rtsp-server/rtsp-server.h>

    int main (int argc, char *argv[])
    {
      GMainLoop *loop;
      GstRTSPServer *server;
      GstRTSPMediaMapping *mapping;
      GstRTSPMediaFactory *factory;

      gst_init (&argc, &argv);
      loop = g_main_loop_new (NULL, FALSE);
      server = gst_rtsp_server_new ();
      mapping = gst_rtsp_server_get_media_mapping (server);
      factory = gst_rtsp_media_factory_new ();
      gst_rtsp_media_factory_set_launch (factory, "(
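Compiling needs the gst-rtsp-server headers and linking needs its library; in a qmake project the simplest route is pkg-config. A sketch of the .pro additions, assuming the 0.10-era SDK that this code's GstRTSPMediaMapping API belongs to (for GStreamer 1.x the module names would be gstreamer-1.0 and gstreamer-rtsp-server-1.0, and the mapping API becomes GstRTSPMountPoints):

```
# .pro file: let pkg-config supply include paths and linker flags
CONFIG += link_pkgconfig
PKGCONFIG += gstreamer-0.10 gstreamer-rtsp-0.10
```

After adding this, re-run qmake so the flags are picked up by the generated Makefile.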

J2ME RTSP Video Streams but no Audio

Submitted by 徘徊边缘 on 2019-12-25 05:23:21
Question: I've followed Nokia's wiki on creating a video player with J2ME. The code is essentially:

    player = Manager.createPlayer("rtsp://v1.cache5.c.youtube.com/CjYLENy73wIaLQm8E_KpEOI9cxMYDSANFEIJbXYtZ29vZ2xlSARSBXdhdGNoYLm0hv_ig5HRTww=/0/0/0/video.3gp");
    // A player listener is needed so that we know when the video has reached the END_OF_MEDIA position
    player.addPlayerListener(this);
    player.realize();
    player.prefetch();
    // The duration of the video
    duration = player.getDuration();
    // The

feed raw yuv frame to ffmpeg with timestamp

Submitted by 耗尽温柔 on 2019-12-25 04:01:27
Question: I'm trying to pipe raw audio and video data to ffmpeg and push a real-time stream over the RTSP protocol on Android. The command line looks like this:

    "ffmpeg -re -f image2pipe -vcodec mjpeg -i " + vpipepath
    + " -f s16le -acodec pcm_s16le -ar 8000 -ac 1 -i - "
    + " -vcodec libx264 "
    + " -preset slow -pix_fmt yuv420p -crf 30 -s 160x120 -r 6 -tune film "
    + " -g 6 -keyint_min 6 -bf 16 -b_strategy 1 "
    + " -acodec libopus -ac 1 -ar 48000 -b:a 80k -vbr on -frame_duration 20 "
    + " -compression_level 10
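Concatenating one big command string invites quoting bugs. A sketch of the same invocation built as an argv list instead, which can be handed to a process spawner directly; `vpipepath` comes from the question, and its value here is an assumption:

```python
# Build the ffmpeg command from the question as an argv list, which avoids
# the shell-quoting pitfalls of string concatenation. Truncated trailing
# options from the excerpt are omitted.
vpipepath = "/data/local/tmp/vpipe"  # named pipe carrying MJPEG frames (assumed path)

ffmpeg_cmd = [
    "ffmpeg", "-re",
    # video input: MJPEG frames read from the named pipe
    "-f", "image2pipe", "-vcodec", "mjpeg", "-i", vpipepath,
    # audio input: raw 8 kHz mono PCM on stdin ("-")
    "-f", "s16le", "-acodec", "pcm_s16le", "-ar", "8000", "-ac", "1", "-i", "-",
    # video encode settings
    "-vcodec", "libx264", "-preset", "slow", "-pix_fmt", "yuv420p",
    "-crf", "30", "-s", "160x120", "-r", "6", "-tune", "film",
    "-g", "6", "-keyint_min", "6", "-bf", "16", "-b_strategy", "1",
    # audio encode settings
    "-acodec", "libopus", "-ac", "1", "-ar", "48000", "-b:a", "80k",
]

# Raw audio would then be written to the process's stdin, e.g.:
# import subprocess
# proc = subprocess.Popen(ffmpeg_cmd, stdin=subprocess.PIPE)
```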

Dump RTSP to file like rtmpdump

Submitted by 拈花ヽ惹草 on 2019-12-25 02:50:56
Question: How do I dump an RTSP stream to a file? For RTMP I can do this:

    rtmpdump --quiet --start=0 --stop=10 --rtmp=[Path to stream] --flv=dump.f4v

I need to do the same for RTSP. I'm on OS X and have access to VLC, Python, and ffmpeg. I only need to save a small 10-second sample of the stream to test a server.

Answer 1: It is not useful to "dump" an "RTSP stream" to a file. RTSP is a bidirectional conversation between a client and the server. The byte content changes every time it is run, so you can't replay
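The answer's caveat applies to the RTSP control conversation itself, but the media payload it negotiates can still be captured. A hedged sketch using ffmpeg (which the asker has available), remuxing ten seconds without re-encoding; the URL and output name are placeholders:

```python
# Build an ffmpeg command that saves a short sample of an RTSP stream.
# "-c copy" remuxes the received packets without re-encoding, so the
# capture is cheap and preserves the original codec data.
def rtsp_dump_cmd(url, seconds=10, out="dump.mp4"):
    return [
        "ffmpeg",
        "-rtsp_transport", "tcp",  # TCP-interleaved RTP avoids UDP packet loss
        "-i", url,                 # placeholder RTSP URL goes here
        "-t", str(seconds),        # stop after `seconds` of output
        "-c", "copy",              # remux only; keep original codecs
        out,
    ]

cmd = rtsp_dump_cmd("rtsp://example.com/stream")
```

The resulting list can be passed to `subprocess.run(cmd)` on OS X as-is.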

OpenCv + Gstreamer from an app, getting initial 30s delay

Submitted by |▌冷眼眸甩不掉的悲伤 on 2019-12-24 18:56:42
Question: My application exposes an RTP stream using new VideoWriter(pipeline-definition); in OpenCV. The pipeline definition is:

    appsrc is-live=1 do-timestamp=1 format=3 stream-type=0 min-latency=0 max-latency=500000000
    ! queue leaky=2 max-size-time=500000000 ! videoconvert ! video/x-raw ! x264enc
    ! h264parse ! rtph264pay config-interval=10 pt=96 ! udpsink host=127.0.0.1 port=9000

The problem I'm facing is a 30-second delay in the stream when viewing it in VLC. No matter what I do, VLC is always 29-30 seconds behind

RTSP Frame Grabbing creates smeared , pixeled and corrupted images

Submitted by 六眼飞鱼酱① on 2019-12-24 14:39:12
Question: I am trying to capture a single frame per second from an RTSP stream with the following command:

    ffmpeg -i rtsp://XXX -q:v 1 -vf fps=fps=1 -strftime 1 ZZZZ\%H_%M_%S.jpg

But some of the frames are smeared, pixelated, and corrupted; this effect increases drastically as the RTSP resolution increases (if the resolution is decreased, for example to 720p, most of the frames are OK). I should add that playing the same RTSP stream in VLC or ffplay is flawless. How can I fix this to grab better-quality frames? Thanks in advance.
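Smearing that worsens with resolution is a classic symptom of UDP packet loss: the stream's bitrate grows with resolution and dropped RTP packets corrupt whole slices of a frame. A sketch of the same grab with RTP forced over TCP, built as an argv list; the URL placeholder and filename pattern are taken from the question:

```python
# The same frame-grab command with -rtsp_transport tcp added before -i,
# which requests TCP-interleaved RTP and eliminates UDP packet loss.
# (The backslash in the shell pattern ZZZZ\%H... is shell escaping and is
# not needed when arguments are passed as a list.)
grab_cmd = [
    "ffmpeg",
    "-rtsp_transport", "tcp",  # must appear before the -i it applies to
    "-i", "rtsp://XXX",        # stream URL placeholder as in the question
    "-q:v", "1",               # highest JPEG quality
    "-vf", "fps=1",            # emit one frame per second
    "-strftime", "1",          # expand %H/%M/%S in the output name
    "ZZZZ%H_%M_%S.jpg",
]
```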

Android LibVLC options do not work

Submitted by 谁都会走 on 2019-12-24 12:07:00
Question: I am working on streaming RTSP using LibVLC. I have it working so that I can view the stream, but the latency is at the default of about 2 seconds. On the Ubuntu desktop I can launch VLC with the following options to greatly improve the latency:

    $ vlc -vvv rtsp://192.168.2.1:1234 --network-caching=50 --clock-jitter=0 --clock-synchro=0

However, when I add these options to LibVLC, there is no positive effect:

    ArrayList<String> options = new ArrayList<>();
    options.add("-vvv");
    options.add("--network

How to integrate Live555 in XCode (iOS SDK)

Submitted by 烈酒焚心 on 2019-12-24 09:50:04
Question: I have to implement live streaming from an iPhone to a Wowza server using RTSP/H.264. I searched and found the Live555 library. I built the .a files along with the include headers, but I am not able to use them in Xcode: when I do, I get errors about the C++ keyword "class" not being understood, probably because of the .hh header files. Does anyone have an idea how to include Live555 in an iOS application? Thanks in advance...

Source: https://stackoverflow.com/questions/19142363/how-to-integrate