mpeg

What is fixed_frame_rate_flag used for in encoders?

百般思念 submitted on 2019-12-11 15:54:52
Question: There is a flag called fixed_frame_rate_flag used in encoders. I haven't been able to find any clear explanation of what it is or what its significance is. The flag can be set to either 0 or 1, but I don't know what this implies. Any help or information would be appreciated.

Answer 1: Fixed frame rate is the opposite of variable frame rate. If a video is fixed frame rate, every frame has a predictable timestamp: timestamp = frame_number / frame_rate. If it's variable, every frame has its…
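The relationship the answer describes can be sketched in a few lines. This is an illustrative calculation, not encoder code; the 25 fps rate is an assumption, and the 90 kHz clock is the timebase MPEG/H.264 use for PTS values.

```python
# Sketch: with fixed_frame_rate_flag set, presentation timestamps are
# predictable from the frame index alone.
def cfr_timestamp(frame_number: int, frame_rate: float) -> float:
    """Presentation time in seconds for a fixed-frame-rate stream."""
    return frame_number / frame_rate

def cfr_pts_90khz(frame_number: int, frame_rate: float) -> int:
    """The same timestamp expressed in MPEG's 90 kHz clock ticks."""
    return round(frame_number * 90000 / frame_rate)

print(cfr_timestamp(50, 25.0))   # frame 50 at 25 fps -> 2.0 seconds
print(cfr_pts_90khz(50, 25.0))   # -> 180000 ticks
```

With a variable frame rate, no such closed-form mapping exists, which is why each frame must then carry its own timestamp.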

aacplus frame alignment problems

 ̄綄美尐妖づ submitted on 2019-12-11 14:04:07
Question: I have an application that rips AAC+ audio streams, cutting them at regular intervals (e.g. every 10 minutes). Sometimes the files play back fine, but sometimes Windows Media Player just closes when trying to build the DirectShow graph. I am using the Orban aacplus plugin, which works under DirectShow. When I play such a file with Winamp or VLC, which have their own aacplus decoding engines, it works fine. However, I need it to work under DirectShow. Anyway, the problematic file is here: http://www.videophill…
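The title suggests the cuts are landing mid-frame. AAC+ over streaming is commonly ADTS-framed, and a cut that does not start on an ADTS syncword can confuse stricter decoders. A minimal sketch of aligning a cut point to the next frame boundary, assuming ADTS framing (the 12-bit 0xFFF sync at the start of every frame):

```python
def next_adts_frame(data: bytes, start: int = 0) -> int:
    """Offset of the next ADTS syncword (12 bits of 0xFFF) at or after
    `start`, or -1 if none is found in the buffer."""
    for i in range(start, len(data) - 1):
        if data[i] == 0xFF and (data[i + 1] & 0xF0) == 0xF0:
            return i
    return -1

def adts_frame_length(data: bytes, offset: int) -> int:
    """The 13-bit aac_frame_length field, spread over header bytes 3..5."""
    return ((data[offset + 3] & 0x03) << 11) \
        | (data[offset + 4] << 3) \
        | (data[offset + 5] >> 5)
```

Cutting only at offsets returned by `next_adts_frame` keeps each file starting with a complete frame. (A lone syncword match can be a false positive inside payload data; a robust splitter would verify that the frame length points at another syncword.)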

getting mp3 audio signal as array in java

冷暖自知 submitted on 2019-12-11 12:12:21
Question: I have been trying to get the audio stream of an MP3 file as an array of floating-point values, and I have obtained such an array with the sample code below. I am not sure whether I can use this array for applying an FFT, because it does not match (or even resemble) the one I got from C++ code that uses LAME. import java.io.File; import java.io.IOException; import java.io.PrintStream; import javax.sound.sampled.AudioFormat; import javax.sound.sampled.AudioInputStream; import javax.sound.sampled.AudioSystem; import…
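A common source of mismatch in this situation is the PCM-to-float conversion step, not the decoder itself: sample width, endianness, and normalization must all agree between the two implementations. A sketch of the conversion for signed 16-bit PCM (the usual `javax.sound.sampled` decoded format), shown here in Python for illustration:

```python
import struct

def pcm16_to_floats(pcm: bytes, big_endian: bool = False) -> list:
    """Convert signed 16-bit PCM samples to floats in [-1.0, 1.0).
    The endianness flag must match AudioFormat.isBigEndian() on the
    Java side, or every sample comes out scrambled."""
    fmt = ('>' if big_endian else '<') + '%dh' % (len(pcm) // 2)
    return [s / 32768.0 for s in struct.unpack(fmt, pcm)]
```

If the Java array and the LAME array differ by a constant scale factor or have byte-swapped values, this step (rather than the FFT input) is the first thing to check.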

How can I concatenate ATSC streams from DVB card?

南楼画角 submitted on 2019-12-11 08:09:53
Question: I'm trying to make a simple "TV viewer" using a Linux DVB video capture card. Currently I watch TV using the following process (I'm on a Raspberry Pi): Tune to a channel using azap -r TV_CHANNEL_HERE; this supplies bytes to the device /dev/dvb/adapter0/dvr0. Open OMXPlayer with omxplayer /dev/dvb/adapter0/dvr0. Watch TV! The problem comes when I try to change channels. Even if I set the player to cache incoming bytes (tried with MPlayer also), the player can't withstand a channel change (by…
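One reason players fail across a channel change is loss of transport-stream packet alignment: after the retune, the byte stream no longer starts on a 188-byte packet boundary. A hedged sketch of resynchronizing on the 0x47 sync byte before handing data to a player, assuming plain 188-byte TS packets (not the 204-byte DVB variant):

```python
TS_PACKET_SIZE = 188  # standard MPEG-TS packet size

def resync_ts(buf: bytes, probes: int = 3) -> int:
    """Find the first offset where `probes` consecutive packets all start
    with the 0x47 sync byte; returns -1 if no alignment is found.
    Checking several packets avoids locking onto a stray 0x47 in payload."""
    need = probes * TS_PACKET_SIZE
    for off in range(len(buf) - need + 1):
        if all(buf[off + k * TS_PACKET_SIZE] == 0x47 for k in range(probes)):
            return off
    return -1
```

A relay process that drops bytes up to the resync offset after each retune gives the downstream player a clean packet boundary to restart from (the player still has to cope with new PAT/PMT tables, which concatenation alone does not fix).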

MPEG-TS file audio and video frame length

五迷三道 submitted on 2019-12-11 02:17:28
Question: Can I discover the byte length of each video (I, P, B) and audio frame in an MPEG-2 TS file without reading the frame content byte by byte (from the NAL unit header, for example)?

Answer 1: In MPEG-TS files, in the PES layer, there is a field called "PES length" which covers all the data but not the PES headers. To extract this information accurately you must determine whether there is one PES header per frame or not. Alas, for video this field is often left 0, and the only true way to count the bytes per frame is…
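Reading the field the answer refers to can be sketched as follows. The 16-bit PES_packet_length sits at bytes 4..5 of the PES header, after the 3-byte start-code prefix and the stream_id byte, and counts the bytes that follow it:

```python
def pes_packet_length(pes: bytes) -> int:
    """Read PES_packet_length from the start of a PES packet.
    Layout: 00 00 01 | stream_id | 16-bit length.
    A value of 0 means 'unbounded', which, as the answer notes,
    is what video streams typically use."""
    if pes[0:3] != b'\x00\x00\x01':
        raise ValueError("not a PES start code")
    return (pes[4] << 8) | pes[5]
```

So for audio (where the field is usually filled in) this gives per-frame sizes cheaply, while for video with length 0 you are back to scanning for the next PES start code to delimit the frame.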

HTTP Live Streaming MPEG TS segment

烂漫一生 submitted on 2019-12-10 17:15:42
Question: I'm using FFmpeg and a free segmenter (Carson McDonald's) to produce my TS segments, which I later save to a web server and play with QuickTime by opening the .m3u8. If I have segments 1, 2, 3, 4 and another stream with segments 1, 2, 3, 4, and I would like to interleave them, what should I do? If I splice them as 1, 2, other stream's 3, 4, it works fine. However, if I splice them as 1, 2, other stream's 4, 4, it doesn't work. In other words, the segment to be added must have the same number as the…
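The HLS playlist format has a dedicated tag for exactly this splice: EXT-X-DISCONTINUITY tells the player that the next segment's timestamps and numbering do not continue from the previous one. A sketch of emitting such a playlist (segment names and durations here are placeholders):

```python
def write_playlist(segments, discontinuity_before=()):
    """Build a minimal HLS playlist. `discontinuity_before` lists the
    indices of segments that come from a different stream, so the
    player resets its timeline there instead of rejecting the splice."""
    lines = ['#EXTM3U', '#EXT-X-VERSION:3',
             '#EXT-X-TARGETDURATION:10', '#EXT-X-MEDIA-SEQUENCE:1']
    for i, name in enumerate(segments):
        if i in discontinuity_before:
            lines.append('#EXT-X-DISCONTINUITY')
        lines.append('#EXTINF:10.0,')
        lines.append(name)
    lines.append('#EXT-X-ENDLIST')
    return '\n'.join(lines)

# e.g. splicing the other stream's segment 4 after our 1 and 2:
print(write_playlist(['seg1.ts', 'seg2.ts', 'other_seg4.ts'],
                     discontinuity_before={2}))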

How to get Pipeline created by playbin in textual format in Gstreamer?

浪尽此生 submitted on 2019-12-09 18:43:32
Question: I'm playing a transport stream file (*.ts) using the following pipeline: gst-launch-0.10 playbin2 uri=file:///c:/bbb.ts But I need to build that pipeline myself, and I'm not sure how to achieve this. So far I have tried (works fine): gst-launch-0.10 -v filesrc location=c:/bbb.ts ! tsdemux ! audio/x-ac3 ! fakesink But if I replace fakesink with autoaudiosink, it fails with a not-linked error. And even the fakesink doesn't work for video: gst-launch-0.10 -v filesrc location=c:/bbb.ts !…
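The not-linked error is expected: tsdemux outputs compressed AC-3, and autoaudiosink only accepts raw audio, so a decoder and converter must sit between them (fakesink accepts anything, which is why it "worked"). A hypothetical expansion of the audio branch, built as a command list; the decoder element name (`ffdec_ac3`, from the gst-ffmpeg plugin set) is an assumption and depends on which 0.10 plugins are installed (`a52dec` is another common choice):

```python
def ts_audio_pipeline(path: str) -> list:
    """Sketch of an explicit gst-launch-0.10 command for the audio branch
    of a TS file: demux -> queue -> AC-3 decoder -> convert -> sink."""
    return ['gst-launch-0.10', '-v',
            'filesrc', 'location=%s' % path, '!',
            'tsdemux', '!', 'queue', '!',
            'ffdec_ac3', '!', 'audioconvert', '!', 'autoaudiosink']

print(' '.join(ts_audio_pipeline('c:/bbb.ts')))
```

For the definitive answer, `gst-launch-0.10 -v playbin2 uri=...` prints the elements playbin2 actually instantiated, which can then be copied into an explicit pipeline.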

C# better compression for remote desktop broadcast application

只愿长相守 submitted on 2019-12-09 13:53:04
Question: I am in the process of creating a TCP remote desktop broadcasting application (something like TeamViewer or VNC). The server application will: 1. run on a PC, listening for multiple clients on one thread; 2. on another thread, capture the desktop every second; 3. broadcast the desktop to each connected client. I need this application to work over a DSL connection with 12 KB/s upload and 50 KB/s download (on both client and server), so I have to reduce…
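The standard trick at these bitrates is to send frame deltas rather than full screenshots: between two captures a second apart, most pixels are unchanged, and a diff against the previous frame is almost entirely zeros, which deflate compresses extremely well. A sketch over raw framebuffer bytes (real VNC-style codecs also tile the screen and skip unchanged tiles):

```python
import zlib

def delta_compress(prev: bytes, cur: bytes) -> bytes:
    """XOR the new frame against the previous one, then deflate.
    Unchanged pixels XOR to zero, so static regions cost almost nothing."""
    diff = bytes(a ^ b for a, b in zip(prev, cur))
    return zlib.compress(diff, 9)

def delta_decompress(prev: bytes, payload: bytes) -> bytes:
    """Reverse: inflate the diff and XOR it back onto the previous frame."""
    diff = zlib.decompress(payload)
    return bytes(a ^ b for a, b in zip(prev, diff))
```

The first frame still has to be sent whole (compressed), and both sides must keep the last frame in sync, which means a reliable transport like the TCP connection already in use.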

Is there a way to play mpeg videos in HTML5?

放肆的年华 submitted on 2019-12-06 19:25:23
Question: My PC-based web application uses HTML5, and I want to import MPEG files, saved that way by another application, to play in my browser. Is there a way to play these video files with HTML5? EDIT: The application plays the MPEG files from the local hard drive rather than from a server, so the user has the ability to choose which MPEG files to play. HTML: <input id="t20provideoinput" type="file" multiple accept="video/*"/> <video id="t20provideo" controls…
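Browsers generally do not play MPEG-1/MPEG-2 program streams in the `<video>` element, so the usual answer is to transcode to a supported codec/container pair such as H.264/AAC in MP4 before handing the file to the page. An illustrative ffmpeg invocation built as a command list; the input and output paths are placeholders:

```python
def to_mp4_cmd(src: str, dst: str) -> list:
    """Sketch: transcode an MPEG file to H.264/AAC in MP4 for HTML5
    playback. '+faststart' moves the moov atom to the front so the
    video can start playing before the whole file is downloaded."""
    return ['ffmpeg', '-i', src,
            '-c:v', 'libx264', '-c:a', 'aac',
            '-movflags', '+faststart',
            dst]

print(' '.join(to_mp4_cmd('input.mpg', 'output.mp4')))
```

Once converted, the existing `<input type="file">` plus `URL.createObjectURL` approach can feed the MP4 to the `<video>` element directly from the local drive.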

Php output continuous stream

痴心易碎 submitted on 2019-12-06 11:06:14
Question: I've been experimenting with PHP's fopen function and have had a lot of success reading sound files. Now I'm trying to create a sort of "relay" for SHOUTcast streams. file_get_contents is probably the poorest way of doing this, since the data is continuous, so would using PHP sockets yield better results? TL;DR: What's the best way to output a continuous stream of audio/mpeg data?

Answer 1: I've done this with PHP and SHOUTcast streams in the past. It's certainly possible, but keep in mind…
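The core of such a relay is the same in any language: never buffer the whole stream, just read a small chunk and forward it immediately. A language-neutral sketch of that loop (shown in Python; in PHP the equivalent is an `fread()` + `echo` + `flush()` loop over a socket opened with `fopen` or `fsockopen`):

```python
def relay(source, chunk_size: int = 4096):
    """Read a continuous audio/mpeg stream in small chunks and yield each
    one immediately, so the client hears audio while the upstream
    connection is still open. An empty read means the stream ended."""
    while True:
        chunk = source.read(chunk_size)
        if not chunk:
            break
        yield chunk
```

The reason file_get_contents fails here is that it tries to read until EOF, and a live SHOUTcast stream has no EOF; the chunked loop above (or PHP's stream functions used the same way) is the workable shape.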