ms-media-foundation

Setting larger GOP size in MediaFoundation hardware MFT

天大地大妈咪最大 submitted on 2019-12-22 10:09:36
Question: I'm trying to live-stream the desktop captured through the Desktop Duplication API. H.264 encoding works fine, except that the Desktop Duplication API delivers frames only when the screen changes, while video encoders expect frames at a constant frame rate. So I'm forced to re-feed the previously saved sample to the encoder at a constant rate whenever no screen change is triggered. This works; I can see live output at the other end. One problem though, the
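The core technique described above, re-submitting the last captured sample so the encoder still sees a constant frame rate, could look roughly like the following. This is a minimal sketch only; the names WriteFrame and g_pLastSample are hypothetical, a fixed 30 fps is assumed, and error handling is omitted.

#include <mfapi.h>
#include <mfidl.h>
#include <mfreadwrite.h>

static IMFSample* g_pLastSample = nullptr;                 // most recent desktop frame
static const LONGLONG FRAME_DURATION = 10000000 / 30;      // 30 fps in 100-ns units

HRESULT WriteFrame(IMFSinkWriter* pWriter, DWORD streamIndex,
                   IMFSample* pNewSample, LONGLONG llTimeStamp)
{
    IMFSample* pSample = pNewSample ? pNewSample : g_pLastSample;
    if (!pSample)
        return S_FALSE;                  // nothing captured yet, skip this tick

    // Every sample must carry a fresh timestamp/duration, even when the
    // underlying pixels are the same as the previous frame.
    pSample->SetSampleTime(llTimeStamp);
    pSample->SetSampleDuration(FRAME_DURATION);

    HRESULT hr = pWriter->WriteSample(streamIndex, pSample);

    if (pNewSample)                      // keep the latest real frame for reuse
    {
        if (g_pLastSample) g_pLastSample->Release();
        g_pLastSample = pNewSample;
        g_pLastSample->AddRef();
    }
    return hr;
}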

How to grab constant stream of bitmap images from webcam in c#

青春壹個敷衍的年華 submitted on 2019-12-22 08:16:35
Question: We have a C# application that performs processing on video streams. This is a low-level application that receives each frame in Bitmap format, so basically we need 25 images each second. The application already works for some of our media sources, but we now need to add a webcam as an input device. So we basically need to capture bitmap images from a webcam continuously so that we can pass all these frames as a "stream" to our application. What is the best and simplest way to access the
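The underlying Media Foundation approach, which a C# application would typically reach through a wrapper library, is a source-reader loop that blocks on each frame. A minimal C++ sketch follows; the choice of the first video stream and the assumption that a contiguous buffer is acceptable are for illustration only.

#include <mfapi.h>
#include <mfidl.h>
#include <mfreadwrite.h>

HRESULT CaptureLoop(IMFSourceReader* pReader)
{
    for (;;)
    {
        DWORD streamFlags = 0;
        LONGLONG llTimeStamp = 0;
        IMFSample* pSample = nullptr;

        // Blocks until the camera delivers the next frame (e.g. ~25 per second).
        HRESULT hr = pReader->ReadSample(
            (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,
            0, nullptr, &streamFlags, &llTimeStamp, &pSample);
        if (FAILED(hr) || (streamFlags & MF_SOURCE_READERF_ENDOFSTREAM))
            return hr;

        if (pSample)
        {
            IMFMediaBuffer* pBuffer = nullptr;
            if (SUCCEEDED(pSample->ConvertToContiguousBuffer(&pBuffer)))
            {
                BYTE* pData = nullptr;
                DWORD cbData = 0;
                pBuffer->Lock(&pData, nullptr, &cbData);
                // pData now holds one raw frame; hand it to the processing pipeline.
                pBuffer->Unlock();
                pBuffer->Release();
            }
            pSample->Release();
        }
    }
}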

decode MPEG1/2 video with Media Foundation

前提是你 submitted on 2019-12-21 22:09:51
Question: I am using Media Foundation to play videos. On Windows 7, some videos are encoded as MPEG-1/MPEG-2 PS; Windows Media Player can play them, but Media Foundation cannot (I tried to register a stub MPEG1Source, but it still does not work). I noticed some of these files can be played with DirectShow but not with Media Foundation (they cannot be opened at all). I guess Windows Media Player does not rely on Media Foundation alone? Is it possible to use Media Foundation to play these files? If it is not, how does Media
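To narrow down whether the failure happens at the media-source level or later at the decoder level, one option is to probe the file with the source resolver. A hedged sketch, assuming a file URL; a success here only means a source could be created, not that decoders for the streams are available.

#include <mfapi.h>
#include <mfidl.h>

HRESULT CanMediaFoundationOpen(PCWSTR url)
{
    IMFSourceResolver* pResolver = nullptr;
    HRESULT hr = MFCreateSourceResolver(&pResolver);
    if (FAILED(hr)) return hr;

    MF_OBJECT_TYPE objectType = MF_OBJECT_INVALID;
    IUnknown* pSource = nullptr;
    hr = pResolver->CreateObjectFromURL(
        url, MF_RESOLUTION_MEDIASOURCE, nullptr, &objectType, &pSource);

    if (pSource) pSource->Release();
    pResolver->Release();
    return hr;   // a failure here indicates no suitable media source was found
}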

MFCreateFMPEG4MediaSink does not generate MSE-compatible MP4

拟墨画扇 submitted on 2019-12-21 22:00:38
Question: I'm attempting to stream an H.264 video feed to a web browser. Media Foundation is used for encoding a fragmented MPEG-4 stream (MFCreateFMPEG4MediaSink with MFTranscodeContainerType_FMPEG4, MF_LOW_LATENCY and MF_READWRITE_ENABLE_HARDWARE_TRANSFORMS enabled). The stream is then connected to a web server through IMFByteStream. Streaming of the H.264 video works fine when it's consumed by a <video src=".."/> tag. However, the resulting latency is ~2 seconds, which is too much for the
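For reference, one way of wiring up the attributes mentioned above is through a sink writer created over the custom byte stream. A minimal sketch, assuming pByteStream is the stream that feeds the web server; this illustrates the attribute setup only, not the full streaming pipeline from the question.

#include <mfapi.h>
#include <mfidl.h>
#include <mfreadwrite.h>

HRESULT CreateFragmentedMp4Writer(IMFByteStream* pByteStream,
                                  IMFSinkWriter** ppWriter)
{
    IMFAttributes* pAttrs = nullptr;
    HRESULT hr = MFCreateAttributes(&pAttrs, 3);
    if (FAILED(hr)) return hr;

    // Fragmented MPEG-4 container, low-latency mode, hardware MFTs allowed.
    pAttrs->SetGUID(MF_TRANSCODE_CONTAINERTYPE, MFTranscodeContainerType_FMPEG4);
    pAttrs->SetUINT32(MF_LOW_LATENCY, TRUE);
    pAttrs->SetUINT32(MF_READWRITE_ENABLE_HARDWARE_TRANSFORMS, TRUE);

    hr = MFCreateSinkWriterFromURL(nullptr, pByteStream, pAttrs, ppWriter);
    pAttrs->Release();
    return hr;
}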

Using Media Foundation to encode Direct X surfaces

只愿长相守 submitted on 2019-12-20 11:32:12
Question: I'm trying to use the Media Foundation API to encode a video, but I'm having problems pushing the samples to the SinkWriter. I'm getting the frames to encode through the Desktop Duplication API; what I end up with is an ID3D11Texture2D with the desktop image in it. I'm trying to create an IMFVideoSample containing this surface and then push that video sample to a SinkWriter. I've tried going about this in different ways: I called MFCreateVideoSampleFromSurface(texture, &pSample) where texture
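A common way to hand a GPU texture to a sink writer or hardware MFT is to wrap the ID3D11Texture2D in a DXGI surface buffer and attach that buffer to a plain IMFSample. A minimal sketch, with timestamps and duration as placeholders:

#include <d3d11.h>
#include <mfapi.h>
#include <mfidl.h>

HRESULT CreateSampleFromTexture(ID3D11Texture2D* pTexture,
                                LONGLONG llTime, LONGLONG llDuration,
                                IMFSample** ppSample)
{
    IMFMediaBuffer* pBuffer = nullptr;
    HRESULT hr = MFCreateDXGISurfaceBuffer(
        __uuidof(ID3D11Texture2D), pTexture, 0 /*subresource*/, FALSE, &pBuffer);
    if (FAILED(hr)) return hr;

    IMFSample* pSample = nullptr;
    hr = MFCreateSample(&pSample);
    if (SUCCEEDED(hr))
    {
        pSample->AddBuffer(pBuffer);
        pSample->SetSampleTime(llTime);
        pSample->SetSampleDuration(llDuration);
        *ppSample = pSample;             // caller releases
    }
    pBuffer->Release();
    return hr;
}

For the sink writer to accept GPU-backed samples, an IMFDXGIDeviceManager usually also has to be supplied via the MF_SINK_WRITER_D3D_MANAGER attribute when the writer is created.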

Custom virtual video capture device

不想你离开。 submitted on 2019-12-20 05:45:08
Question: I'm new to Media Foundation and C++, but I want to create a virtual video capture device that can be used by Microsoft Expression Encoder. Can you tell me in which direction to look? I think it should be something that works asynchronously, and the source will be a byte stream from a mobile device. Thanks in advance. Answer 1: I don't think you want to look into Media Foundation for this. Expression Encoder uses a richer API to capture video with: DirectShow. You want a virtual DirectShow camera, which

IMFActivate::ActivateObject return error code “CoInitialize has not been called.”

萝らか妹 submitted on 2019-12-20 03:07:56
Question: I'm writing a simple multimedia application in Visual Studio 2013, and I need to enumerate the camera devices connected to my computer and create a media source object linked to one of them. I use the Media Foundation SDK and tried to follow the guide here: https://msdn.microsoft.com/en-us/library/windows/desktop/dd940326(v=vs.85).aspx : #include <Mfapi.h> #include <mfidl.h> #include <mfobjects.h> #include <iostream> #pragma comment(lib, "Mfplat") #pragma comment(lib, "Mf") template <class T> void
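The error in the title ("CoInitialize has not been called") generally means COM was never initialized on the thread that calls IMFActivate::ActivateObject. A minimal sketch of the expected initialization order; the device enumeration itself is elided and only indicated in a comment.

#include <objbase.h>
#include <mfapi.h>

int main()
{
    // Initialize COM on this thread before touching any Media Foundation objects.
    HRESULT hr = CoInitializeEx(nullptr, COINIT_APARTMENTTHREADED);
    if (SUCCEEDED(hr))
    {
        hr = MFStartup(MF_VERSION);
        if (SUCCEEDED(hr))
        {
            // ... enumerate devices with MFEnumDeviceSources and call
            //     IMFActivate::ActivateObject here ...
            MFShutdown();
        }
        CoUninitialize();
    }
    return 0;
}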

MFT Encoder (h264) High CPU utilization

◇◆丶佛笑我妖孽 submitted on 2019-12-20 02:37:05
Question: I am able to successfully encode data as H.264 using a Media Foundation Transform (MFT), but unfortunately I get very high CPU usage (when I comment out the call to this function in the program, the CPU usage is low). Only a few steps are needed to get the encoding, so is there anything I can do to improve it? Any idea can help. HRESULT MFTransform::EncodeSample(IMFSample *videosample, LONGLONG llVideoTimeStamp, MFT_OUTPUT_STREAM_INFO &StreamInfo, MFT_OUTPUT_DATA_BUFFER &encDataBuffer) { HRESULT hr; LONGLONG
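High CPU usage during H.264 encoding is often a sign that a software encoder was selected rather than a hardware one; whether that is the cause here is an assumption, since the question does not say which MFT is in use. A sketch of probing for a hardware H.264 encoder MFT:

#include <objbase.h>
#include <mfapi.h>
#include <mfidl.h>
#include <mftransform.h>

bool HardwareH264EncoderAvailable()
{
    MFT_REGISTER_TYPE_INFO outType = { MFMediaType_Video, MFVideoFormat_H264 };
    IMFActivate** ppActivate = nullptr;
    UINT32 count = 0;

    // Ask only for hardware encoders that can produce H.264 output.
    HRESULT hr = MFTEnumEx(MFT_CATEGORY_VIDEO_ENCODER,
                           MFT_ENUM_FLAG_HARDWARE | MFT_ENUM_FLAG_SORTANDFILTER,
                           nullptr,      // any input type
                           &outType,     // H.264 output
                           &ppActivate, &count);

    bool found = SUCCEEDED(hr) && count > 0;
    for (UINT32 i = 0; i < count; ++i)
        ppActivate[i]->Release();
    CoTaskMemFree(ppActivate);
    return found;
}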

Get all supported FPS values of a camera in Microsoft Media Foundation

江枫思渺然 submitted on 2019-12-19 03:46:00
Question: I want to get a list of all FPS values that my webcam supports. The MSDN article "How to Set the Video Capture Frame Rate" says I can query the system for the maximum and minimum supported FPS of a particular camera, and also notes: "The device might support other frame rates within this range." The documentation for MF_MT_FRAME_RATE_RANGE_MIN adds: "The device is not guaranteed to support every increment within this range." So it sounds like there is no way to get all of the supported FPS values by the
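Beyond the min/max range attributes, the discrete frame rates a camera actually advertises can be read by enumerating the device's native media types and inspecting MF_MT_FRAME_RATE on each. A sketch, assuming pReader is a source reader created from the capture device; note this only lists what the device advertises, which is exactly the limitation the question points out.

#include <mfapi.h>
#include <mfidl.h>
#include <mfreadwrite.h>
#include <mferror.h>
#include <cstdio>

void ListAdvertisedFrameRates(IMFSourceReader* pReader)
{
    for (DWORD i = 0; ; ++i)
    {
        IMFMediaType* pType = nullptr;
        HRESULT hr = pReader->GetNativeMediaType(
            (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, i, &pType);
        if (hr == MF_E_NO_MORE_TYPES || FAILED(hr))
            break;                               // no more native formats

        UINT32 num = 0, den = 0;
        if (SUCCEEDED(MFGetAttributeRatio(pType, MF_MT_FRAME_RATE, &num, &den)) && den)
            printf("format %lu: %.2f fps\n", i, (double)num / den);

        pType->Release();
    }
}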