How can I output an HDMI 1.4a-compatible stereoscopic signal from an OpenGL application to a 3DTV?


I don't see a direct answer to the question.

HDMI 1.4a defines metadata, carried in the HDMI Vendor Specific InfoFrame, to describe the 3D format. An HDMI_Video_Format value of 010 signals that a 3D format is present, and the 3D_Structure field then selects which one: 0000 is frame packing, 0110 is top-and-bottom, and 1000 is side-by-side (half).
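For reference, here is a minimal sketch of how those fields are packed into the InfoFrame payload bytes, assuming the HDMI 1.4a packet layout. Note that only a display driver or HDMI-transmitter driver can actually transmit this packet; an application cannot send it directly.

```c
#include <stdint.h>

/* 3D_Structure values from the HDMI 1.4a Vendor Specific InfoFrame. */
enum hdmi_3d_structure {
    HDMI_3D_FRAME_PACKING = 0x0,  /* 0000 */
    HDMI_3D_TOP_BOTTOM    = 0x6,  /* 0110 */
    HDMI_3D_SIDE_BY_SIDE  = 0x8,  /* 1000 (half) */
};

/* Fill the first five payload bytes (PB1..PB5) of the InfoFrame. */
static void fill_hdmi_vsi_3d(uint8_t pb[5], enum hdmi_3d_structure s)
{
    pb[0] = 0x03;              /* PB1..PB3: HDMI IEEE OUI 0x000C03, LSB first */
    pb[1] = 0x0C;
    pb[2] = 0x00;
    pb[3] = 0x2 << 5;          /* PB4: HDMI_Video_Format = 010 -> 3D present */
    pb[4] = (uint8_t)(s << 4); /* PB5: 3D_Structure in the upper nibble */
}
```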

But if the driver doesn't expose an API for that, you would need to change the driver's code (assuming it's open source, or that you otherwise have access to it).

If your drivers allow it, you can create a quad-buffer stereo rendering context. This context has two back buffers and two front buffers, one pair for the left eye and one pair for the right. You render to one back buffer (GL_BACK_LEFT), then the other (GL_BACK_RIGHT), then swap them with the standard swap function.
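Assuming you get such a context, one frame of the render loop might look like the sketch below. Here `drawScene()`, `leftEyeView`, and `rightEyeView` are hypothetical placeholders for your own scene code, and the swap uses WGL:

```c
#include <windows.h>
#include <GL/gl.h>

/* Hypothetical placeholders for your own scene code. */
extern const float leftEyeView[16], rightEyeView[16];
extern void drawScene(const float *viewMatrix);

/* One frame of quad-buffered stereo: draw each eye into its own
 * back buffer, then present both with a single swap. */
void renderStereoFrame(HDC hdc)
{
    glDrawBuffer(GL_BACK_LEFT);                          /* left eye */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawScene(leftEyeView);

    glDrawBuffer(GL_BACK_RIGHT);                         /* right eye */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawScene(rightEyeView);

    SwapBuffers(hdc);  /* swaps both left and right buffers together */
}
```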

Creating a quad-buffer stereo context requires platform-specific code. If you're on Windows, you need to pick a pixel format that requests quad-buffering, as in the sketch below.
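A sketch of that selection with the classic WGL path (error handling trimmed; `hdc` is assumed to be an existing device context):

```c
#include <windows.h>

/* Returns a pixel format index with quad-buffer stereo granted,
 * or 0 if the driver refused it. hdc is an existing device context. */
int chooseStereoPixelFormat(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL |
                     PFD_DOUBLEBUFFER | PFD_STEREO;   /* the stereo request */
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 24;
    pfd.cDepthBits = 24;

    int fmt = ChoosePixelFormat(hdc, &pfd);
    if (fmt == 0)
        return 0;

    /* ChoosePixelFormat can silently fall back to a non-stereo format,
     * so check whether PFD_STEREO was actually granted. */
    DescribePixelFormat(hdc, fmt, sizeof(pfd), &pfd);
    return (pfd.dwFlags & PFD_STEREO) ? fmt : 0;
}
```

If a stereo format is granted, set it with `SetPixelFormat` and create the context as usual; `glGetBooleanv(GL_STEREO, ...)` will then report `GL_TRUE`.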

This is only possible if your drivers allow it. They may not, and if they don't, there is nothing you can do about it from the application side.

If your OpenGL application happens to use a sufficiently simple subset of OpenGL, the following might work:

  1. Use GLDirect to dynamically convert your OpenGL calls to DirectX.
  2. Use Nvidia 3DTV Play to automatically stereoify and package the signal over HDMI 1.4.