opengl-3

How to create OpenGL 3.x or 4.x context in Ruby?

I've looked everywhere, but there seems to be no Ruby binding that would allow me to create an OpenGL 3/4 context. It does not have to be a complete OpenGL binding library, just the portion that creates the context. Update: if I'm desperate enough, I'll write a partial GLFW binding with Ruby FFI. So please save me from that ;)

I have written a library that can create an OpenGL 3 or 4 context on most platforms (provided it is supported). It's called Ray. You don't have to use its drawable system or DSL; you can just use it to create a window and an OpenGL context.

    Ray::GL.major_version = 3
    Ray::GL.minor_version = 2

wglCreateContextAttribsARB fails on NVIDIA Hardware

    ContextWin32::ContextWin32(WindowHandle parent, NLOpenGLSettings settings) :
        IPlatformContext(parent, settings)
    {
        int pf = 0;
        PIXELFORMATDESCRIPTOR pfd = {0};
        OSVERSIONINFO osvi = {0};
        osvi.dwOSVersionInfoSize = sizeof(OSVERSIONINFO);

        // Obtain HDC for this window.
        if (!(m_hdc = GetDC((HWND)parent)))
        {
            NLError("[ContextWin32] GetDC() failed.");
            throw NLException("GetDC() failed.", true);
        }

        // Create and set a pixel format for the window.
        pfd.nSize = sizeof(pfd);
        pfd.nVersion = 1;
        pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
        pfd.iPixelType = PFD_TYPE_RGBA;
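For reference, the usual pattern on Windows is to create a legacy context first, load wglCreateContextAttribsARB through it, and only then request the real 3.x/4.x context. A minimal sketch, assuming <windows.h> plus the WGL_ARB_create_context constants (from wglext.h) are available and hdc already has a pixel format set:

    // A dummy/legacy context is needed before the extension can be loaded.
    HGLRC legacy = wglCreateContext(hdc);
    wglMakeCurrent(hdc, legacy);

    typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)(HDC, HGLRC, const int*);
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");

    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 3,
        WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
        0 // attribute list terminator
    };
    HGLRC modern = wglCreateContextAttribsARB(hdc, 0, attribs);

    // The legacy context is no longer needed once the real one exists.
    wglMakeCurrent(hdc, 0);
    wglDeleteContext(legacy);
    wglMakeCurrent(hdc, modern);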

Instanced drawing of dynamic models in OpenGL

I am currently developing a framework that allows me to conveniently render a large number of animated models. A model is organized as a simple hierarchy of bones, with the root generally being the torso/pelvis. As pseudocode, I currently render a model like this:

    RenderBone(Bone b, Mat4x4 currentTransform) {
        Mat4x4 pos = currentTransform * b.boneTransform;
        SetUniform("transformation", pos);
        Draw(b.mesh);
        for each Bone bc in b.children {
            RenderBone(bc, pos);
        }
    }

So for a single actor that uses a model with n bones I need n SetUniform calls (not counting stuff like setting textures)
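One common way to cut the per-bone uniform traffic is to flatten the hierarchy once per frame and upload every bone matrix in a single call, letting the vertex shader index the array by a per-vertex bone index. A sketch along those lines; the Bone/Mat4x4 types come from the question's pseudocode, while boneMatrices, Mat4x4::Identity(), and the rest are assumptions for illustration:

    #include <vector>

    // Hypothetical: walk the hierarchy once, accumulating final transforms.
    void FlattenBones(const Bone& b, const Mat4x4& parent, std::vector<Mat4x4>& out) {
        Mat4x4 pos = parent * b.boneTransform;
        out.push_back(pos);
        for (const Bone& bc : b.children)
            FlattenBones(bc, pos, out);
    }

    std::vector<Mat4x4> bones;
    FlattenBones(model.root, Mat4x4::Identity(), bones);

    // One upload instead of n SetUniform calls; the shader indexes the
    // "boneMatrices" array with a per-vertex bone index attribute.
    GLint loc = glGetUniformLocation(program, "boneMatrices[0]");
    glUniformMatrix4fv(loc, (GLsizei)bones.size(), GL_FALSE,
                       reinterpret_cast<const GLfloat*>(bones.data()));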

Why does OpenGL's glDrawArrays() fail with GL_INVALID_OPERATION under Core Profile 3.2, but not 3.3 or 4.2?

I have OpenGL rendering code calling glDrawArrays that works flawlessly when the OpenGL context is an (automatically/implicitly obtained) 4.2 context, but fails consistently (GL_INVALID_OPERATION) with an explicitly requested OpenGL 3.2 core context. (Shaders are set to #version 150 in both cases, but I suspect that's beside the point here.) According to the specs, there are only two cases where glDrawArrays() fails with GL_INVALID_OPERATION: "if a non-zero buffer object name is bound to an enabled array and the buffer object's data store is currently mapped" -- I'm not doing any buffer mapping at
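For what it's worth, a frequent cause of exactly this symptom is drawing with no vertex array object bound: in a strict core profile the default VAO (object 0) does not exist, so glDrawArrays raises GL_INVALID_OPERATION, while more lenient contexts quietly accept it. A minimal sketch of that fix:

    // In a core profile, a VAO must be bound before any vertex
    // specification or draw call; create one even if you only need one.
    GLuint vao;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);

    // ...bind buffers, set up glVertexAttribPointer, then:
    glDrawArrays(GL_TRIANGLES, 0, vertexCount);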

No OpenGL 3 headers in Arch Linux

I'm trying to compile a very simple "Hello world" OpenGL 3.3 program using FreeGLUT. In all the tutorials I found, they include a header "gl3.h". The problem is, I don't have such a header file.

    $ ls -l /usr/include/GL/
    total 2164
    -rw-r--r-- 1 root root   8797 20 janv. 17:44 freeglut_ext.h
    -rw-r--r-- 1 root root    681 20 janv. 17:44 freeglut.h
    -rw-r--r-- 1 root root  26181 20 janv. 17:44 freeglut_std.h
    -rw-r--r-- 1 root root 837247 27 janv. 12:55 glew.h
    -rw-r--r-- 1 root root 656589 21 mars 18:07 glext.h
    -rw-r--r-- 1 root root  84468 21 mars 18:07 gl.h
    -rw-r--r-- 1 root root 128943 21 mars 18:07 gl
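Note that the listing already contains glew.h, and GLEW declares all the core 3.3 entry points that tutorials pull from gl3.h (a header Khronos later superseded with glcorearb.h). A minimal sketch of the include order; GLEW must come before any other GL header, and glewInit must run after a context exists:

    #include <GL/glew.h>     // stands in for gl3.h
    #include <GL/freeglut.h>

    int main(int argc, char** argv) {
        glutInit(&argc, argv);
        glutInitContextVersion(3, 3);
        glutInitContextProfile(GLUT_CORE_PROFILE);
        glutCreateWindow("Hello world");

        glewExperimental = GL_TRUE;  // needed for core profiles on some drivers
        if (glewInit() != GLEW_OK) return 1;

        glutMainLoop();
        return 0;
    }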

How do I make this simple OpenGL code (works in a “lenient” 3.3 and 4.2 profile) work in a strict 3.2 and 4.2 core profile?

I had some 3D code that I noticed wouldn't render in a strict core profile, but rendered fine in a "normal" (not explicitly requested-as-core-only) profile context. To isolate the issue, I have written the smallest, simplest possible OpenGL program drawing just a triangle and a rectangle: I have posted that OpenGL program as a Gist here. With the useStrictCoreProfile variable set to false, the program outputs no error messages to the console and draws a quad and a triangle as per the above screenshot, both on an Intel HD with OpenGL 3.3 and on a GeForce with OpenGL 4.2. However, with useStrictCoreProfile
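The Gist itself isn't reproduced here, but toggling a strict core profile typically comes down to a few context-creation hints. With GLFW (an assumption; the Gist may use another toolkit), the equivalent of useStrictCoreProfile = true would look like:

    #include <GLFW/glfw3.h>

    int main() {
        glfwInit();
        // Request exactly 3.2 core, with no compatibility features.
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
        glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GLFW_TRUE); // required on macOS

        GLFWwindow* window = glfwCreateWindow(640, 480, "core profile", NULL, NULL);
        // ...render, then:
        glfwDestroyWindow(window);
        glfwTerminate();
        return 0;
    }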

Is SDL Renderer useless if I use OpenGL for drawing?

I am learning SDL2, but I am also using the imgui library, which uses OpenGL calls. From what I read on various blogs online, I can't easily mix the SDL2 renderer and OpenGL calls; I either use one or the other. Most of the tutorials I've read use the renderer, so I do not quite understand how to use SDL2 without the renderer for drawing primitives or sprites. Take this for example: http://lazyfoo.net/tutorials/SDL/11_clip_rendering_and_sprite_sheets/index.php He creates the sdl
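When OpenGL does all the drawing, the SDL_Renderer is skipped entirely: the window is created with the OpenGL flag and paired with a GL context instead. A minimal sketch of that setup:

    #include <SDL.h>

    int main(int argc, char* argv[]) {
        SDL_Init(SDL_INIT_VIDEO);

        // No SDL_Renderer at all: the window is an OpenGL surface.
        SDL_Window* window = SDL_CreateWindow("gl", SDL_WINDOWPOS_CENTERED,
            SDL_WINDOWPOS_CENTERED, 640, 480, SDL_WINDOW_OPENGL);
        SDL_GLContext ctx = SDL_GL_CreateContext(window);

        // ...issue plain OpenGL calls here (imgui's GL backend included)...
        SDL_GL_SwapWindow(window); // present the frame

        SDL_GL_DeleteContext(ctx);
        SDL_DestroyWindow(window);
        SDL_Quit();
        return 0;
    }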

OpenGL 3: glBindVertexArray invalidates GL_ELEMENT_ARRAY_BUFFER

I was certain that if you bind a buffer via glBindBuffer(), you can safely assume that it stays bound until the target is rebound through another call to glBindBuffer(). I was therefore quite surprised when I discovered that calling glBindVertexArray() sets the buffer bound to the GL_ELEMENT_ARRAY_BUFFER target to 0. Here's the minimal C++ sample code:

    GLuint buff;
    glGenBuffers(1, &buff);
    std::cout << "Buffer is " << buff << "\n";
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, buff);
    GLuint vao;
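This is actually specified behavior: the element array binding is part of VAO state (unlike GL_ARRAY_BUFFER), so binding a different VAO swaps it out. A sketch continuing the experiment above to make that visible:

    GLint bound = 0;

    glGenVertexArrays(1, &vao);
    glGetIntegerv(GL_ELEMENT_ARRAY_BUFFER_BINDING, &bound);
    std::cout << "Before glBindVertexArray: " << bound << "\n"; // prints buff

    // The freshly bound VAO carries its own (still zero) element array binding.
    glBindVertexArray(vao);
    glGetIntegerv(GL_ELEMENT_ARRAY_BUFFER_BINDING, &bound);
    std::cout << "After glBindVertexArray: " << bound << "\n"; // prints 0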

Where can I find a good online OpenGL 3.0 tutorial that doesn't use any deprecated functionality? [closed]

I just purchased the fifth edition of the OpenGL SuperBible. I'm very pleased that they've avoided using deprecated functionality, but

OpenGL 3.x: Access violation when using vertex buffer object and glDrawElements(…)

I have trouble rendering some geometry using a vertex buffer object. I intend to draw a plane of points, so basically one vertex at every discrete position in my space. However, I cannot render that plane: every time I call glDrawElements(...), the application crashes with an access violation exception. There must be some mistake in the initialization, I guess. This is what I have so far:

    #define SPACE_X 512
    #define SPACE_Z 512

    typedef struct {
        GLfloat x, y, z;    // position
        GLfloat nx, ny, nz; // normals
        GLfloat r, g, b, a; // colors
    } Vertex;

    typedef struct {
        GLuint i; // index
    } Index;
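The rest of the setup code is cut off, but the classic cause of an access violation inside glDrawElements is passing a client-memory pointer while an element buffer is bound (or vice versa). A sketch of the buffered path, reusing the Index struct above; ibo, indexCount, and indices are illustrative names, not from the question:

    // Upload the indices into a GL_ELEMENT_ARRAY_BUFFER once...
    GLuint ibo;
    glGenBuffers(1, &ibo);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER,
                 indexCount * sizeof(Index), indices, GL_STATIC_DRAW);

    // ...then the last argument of glDrawElements is a byte *offset* into
    // that buffer, not a pointer into client memory.
    glDrawElements(GL_POINTS, indexCount, GL_UNSIGNED_INT, (void*)0);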