opengl-3

Creating an OpenGL 3.2/3.x context in SDL 1.3

Submitted by 故事扮演 on 2019-12-13 11:36:53
Question: I'm facing a problem where SDL says it does not support OpenGL 3.x contexts. I am trying to follow this tutorial: Creating a Cross Platform OpenGL 3.2 Context in SDL (C / SDL). I am using GLEW in this case, but I couldn't get gl3.h to work with it either. This is the code I ended up with:

```cpp
#include <glew.h>
#include <SDL.h>

int Testing::init() {
    if (SDL_Init(SDL_INIT_EVERYTHING) < 0) {
        DEBUGLINE("Error initializing SDL.");
        printSDLError();
        system("pause");
        return 1; // Error
    }
    // Request ...
```
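The usual cause is that the context-version attributes were never requested, or the SDL build predates them. A minimal sketch of requesting a 3.2 core context, assuming a late SDL 1.3 snapshot or SDL2 that exposes `SDL_GL_CONTEXT_PROFILE_MASK` (older 1.3 builds simply cannot create 3.x contexts):

```cpp
// Sketch: set the attributes *before* creating the window and context.
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);

SDL_Window *window = SDL_CreateWindow("GL 3.2", SDL_WINDOWPOS_CENTERED,
                                      SDL_WINDOWPOS_CENTERED, 800, 600,
                                      SDL_WINDOW_OPENGL);
SDL_GLContext context = SDL_GL_CreateContext(window);
if (!context) {
    // SDL_GetError() reports why the 3.2 context could not be created.
}
```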

DDA Line Drawing Algorithm has errors

Submitted by 痞子三分冷 on 2019-12-13 09:37:36
Question: Why am I getting an error saying 'setPixel not defined' with this code?

```cpp
#include <windows.h>
#include <stdio.h>
#include <math.h>
#include <stdlib.h>
#include <GL/glut.h>

inline int round(const float a) { return int(a + 0.5); }

void init(void) {
    glClearColor(0.0f, 0.0f, 1.0f, 1.0f);
    gluOrtho2D(0.0, 200.0, 0.0, 200.0);
    glMatrixMode(GL_PROJECTION);
}

void LineSegment(int xa, int ya, int xb, int yb) {
    glClear(GL_COLOR_BUFFER_BIT);
    glColor3f(1.0f, 0.0f, 0.0f);
    printf("Enter the initial value");
    scanf("%d%d", ...
```
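`setPixel` is not part of OpenGL or GLUT; textbook DDA listings assume you supply it yourself. A minimal sketch of a legacy-pipeline definition that would satisfy this code. (Separately, note that `glMatrixMode(GL_PROJECTION)` is called after `gluOrtho2D`, so the ortho matrix lands on the wrong matrix stack; the mode switch should come first.)

```cpp
// Sketch: plot one point in the coordinate system established by
// gluOrtho2D(0, 200, 0, 200), using the fixed-function pipeline.
void setPixel(int x, int y) {
    glBegin(GL_POINTS);
    glVertex2i(x, y);
    glEnd();
}
```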

OpenGL viewport distortion

Submitted by 好久不见. on 2019-12-13 06:16:17
Question: I'm still a beginner in OpenGL. I'm trying to draw a perfect square on a 1280 by 720 screen using shaders, with the OpenGL core profile, version 3.3. I was stuck when I tried to draw the square at 1280 by 720 (screenshot omitted). After some searching, I realized that the shape is distorted by the viewport size; after changing the viewport to 720 by 720, I got a proper square (screenshot omitted). Legacy OpenGL had a solution for this, but it is deprecated in the core profile. Issue: How can I draw a
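In the core profile the standard fix is to keep the full 1280x720 viewport and compensate for the aspect ratio in the projection instead. A minimal sketch using GLM; the uniform wiring and shader line are assumptions:

```cpp
// Sketch: map x to [-aspect, aspect] so one unit is the same length on
// both axes; the square's vertices can then stay in, say, [-0.5, 0.5].
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

float aspect = 1280.0f / 720.0f;
glm::mat4 projection = glm::ortho(-aspect, aspect, -1.0f, 1.0f);
// Upload 'projection' as a uniform and apply it in the vertex shader:
//   gl_Position = projection * vec4(position, 0.0, 1.0);
```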

Rotating a cube in modern opengl… looks strange

Submitted by 孤人 on 2019-12-13 03:45:11
Question: I'm somewhat lost, really lost. I'm trying to rotate a cube (just around the y-axis for now) and this is the (ugly and wrong) outcome (screenshot omitted). This is the code that builds the rotation matrix:

```scala
def rotate(axis: Vector3, angle: Float): Unit = {
  val cosAngle: Float = Math.cos(angle).toFloat
  val sinAngle: Float = Math.sin(angle).toFloat
  val oneMinusCosAngle: Float = 1.0f - cosAngle
  val xy: Float = axis.x * axis.y
  val xz: Float = axis.x * axis.z
  val yz: Float = axis.y * axis.z
  val xs: Float = axis.x * sinAngle
  val ...
```
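The snippet is building the axis-angle (Rodrigues) rotation matrix by hand; common culprits for a smeared cube are an un-normalized axis, mixing row-major and column-major order when uploading to OpenGL, or re-rotating an already-rotated matrix every frame. As a known-good reference to compare against, a sketch with GLM in C++ (the question's Scala math library is not shown, so this is only a cross-check):

```cpp
// Sketch: GLM builds the same axis-angle rotation. The axis must be unit
// length, and GLM is column-major, matching glUniformMatrix4fv with
// transpose = GL_FALSE.
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

glm::mat4 model = glm::rotate(glm::mat4(1.0f), angle,
                              glm::normalize(glm::vec3(0.0f, 1.0f, 0.0f)));
```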

Switching to glTexImage3D from glTexStorage3D

Submitted by ﹥>﹥吖頭↗ on 2019-12-12 17:56:33
Question:

```cpp
glBindTexture(GL_TEXTURE_2D_ARRAY, texture_id);
glTexStorage3D(GL_TEXTURE_2D_ARRAY,
               1,             // No mipmaps
               GL_RGBA8,      // Internal format
               width, height, // width, height
               1);            // Number of layers
glTexSubImage3D(GL_TEXTURE_2D_ARRAY,
                0,                // Mipmap number
                0, 0, 0,          // xoffset, yoffset, zoffset
                width, height, 1, // width, height, depth
                GL_RGBA8,         // format
                GL_UNSIGNED_BYTE, // type
                image);           // pointer to data
```

For testing I only create an array of length 1. I am currently using OpenGL 4.3, but I want to switch
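A sketch of the glTexImage3D equivalent for the same one-layer RGBA8 array texture. Two format pitfalls: glTexImage3D takes both a sized internal format (GL_RGBA8) and an unsized upload format (GL_RGBA), and the `format` argument of glTexSubImage3D above should likewise be GL_RGBA — passing the sized GL_RGBA8 there generates GL_INVALID_ENUM.

```cpp
// Sketch: mutable-storage allocation plus upload in one call.
glBindTexture(GL_TEXTURE_2D_ARRAY, texture_id);
glTexImage3D(GL_TEXTURE_2D_ARRAY,
             0,                // mipmap level
             GL_RGBA8,         // sized internal format
             width, height, 1, // width, height, number of layers
             0,                // border (must be 0)
             GL_RGBA,          // upload format (unsized)
             GL_UNSIGNED_BYTE, // type
             image);           // pointer to data
```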

.obj : fatal error LNK1107: invalid or corrupt file: cannot read at 0x6592

Submitted by ╄→гoц情女王★ on 2019-12-12 08:48:35
Question: I am trying to load an .obj model into my C++ OpenGL 3 code, but for some reason it gives me this error:

```
1>Linking...
1>.\bunny.obj : fatal error LNK1107: invalid or corrupt file: cannot read at 0x6592
```

I tried to search for similar errors, but those were about .dll's or .lib's. Can you please help me out with this issue? I have also tried different obj models, but it always gives the same error.

Answer 1: You are trying to load your object model with a C++ linker (probably you have just added
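The name collision here is between Wavefront .obj mesh files and MSVC .obj object files: adding bunny.obj to the Visual Studio project makes the linker try to consume it as compiled code, hence LNK1107. The mesh should be excluded from the build and parsed at runtime instead. A minimal sketch that reads only the vertex-position records; real loaders also handle vt/vn/f lines, or use a library such as tinyobjloader:

```cpp
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

// Sketch: collect every "v x y z" line from a Wavefront .obj file.
std::vector<Vec3> loadPositions(const char *path) {
    std::vector<Vec3> positions;
    FILE *file = std::fopen(path, "r");
    if (!file) return positions;
    char line[256];
    while (std::fgets(line, sizeof line, file)) {
        Vec3 v;
        if (std::sscanf(line, "v %f %f %f", &v.x, &v.y, &v.z) == 3)
            positions.push_back(v);
    }
    std::fclose(file);
    return positions;
}
```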

Color passed to shaders doesn't work

Submitted by 青春壹個敷衍的年華 on 2019-12-12 04:34:37
Question: I am pretty new to OpenGL and have a simple program that I am messing with at the moment. The issue I am running into is that when I pass an array to represent the color of points, the color just ends up being black. It works fine if I explicitly define a color in the fragment shader, and if I switch the indices of the data in the VAO, the triangle is moved to the points that are the colors. I tried using apitrace and got the following:

```
6872: message: major api error 1282: GL_INVALID
```
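Error 1282 is GL_INVALID_OPERATION, and the symptom (swapping VAO indices moves the triangle onto the color data) suggests the color attribute points at the wrong location or was never enabled, so the shader reads the default attribute value (0, 0, 0, 1) — i.e. black. A sketch of a typical two-attribute setup; the interleaved layout and locations 0/1 are assumptions that must match the vertex shader's `layout(location = N)` qualifiers:

```cpp
// Sketch: interleaved x,y,z,r,g,b per vertex in a single VBO.
glBindVertexArray(vao);
glBindBuffer(GL_ARRAY_BUFFER, vbo);

glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 6 * sizeof(float),
                      (void *)0);                   // position
glEnableVertexAttribArray(0);

glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, 6 * sizeof(float),
                      (void *)(3 * sizeof(float))); // color
glEnableVertexAttribArray(1);
```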

Drawing a line using individual pixels in OpenGl core

Submitted by 你。 on 2019-12-12 03:55:11
Question: I am trying to implement a line drawing algorithm in OpenGL. I have learnt the basics of using OpenGL from learnopengl. In the line drawing algorithm I need to set individual pixels myself, and I don't understand how to use OpenGL at the pixel level. I searched for implementations of Bresenham's line algorithm in OpenGL, but every one of them uses the function glDrawPixels, which is not supported in OpenGL 3.3. Is there anything I'm missing in OpenGL 3.3?

Answer 1: The point of
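glDrawPixels is gone from the core profile; the two common substitutes are submitting the line's points as GL_POINTS, or rasterizing into a CPU-side buffer and uploading it to a texture drawn on a fullscreen quad. A minimal sketch of the CPU-buffer half of the second approach; texture and quad setup are omitted, and `tex` is assumed to be an already-allocated GL_RGBA8 texture:

```cpp
#include <cstdint>
#include <vector>

const int W = 640, H = 480;
std::vector<uint32_t> pixels(W * H, 0xFF000000); // RGBA bytes, opaque black

// Sketch: the per-pixel primitive that Bresenham needs.
void setPixel(int x, int y, uint32_t rgba) {
    if (x >= 0 && x < W && y >= 0 && y < H)
        pixels[y * W + x] = rgba;
}

// After running Bresenham via setPixel(), upload the buffer each frame:
//   glBindTexture(GL_TEXTURE_2D, tex);
//   glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, W, H,
//                   GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());
// ...then draw a fullscreen textured quad.
```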

near/far planes and z in orthographic rasterization

Submitted by 爱⌒轻易说出口 on 2019-12-12 02:38:37
Question: I've coded tons of shaders, but I've stumbled onto something I never realized before. I needed a vertex + fragment shader with a simple orthographic projection and no depth test. The camera is Z-aligned with the origin. I disabled GL_DEPTH_TEST and masked depth writes. It was so simple, in fact, that I decided I didn't even need a projection matrix. In my complete ignorance and naivety, I thought that for any triangle vertex, the vertex shader could simply pass x, y (, z = <whatever>, w = 1) to
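Without a projection matrix, whatever the vertex shader writes to gl_Position is already in clip space, and clipping still runs there: any vertex with z outside [-w, w] is cut by the near/far planes regardless of depth-test or depth-mask state. A sketch of a pass-through vertex shader that sidesteps this by pinning z; the attribute layout is an assumption:

```glsl
#version 330 core
layout(location = 0) in vec2 position;

void main() {
    // z = 0.0 keeps every vertex inside the [-w, w] clip range.
    gl_Position = vec4(position, 0.0, 1.0);
}
```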

GLSL, default value for output color

Submitted by 风格不统一 on 2019-12-11 23:50:09
Question: What is the default value of the output color in GLSL if you don't set it?

```glsl
#version 330

uniform sampler2DRect colorTex;
uniform vec3 backgroundColor;

out vec4 outputColor;

void main(void) {
    vec4 frontColor = texture(colorTex, gl_FragCoord.xy);
    outputColor.rgb = frontColor + backgroundColor * frontColor.a;
}
```

Is it (0, 0, 0, 1)? P.S.: that code belongs to the old GL; trying to use it with GL3, I get the following error:

```
error C7011: implicit cast from "vec4" to "vec3"
```

Am I right to
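Per the GLSL specification, an output that is never written is undefined, so (0, 0, 0, 1) is not guaranteed — and here outputColor.a is never written at all. The C7011 error comes from adding a vec4 (frontColor) to a vec3 (backgroundColor * frontColor.a), a mismatch older compilers silently cast. A sketch of a GL3-clean version; whether frontColor.a is the right alpha depends on the intended blending, which the question doesn't show:

```glsl
outputColor = vec4(frontColor.rgb + backgroundColor * frontColor.a,
                   frontColor.a);
```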