textures

Loading an image from a C source file, exported from GIMP

拥有回忆 submitted on 2019-12-10 20:15:52
Question: I'm trying to load an OpenGL texture for a game. The texture is an image that has been exported as a .C source file from GIMP. When I #include this file in my project (using Visual C++ 2010 Ultimate), I get a compiler error saying fatal error C1091: compiler limit: string exceeds 65535 bytes in length Is there any workaround? The reason I wanted to export the image as a C header file was so that the program compiles with the image, and I don't have to provide raw image files along with the
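
One workaround is to store the pixel data as a brace-initialized unsigned char array rather than one giant string literal, since MSVC's C1091 limit applies to string literals, not to initializer lists. A minimal sketch; the struct layout below is illustrative, not GIMP's exact output:

    /* Illustrative layout of a GIMP "C source" export, rewritten so the pixel
       data is a byte array instead of a single huge string literal. */
    static const struct {
        unsigned int  width;
        unsigned int  height;
        unsigned int  bytes_per_pixel;   /* 3 = RGB, 4 = RGBA */
        unsigned char pixel_data[2 * 2 * 4];
    } gimp_image = {
        2, 2, 4,
        {
            255, 0, 0, 255,   255, 0, 0, 255,
            255, 0, 0, 255,   255, 0, 0, 255,
        }
    };

    /* Upload as usual: */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                 gimp_image.width, gimp_image.height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, gimp_image.pixel_data);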

OpenGL Renders texture with different color than original image?

我与影子孤独终老i submitted on 2019-12-10 20:03:54
Question: Why are these not displaying the same colors? Original image: Plane with the above image as a texture: WTF is happening? The original image is 100x100 pixels, made in Paint and saved as a 24-bit bitmap. Here is my OpenGL initialization code: _hdc = GetDC(_hwnd); PIXELFORMATDESCRIPTOR pfd; ZeroMemory( &pfd, sizeof( pfd ) ); pfd.nSize = sizeof( pfd ); pfd.nVersion = 1; pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER; pfd.iPixelType = PFD_TYPE_RGBA; pfd.cColorBits = 24; pfd
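
A sketch of one likely cause (an assumption, since the texture-upload code is cut off in this excerpt): 24-bit BMP files store pixels in BGR order, so uploading them with a GL_RGB source format swaps the red and blue channels. Telling GL the actual source layout fixes the color shift; width, height and pixelData stand in for whatever the BMP loader produced:

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB,
                 width, height, 0,
                 GL_BGR_EXT,            // source channel order (GL_BGR on non-Windows headers)
                 GL_UNSIGNED_BYTE, pixelData);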

OpenGL ES (iPhone) multi-texturing (2D) code

前提是你 submitted on 2019-12-10 19:13:23
Question: I have a texture from this PNG: And another from this PNG: They both have the same blend function: glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA); I want to see them on one single polygon first. I just couldn't find a simple example of this. Drawing them on different polygons works perfectly, but I just cannot "merge" them into one texture. Any working sample code would be appreciated. The second problem is to make the specular map's alpha variable. I can see that I have to texture combine
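
A minimal sketch of fixed-function multi-texturing in one draw call (assuming OpenGL ES 1.1, since no shader code appears in the excerpt; baseTex, specTex and texCoords are illustrative names). Both units sample on the same polygon and the second unit combines with the first:

    glActiveTexture(GL_TEXTURE0);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, baseTex);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

    glActiveTexture(GL_TEXTURE1);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, specTex);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_ADD);      // unit 0 result + specular
    glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_PREVIOUS);
    glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_RGB, GL_TEXTURE);

    // every enabled unit needs texture coordinates:
    glClientActiveTexture(GL_TEXTURE0);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glTexCoordPointer(2, GL_FLOAT, 0, texCoords);
    glClientActiveTexture(GL_TEXTURE1);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glTexCoordPointer(2, GL_FLOAT, 0, texCoords);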

Reading a openGL ES texture to a raw array

末鹿安然 submitted on 2019-12-10 18:55:34
Question: I'm trying to get the bytes from an RGBA texture in OpenGL ES. I think I'm trying to imitate glGetTexImage from vanilla OpenGL. As it is right now, the pixels are all nil (from the call to glClear), not the texture data I'm looking for. This is a method in a category extending SPTexture from the Sparrow Framework. -(NSMutableData *) getPixelsFromTexture { GLsizei width = (GLsizei) self.width; GLsizei height = (GLsizei) self.height; GLint oldBind = 0; glGetIntegerv(GL_TEXTURE_BINDING_2D, &oldBind);
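
OpenGL ES has no glGetTexImage, so the usual workaround is to attach the texture to a framebuffer object and read it back with glReadPixels. A minimal sketch of that approach (assuming an ES 2.0 context; textureID, width and height stand in for the SPTexture's values):

    GLint oldFBO = 0;
    glGetIntegerv(GL_FRAMEBUFFER_BINDING, &oldFBO);    // remember the current framebuffer

    GLuint fbo = 0;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, textureID, 0);

    GLubyte *pixels = (GLubyte *)malloc(width * height * 4);
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE) {
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    }

    glBindFramebuffer(GL_FRAMEBUFFER, oldFBO);
    glDeleteFramebuffers(1, &fbo);
    // wrap pixels in NSMutableData (or free it) before returning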

Editing a Cubemap Skybox from remote image

陌路散爱 submitted on 2019-12-10 16:55:49
Question: I need to download an image from a server, then transform it into a Cubemap, and finally put this Cubemap in my Skybox. I work with C#. I came up with this code: public string url = "image/url.jpg"; void Update() { // When triggered, we start the process if (Input.GetKeyDown("f")) { // start Coroutine to handle the WWW asynchronous process StartCoroutine("setImage"); } } IEnumerator setImage () { Texture2D tex; tex = new Texture2D(2048, 2048, TextureFormat.RGBA32, false); WWW www = new WWW(url);

Failing to map a simple unsigned byte RGB texture to a quad

大城市里の小女人 submitted on 2019-12-10 15:38:47
Question: I have a very simple program that maps a dummy red texture to a quad. Here is the texture definition in C++: struct DummyRGB8Texture2d { uint8_t data[3*4]; int width; int height; }; DummyRGB8Texture2d myTexture { { 255,0,0, 255,0,0, 255,0,0, 255,0,0 }, 2u, 2u }; This is how I set up the texture: void SetupTexture() { // allocate a texture on the default texture unit (GL_TEXTURE0): GL_CHECK(glCreateTextures(GL_TEXTURE_2D, 1, &m_texture)); // allocate texture: GL_CHECK(glTextureStorage2D(m
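
A sketch of one common culprit with tightly packed RGB data (an assumption, since the upload call is cut off here): each row of this 2x2 GL_RGB texture is 6 bytes, but GL's default GL_UNPACK_ALIGNMENT of 4 makes it expect rows padded to 8 bytes, so the second row is read from the wrong offset. Setting the alignment to 1 before the upload makes GL read the array exactly as declared (m_texture and GL_CHECK as in the question):

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);                      // rows are tightly packed
    GL_CHECK(glTextureStorage2D(m_texture, 1, GL_RGB8,
                                myTexture.width, myTexture.height));
    GL_CHECK(glTextureSubImage2D(m_texture, 0, 0, 0,
                                 myTexture.width, myTexture.height,
                                 GL_RGB, GL_UNSIGNED_BYTE, myTexture.data));
    GL_CHECK(glBindTextureUnit(0, m_texture));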

glPopMatrix() yells “unsupported texture format in setup_hardware_state”

南笙酒味 submitted on 2019-12-10 13:43:01
Question: I'm trying to make some optimizations in a private video player for Linux, aiming to improve performance, because playing MP4 files is heavy on the CPU: the video frames are encoded in YV12 and OpenGL doesn't provide a native way to display this format. Right now there is code that runs on the CPU to convert YV12 to RGB before the image is sent to the GPU for display, and this consumes 100% of the CPU. I'm currently investigating how to decode YV12 frames without having to
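
The usual way to move the conversion off the CPU (a sketch, not the player's actual code) is to upload the three YV12 planes as separate single-channel textures and convert YUV to RGB in a fragment shader. The shader source, shown here as a C++ string, uses the BT.601 coefficients:

    // texY is the full-resolution luma plane; texU and texV are the
    // quarter-resolution chroma planes, each uploaded as a single-channel texture.
    const char *yv12FragmentShader = R"(
        uniform sampler2D texY;
        uniform sampler2D texU;
        uniform sampler2D texV;
        varying vec2 uv;
        void main() {
            float y = texture2D(texY, uv).r;
            float u = texture2D(texU, uv).r - 0.5;
            float v = texture2D(texV, uv).r - 0.5;
            // BT.601 YUV -> RGB
            gl_FragColor = vec4(y + 1.402 * v,
                                y - 0.344 * u - 0.714 * v,
                                y + 1.772 * u,
                                1.0);
        }
    )";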

Using different texture types in same texture unit at the same time in shader

╄→гoц情女王★ submitted on 2019-12-10 13:12:04
Question: I came across a nasty problem in my program when I tried to use the same texture unit (number 0) for different texture types (i.e. a normal 2D texture and a cube map) in my shader. It turned out that GL issues a 0x0502 (GL_INVALID_OPERATION) error after the first glDrawArrays call. In my application code I load the textures into different texture targets: void setup_textures() { unsigned int width, height; int components; unsigned int format; float param[8]; vector<unsigned char> pngData; GLenum
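
This matches a rule in the GL spec: a draw call is invalid if the active program samples two different texture targets (here a sampler2D and a samplerCube) through the same texture unit. A minimal sketch of the fix, giving each sampler its own unit (program, diffuseTex, envTex and the uniform names are illustrative):

    glUseProgram(program);

    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, diffuseTex);
    glUniform1i(glGetUniformLocation(program, "u_diffuse"), 0);

    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_CUBE_MAP, envTex);
    glUniform1i(glGetUniformLocation(program, "u_environment"), 1);

    glDrawArrays(GL_TRIANGLES, 0, vertexCount);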

Animated GIF as texture in THREE.js

只愿长相守 submitted on 2019-12-10 13:05:08
Question: I'm looking for a way to use a GIF animation as a texture in THREE.js. I am currently able to load a texture (even in GIF format), but it doesn't play its animation. Is there any way to do it? I found some links like these: https://github.com/JordiRos/GLGif http://stemkoski.github.io/Three.js/Texture-Animation.html But I need to play the GIF animation as a texture, not in a canvas. Answer 1: What you're seeing isn't an animated GIF as a texture. The sites you linked use libraries to render each individual

SFML (32-bit VS12) - Unhandled exception at 0x701ADEF8 (msvcr110.dll) in SFML.exe: 0xC0000005: Access violation reading location 0x0526. LoadFromFile

。_饼干妹妹 submitted on 2019-12-10 11:45:09
Question: The following code gives me an unhandled exception (specifically on the line txtr.loadFromFile("C:/Users/kidz/Documents/Visual Studio 2012/Projects/SFML/Debug/chessboard.gif"); ): "Unhandled exception at 0x701ADEF8 (msvcr110.dll) in SFML.exe: 0xC0000005: Access violation reading location 0x05260000." int _tmain(int argc, wchar_t* argv[]) { sf::RenderWindow window(sf::VideoMode(512, 512), "ChessPlusPlus", sf::Style::Close); sf::Sprite chessboard; sf::Texture txtr; txtr.loadFromFile("C:/Users/kidz
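
An access violation deep inside msvcr110.dll during loadFromFile commonly points to mismatched binaries (SFML libraries built against a different Visual Studio runtime, or release SFML libraries linked into a debug build) rather than to the call itself. A minimal, self-contained load test can help isolate that; a sketch with the path shortened for illustration:

    #include <SFML/Graphics.hpp>

    int main()
    {
        sf::Texture txtr;
        // loadFromFile reports failure by return value, it does not throw
        if (!txtr.loadFromFile("chessboard.gif"))
            return 1;                        // missing or unreadable file

        sf::Sprite chessboard(txtr);         // keep txtr alive as long as the sprite uses it
        sf::RenderWindow window(sf::VideoMode(512, 512), "ChessPlusPlus", sf::Style::Close);
        while (window.isOpen())
        {
            sf::Event event;
            while (window.pollEvent(event))
                if (event.type == sf::Event::Closed)
                    window.close();
            window.clear();
            window.draw(chessboard);
            window.display();
        }
        return 0;
    }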