texture-mapping

Terrain Texturing

醉酒当歌 submitted on 2019-12-13 20:44:42

Question: Is there a way to blend 2 textures in a gradient manner? E.g. the first texture starts at the top and fades toward the bottom, while the second starts at the bottom and fades toward the top. I want each to lose opacity like a gradient, to create a smooth connection. Is it possible? Maybe there is some other way to create such textures? The problem is that the height differences in my terrain do not look good: within one square area, one triangle has a different texture than the other.

Answer 1: For every terrain …
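The gradient blend the question asks about reduces to a per-pixel (or per-vertex) linear interpolation between the two texture samples, weighted by height. A minimal CPU-side sketch of that math (the function names and the height range are illustrative, not from the question; in practice this would run in a fragment shader):

```python
def blend_weight(height, h_min, h_max):
    """Blend factor: 0.0 at h_min (first texture only), 1.0 at h_max
    (second texture only), clamped outside that range."""
    t = (height - h_min) / (h_max - h_min)
    return max(0.0, min(1.0, t))

def blend(color_a, color_b, t):
    """Linearly interpolate two RGB samples, channel by channel."""
    return tuple(a * (1.0 - t) + b * t for a, b in zip(color_a, color_b))

# At mid-height the two textures contribute equally, which is what
# smooths out the hard per-triangle texture seams.
grass = (0.2, 0.6, 0.2)
rock = (0.6, 0.6, 0.6)
print(blend(grass, rock, blend_weight(50.0, 0.0, 100.0)))
```

Because the weight varies continuously with height, neighbouring triangles sample a continuous mix instead of switching textures abruptly at an edge.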

Problems Using Wavefront .obj's texture coordinates in Android OpenGL ES

不想你离开。 submitted on 2019-12-13 15:55:20

Question: I'm writing an Android app using OpenGL ES. I followed some online tutorials and managed to load up a textured cube using hard-coded vertices/indices/texture coordinates. As a next step I wrote a parser for Wavefront .obj files. I made a mock file using the vertices etc. from the tutorial, which loads fine. However, when I use a file made with a 3D modelling package, all the textures get messed up. Below is how I'm currently getting the texture coordinates: first I load all the texture …
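A very common cause of "messed up" textures in this situation is that .obj faces index positions and texture coordinates independently (`f v1/vt1 v2/vt2 v3/vt3`), while OpenGL ES draws with a single index per vertex. Each unique (position, texcoord) pair therefore has to be expanded into its own vertex. A sketch of that expansion, with illustrative names and data layout:

```python
def deindex_obj_faces(positions, texcoords, faces):
    """Expand OBJ-style per-attribute indices into unified vertex data.

    positions/texcoords: lists as parsed from v/vt lines.
    faces: list of faces, each a list of (v_idx, vt_idx) pairs,
    1-based as in the .obj format.
    Returns (out_positions, out_texcoords, indices) where indices can be
    fed straight to glDrawElements.
    """
    vertex_cache = {}  # (v_idx, vt_idx) -> unified index
    out_positions, out_texcoords, indices = [], [], []
    for face in faces:
        for v_idx, vt_idx in face:
            key = (v_idx, vt_idx)
            if key not in vertex_cache:
                vertex_cache[key] = len(out_positions)
                out_positions.append(positions[v_idx - 1])  # OBJ is 1-based
                out_texcoords.append(texcoords[vt_idx - 1])
            indices.append(vertex_cache[key])
    return out_positions, out_texcoords, indices
```

Hand-made mock files tend to use the same index for position and texcoord (so naive parsing works), whereas files exported by a modelling package usually do not, which matches the symptom described.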

What is the best practice for loading multiple textures in three.js?

浪子不回头ぞ submitted on 2019-12-13 15:49:02

Question: Having recently switched to three.js as my renderer, I now want to set up a system for mapping textures, but I'm not sure what the best practice is. Here is my use case: I have levels made of a lot of box geometries; I need to be able to map a wide variety of textures (grass, water, stone, rock, etc.) to each face of each box geometry; and I would like to do this in the most efficient, performance-conscious way possible, so offloading this to the GPU would be best if …
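One common GPU-friendly approach for this kind of use case is a texture atlas: pack the tile images (grass, water, stone, ...) into one texture and remap each face's UVs into the right sub-rectangle, so all the boxes can share a single material and batch into few draw calls. A minimal sketch of the UV math, assuming a simple uniform grid atlas (nothing here is three.js-specific):

```python
def tile_uv_rect(tile_index, tiles_per_row, tiles_per_col):
    """Return (u0, v0, u1, v1), the UV sub-rectangle of one tile in a
    grid-packed texture atlas. Faces are textured by remapping their
    corner UVs from (0,0)-(1,1) into this rectangle."""
    col = tile_index % tiles_per_row
    row = tile_index // tiles_per_row
    w = 1.0 / tiles_per_row
    h = 1.0 / tiles_per_col
    return (col * w, row * h, (col + 1) * w, (row + 1) * h)
```

The trade-off is that mipmapping can bleed between adjacent tiles, so atlases usually add padding around each tile or clamp the UVs slightly inside the rectangle.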

How to convert a 3D point on a plane to UV coordinates?

穿精又带淫゛_ submitted on 2019-12-13 12:07:14

Question: I have a 3D point, defined by [x0, y0, z0]. This point belongs to a plane, defined by [a, b, c, d], where normal = [a, b, c] and ax + by + cz + d = 0. How can I convert or map the 3D point to a pair of (u, v) coordinates? This must be something really simple, but I can't figure it out.

Answer 1: First of all, you need to compute your u and v vectors. u and v shall be orthogonal to the normal of your plane, and orthogonal to each other. There is no unique way to define them, but a convenient and fast …
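The construction the answer starts to describe can be sketched directly: pick any in-plane axis orthogonal to the normal, take the cross product for the second axis, and project the point onto both. A self-contained sketch (the choice of helper axis and plane origin is one convention among many, as the answer notes):

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(a):
    n = math.sqrt(dot(a, a))
    return tuple(x / n for x in a)

def plane_uv(point, normal, origin):
    """Map a 3D point lying on a plane to 2D (u, v) coordinates.

    Builds an orthonormal in-plane basis: u_axis is orthogonal to the
    normal, v_axis = normal x u_axis, then projects the point (relative
    to a chosen in-plane origin) onto both axes.
    """
    n = normalize(normal)
    # Any helper axis not parallel to the normal will do.
    helper = (1.0, 0.0, 0.0) if abs(n[0]) < 0.9 else (0.0, 1.0, 0.0)
    u_axis = normalize(cross(n, helper))
    v_axis = cross(n, u_axis)  # already unit length
    rel = tuple(p - o for p, o in zip(point, origin))
    return (dot(rel, u_axis), dot(rel, v_axis))
```

Because the basis is orthonormal, distances and angles measured in (u, v) match distances and angles in the plane, which is usually what texture mapping needs.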

Projective texture mapping in WebGL

这一生的挚爱 submitted on 2019-12-13 07:29:56

Question: I wrote two simple WebGL demos which use a 512x512 image as a texture, but the result is not what I want. I know the solution is to use projective texture mapping (or is there some other solution?), but I have no idea how to implement it in my simple demos. Can anyone help? The results are as follows (both of them are incorrect). Code for the demos is here: https://github.com/jiazheng/WebGL-Learning/tree/master/texture. Note: neither the model nor the texture can be modified in my case.

Answer 1: In order to get …
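The core of projective texture mapping is small: run each vertex through the projector's view-projection matrix to get homogeneous (s, t, r, q) coordinates, divide by q per fragment, and remap from clip space [-1, 1] to texture space [0, 1]. A CPU-side sketch of that coordinate math (the matrix helper is illustrative; in WebGL the divide would happen in the fragment shader, e.g. via `texture2DProj`):

```python
def mat4_mul_vec4(m, v):
    """Row-major 4x4 matrix times a 4-component column vector."""
    return tuple(sum(m[r][c] * v[c] for c in range(4)) for r in range(4))

def projective_texcoord(point, projector):
    """Projective texture lookup for one vertex: transform by the
    projector's view-projection matrix, perspective-divide by q, then
    remap [-1, 1] clip space to [0, 1] texture space."""
    s, t, _, q = mat4_mul_vec4(projector, (point[0], point[1], point[2], 1.0))
    return ((s / q) * 0.5 + 0.5, (t / q) * 0.5 + 0.5)

# With an identity projector, the origin lands at the texture center.
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
```

The [-1, 1] to [0, 1] remap is often folded into the projector matrix itself as a "bias" matrix, so the shader only does the divide.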

gluCylinder with rotated texture

ぃ、小莉子 submitted on 2019-12-13 04:55:49

Question: I want to draw a cylinder using gluQuadric and gluCylinder. This cylinder shall be textured. My draw code is the following:

```cpp
pTexture->Enable();
pTexture->Bind();
glPushMatrix();
glRotatef(-90.0f, 1.0f, 0.0f, 0.0f);
gluQuadricOrientation(quadric, GLU_OUTSIDE);
gluQuadricNormals(quadric, true);
gluQuadricTexture(quadric, true);
gluCylinder(quadric, getRadius(), getRadius(), getHeight(), 16, 1);
glPopMatrix();
pTexture->Unbind();
pTexture->Disable();

My problem with this is that the texture …
```
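Since gluQuadricTexture generates fixed texture coordinates, the usual way to reorient the image without touching the geometry is to rotate the texture coordinates themselves, e.g. by loading a rotation into the texture matrix (`glMatrixMode(GL_TEXTURE)` plus `glRotatef`). A sketch of the underlying 2D rotation, assuming rotation about the texture center:

```python
import math

def rotate_uv(u, v, degrees, cx=0.5, cy=0.5):
    """Rotate a texture coordinate around (cx, cy), the texture center
    by default. This is what a rotation in OpenGL's texture matrix does
    to every generated (u, v) pair, turning the image on the quadric
    without modifying the cylinder itself."""
    rad = math.radians(degrees)
    du, dv = u - cx, v - cy
    return (cx + du * math.cos(rad) - dv * math.sin(rad),
            cy + du * math.sin(rad) + dv * math.cos(rad))
```

A 90-degree rotation maps the texture's horizontal axis onto its vertical axis, which is typically what is needed when a gluCylinder texture comes out turned on its side.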

How to use a 3x3 2D transformation in a vertex/fragment shader (Metal)

梦想的初衷 submitted on 2019-12-13 02:37:43

Question: I have a supposedly simple task, but apparently I still don't understand how projections work in shaders. I need to do a 2D perspective transformation on a textured quad (2 triangles), but visually it doesn't look correct (e.g. the trapezoid is slightly taller or more stretched than in the CPU version). I have this struct:

```
struct VertexInOut {
    float4 position [[position]];
    float3 warp0;
    float3 warp1;
    float3 warp2;
    float3 warp3;
};
```

And in the vertex shader I do something like (texCoords …
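A 2D perspective transformation is a 3x3 homography applied to homogeneous (u, v, 1) coordinates, and the classic source of "slightly stretched" results is doing the divide by w per vertex and then letting the rasterizer interpolate linearly; the divide has to happen per fragment. A sketch of the correct per-point math (plain Python standing in for the shader arithmetic):

```python
def apply_homography(h, u, v):
    """Apply a 3x3 projective (homography) transform to a 2D coordinate.

    h is a row-major 3x3 matrix. The division by w is what makes the
    transform 'perspective'; in a shader this divide must run in the
    fragment stage, after interpolation, or the warp looks subtly wrong.
    """
    x = h[0][0] * u + h[0][1] * v + h[0][2]
    y = h[1][0] * u + h[1][1] * v + h[1][2]
    w = h[2][0] * u + h[2][1] * v + h[2][2]
    return (x / w, y / w)
```

Linearly interpolating (x, y, w) across the triangle and dividing per fragment gives the projectively correct result; interpolating (x/w, y/w) does not, because the divide is nonlinear.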

FBO: render to texture, wrong texture mapping when drawing rendered texture

大兔子大兔子 submitted on 2019-12-12 21:38:37

Question: I'm using OpenGL in a Mac OS X application to draw textures on an NSOpenGLView. The app is a movie player. It decodes movie frames into CVOpenGLTextureRef (which are OpenGL textures) and I draw them directly to the view using GL_QUADS. Everything works correctly. Below is the relevant part of the code:

```cpp
// "image" is the CVOpenGLTextureRef containing the movie frame as a texture
GLenum textureTarget = CVOpenGLTextureGetTarget(image);
GLuint textureName = CVOpenGLTextureGetName(image);
glEnable…
```
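Note that the code queries the texture target at runtime: CVOpenGLTextureRef frames typically come back as GL_TEXTURE_RECTANGLE textures, which are addressed in pixels, while an FBO color attachment is normally a GL_TEXTURE_2D addressed in normalized [0, 1] coordinates. Mixing the two conventions is a classic cause of wrong texture mapping when the rendered texture is drawn. A sketch of the coordinate conversion (an assumption about this particular bug, not a diagnosis from the question):

```python
def to_rect_coords(u, v, width, height):
    """Convert normalized [0, 1] texture coordinates (GL_TEXTURE_2D
    convention) to pixel-based coordinates (GL_TEXTURE_RECTANGLE
    convention, used by typical CVOpenGLTextureRef targets)."""
    return (u * width, v * height)

def to_normalized_coords(s, t, width, height):
    """Inverse conversion: pixel-based rectangle coordinates back to
    normalized [0, 1] coordinates."""
    return (s / width, t / height)
```

When drawing, the coordinate convention has to match the target actually bound; CVOpenGLTextureGetCleanTexCoords can also supply the correct per-frame coordinates directly.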

Using texture image with alpha makes mesh 'see through'

孤者浪人 submitted on 2019-12-12 20:08:48

Question: I am rendering an .obj file in OpenGL ES 2.0 on Android with back-face culling enabled. Only part of the texture image (the necklace around the neck) actually has alpha. When rendering only the mesh, it looks fine. However, when I enable the texture, I can see through the mesh to the other side: the right hand, which is behind the body, becomes visible. Any ideas what might be going wrong?

Edit: I have tried the following: enabling/disabling back-face …
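See-through artifacts like this usually come from alpha blending interacting badly with the depth buffer: a transparent fragment drawn first still writes depth (or blending is enabled for opaque geometry that shouldn't need it). The standard recipe is to draw opaque geometry first, then transparent pieces sorted back-to-front; for textures where alpha is purely on/off, discarding transparent fragments in the shader avoids sorting entirely. A sketch of the ordering step (data layout is illustrative):

```python
def draw_order(opaque, transparent, camera_pos):
    """Classic transparency ordering: all opaque meshes first, then
    transparent meshes sorted back-to-front by distance from the camera,
    so blending composites each one over what is truly behind it."""
    def dist2(center):
        return sum((c - p) ** 2 for c, p in zip(center, camera_pos))
    ordered = sorted(transparent, key=lambda m: dist2(m["center"]), reverse=True)
    return opaque + ordered
```

In this question's case, rendering the body opaquely and only the necklace as a blended (or alpha-tested) pass would likely remove the visible right hand.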

Switching to glTexImage3D from glTexStorage3D

﹥>﹥吖頭↗ submitted on 2019-12-12 17:56:33

Question:

```cpp
glBindTexture(GL_TEXTURE_2D_ARRAY, texture_id);
glTexStorage3D(GL_TEXTURE_2D_ARRAY,
               1,             // no mipmaps
               GL_RGBA8,      // internal format
               width, height, // width, height
               1);            // number of layers
glTexSubImage3D(GL_TEXTURE_2D_ARRAY,
                0,                 // mipmap level
                0, 0, 0,           // xoffset, yoffset, zoffset
                width, height, 1,  // width, height, depth
                GL_RGBA,           // format (GL_RGBA: the sized GL_RGBA8 is only valid as an internal format)
                GL_UNSIGNED_BYTE,  // type
                image);            // pointer to data
```

For testing I only create an array of length 1. I am currently using OpenGL 4.3 but I want to switch …
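One wrinkle when switching from glTexStorage3D to glTexImage3D: glTexStorage3D allocates the entire immutable mip chain in a single call, whereas glTexImage3D must be called once per mip level with the dimensions halved (and clamped to 1) at each step, or the texture is incomplete. A sketch of the level-size computation those per-level calls would need:

```python
def mip_sizes(width, height, levels):
    """Per-level dimensions of a mip chain, matching what
    glTexStorage3D allocates: each level halves width and height,
    clamped so neither drops below 1."""
    return [(max(1, width >> level), max(1, height >> level))
            for level in range(levels)]
```

With only one level, as in the snippet above, a single glTexImage3D call with the full width/height (and depth 1 for the single layer) is the direct equivalent.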