textures

Three.js - Cloning a shader and changing uniform values

守給你的承諾、 submitted on 2019-12-04 17:59:31
Question: I'm working on creating a shader to generate terrain with shadows. My starting point is to clone the Lambert shader and use a ShaderMaterial so that I can eventually customise it with my own script. The standard approach works well:

    var material = new THREE.MeshLambertMaterial({ map: THREE.ImageUtils.loadTexture('images/texture.jpg') });
    var mesh = new THREE.Mesh(geometry, material);
    // etc.

The result: [screenshot not preserved in this excerpt]. However, I'd like to use the Lambert material as a base and work on top of it, so I tried this:

    var lambertShader =
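The cloning step this question is heading toward can be sketched without Three.js itself. The helper below is a simplified stand-in for `THREE.UniformsUtils.clone` (that utility is the real API; this implementation and the uniform names are illustrative assumptions, not the actual contents of `THREE.ShaderLib.lambert`): it deep-copies a uniforms object so the clone's values can be changed without touching the original shader definition.

```javascript
// Simplified stand-in for THREE.UniformsUtils.clone: copy a uniforms
// object so that editing the clone does not mutate the original.
function cloneUniforms(src) {
  const dst = {};
  for (const name in src) {
    dst[name] = {};
    for (const key in src[name]) {
      const value = src[name][key];
      // Arrays (e.g. color components) are copied, not shared.
      dst[name][key] = Array.isArray(value) ? value.slice() : value;
    }
  }
  return dst;
}

// Hypothetical lambert-like uniforms for demonstration.
const lambertUniforms = {
  diffuse: { value: [1, 0, 0] },
  opacity: { value: 1.0 },
};

const cloned = cloneUniforms(lambertUniforms);
cloned.diffuse.value[0] = 0.5; // the original stays untouched
```

In real Three.js code the clone would then be handed to `new THREE.ShaderMaterial({ uniforms, vertexShader, fragmentShader })`, with the two shader strings taken from `THREE.ShaderLib.lambert`.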

sf::Texture applied in a wrong way

笑着哭i submitted on 2019-12-04 17:59:24
Question: In my 2D isometric engine I have the following hierarchy: `maps (variable) / layers (variable) / cubes (variable) / sides (6) / points (4) / coordinates (3)`. Each side contains 4 points (1 point = 1 coordinate (x, y, z)), and each cube contains 6 sides. I can create a map of whatever size I want, with however many cubes I want. Folder layout: assets/numTexture/numLight.png. From numTexture and numLight I calculate a number which is the texture index entry (I loaded all of the numLight.png textures into an array). But texturing
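The numTexture/numLight-to-index computation the question describes could look like the sketch below. The folder layout is from the question; `lightsPerTexture`, its value, and the row-major formula are assumptions, since the excerpt cuts off before showing the actual mapping.

```cpp
#include <string>

// Assumed: each folder assets/<numTexture>/ holds the same number of
// lighting variants, numbered 0..lightsPerTexture-1.
constexpr int lightsPerTexture = 4; // illustrative value

// Row-major index into the flat array that all numLight.png files
// were loaded into.
int textureIndex(int numTexture, int numLight) {
    return numTexture * lightsPerTexture + numLight;
}

// Path a texture was loaded from, matching the layout in the question.
std::string texturePath(int numTexture, int numLight) {
    return "assets/" + std::to_string(numTexture) + "/" +
           std::to_string(numLight) + ".png";
}
```

A mismatch between this index formula and the order the files were pushed into the array would produce exactly the "texture applied in a wrong way" symptom in the title.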

Image2D in compute shader

馋奶兔 submitted on 2019-12-04 17:54:38
I want to use an image2D as 2D storage for vertices, which will be modified by a compute shader, but it doesn't work. Create the texture:

    glGenTextures(1, &HeightMap);
    glBindTexture(GL_TEXTURE_2D, HeightMap);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, 513, 513, 0, GL_RGBA32F, GL_UNSIGNED_BYTE, 0);

Use and dispatch the compute shader:

    glUseProgram(ComputeProgram);
    glActiveTexture(GL_TEXTURE0);
    glBindImageTexture(0, HeightMap, 0, GL_FALSE, 0, GL_READ_WRITE, GL_RGBA32F);
    glDispatchCompute(1, 1, 1);
    glMemoryBarrier(GL_ALL_BARRIER_BITS);

And the compute shader:

    #version 430 core
    layout( std430, binding=1 )
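One likely culprit, offered as a guess since the question is truncated: in glTexImage2D the seventh argument is a pixel-transfer format such as GL_RGBA, not a sized internal format, so passing GL_RGBA32F there is an invalid enum and the allocation fails; for float data the transfer type should also be GL_FLOAT rather than GL_UNSIGNED_BYTE. A corrected creation sketch (a GL context is assumed; this fragment is not runnable standalone):

```cpp
// Corrected allocation of a 513x513 RGBA32F image for image load/store.
glGenTextures(1, &HeightMap);
glBindTexture(GL_TEXTURE_2D, HeightMap);
// internal format: GL_RGBA32F; transfer format/type: GL_RGBA + GL_FLOAT
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, 513, 513, 0,
             GL_RGBA, GL_FLOAT, nullptr);
// image unit 0 here must match layout(binding = 0) on the image2D
// declaration in the compute shader
glBindImageTexture(0, HeightMap, 0, GL_FALSE, 0, GL_READ_WRITE, GL_RGBA32F);
```

Note also that the quoted shader declares `layout(std430, binding=1)`, which is an SSBO binding, separate from the image unit used by glBindImageTexture.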

Mapping a stereographic projection to the inside of a sphere in ThreeJS

时光总嘲笑我的痴心妄想 submitted on 2019-12-04 17:24:09
When it comes to 3D animation, there are a lot of terms and concepts I'm not familiar with (maybe a secondary question to append to this one: what are some good books for getting familiar with the concepts?). I don't know what a "UV" is (in the context of 3D rendering), and I'm not familiar with the tools that exist for mapping pixels of an image onto points of a mesh. I have the following image being produced by a 360-degree camera (it's actually the output of an HTML video element): [image not preserved in this excerpt] I want the center of this image to be the "top" of the sphere, and any radius of the circle in this image to be an
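The mapping the question is after can be written down independently of Three.js. Assuming an equidistant ("angular fisheye") projection in which the image center is the top of the sphere and the image edge is the bottom (an assumption; the camera's real projection may differ), the UV coordinate for a direction on the sphere is:

```javascript
// polar: angle from the top of the sphere (0 = top, PI = bottom)
// azimuth: angle around the vertical axis
// Returns [u, v] in [0, 1] for an equidistant circular image:
// the radius from the image center grows linearly with the polar angle.
function sphereToUV(polar, azimuth) {
  const r = 0.5 * (polar / Math.PI);
  return [0.5 + r * Math.cos(azimuth), 0.5 + r * Math.sin(azimuth)];
}
```

In Three.js this would typically be baked into a SphereGeometry's `uv` attribute, one [u, v] pair per vertex, so the renderer interpolates the rest.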

Super blurry textures - XNA

妖精的绣舞 submitted on 2019-12-04 16:58:50
I seem to be getting really blurry textures when I look at them up close. I am creating a Minecraft-like terrain, and I would like the textures to stay pixelated, the way they were made, rather than have XNA smooth them out for me. Here is how it appears: http://s1100.photobucket.com/albums/g420/darestium/?action=view&current=bluryminecraftterrain.png Any suggestions would be much appreciated. Answer: It's not related to anti-aliasing; it's related to how the hardware samples the texels in the texture. The default filter in XNA is usually Linear, but to get those "blocky" looking textures you must

read and write integer 1-channel texture opengl

折月煮酒 submitted on 2019-12-04 15:24:15
I want to: create a readable and writable 1-channel texture that contains integers; using a shader, write an integer I to the texture; use the texture as a source, sample it, and check whether the sample equals the integer I. All of this with the 3.3 core profile. This is what I've got so far. I create the texture like so:

    glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, width, height, 0, GL_RED, GL_INT, (java.nio.ByteBuffer) null);

I've also tried GL_R8I and GL_RED_INTEGER, but that won't work. I bind this texture as my FBO, set the blend func to (GL_ONE, GL_ONE) (additive), and render one quad using a shader that simply
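One wrinkle worth spelling out (an observation, not necessarily the asker's whole problem): GL_R8 is a *normalized* format. A shader output is clamped to [0, 1], quantized to 8 bits, and sampled back as texel/255.0, so writing an integer I and comparing the sample against I can never succeed for I > 1. The round-trip can be modeled directly:

```cpp
#include <cmath>
#include <cstdint>

// Model of a shader output written to a normalized GL_R8 texel and
// sampled back: clamp to [0,1], quantize to 8 bits, return texel/255.
double r8RoundTrip(double shaderOutput) {
    double clamped = std::fmin(std::fmax(shaderOutput, 0.0), 1.0);
    uint8_t texel  = static_cast<uint8_t>(std::lround(clamped * 255.0));
    return texel / 255.0;
}
```

So the options are to write and compare I/255.0, or to switch to a true integer format (GL_R32I internal format with GL_RED_INTEGER/GL_INT transfer parameters and an isampler2D in GLSL). Note that blending is not defined for integer color buffers, so the additive (GL_ONE, GL_ONE) setup would have to go as well.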

Three.js materialLoader doesn't load embedded texture image

自作多情 submitted on 2019-12-04 15:14:53
I'm exporting a three.js material using the provided material.toJSON() method; this is the result:

    {
      "metadata": { "version": 4.5, "type": "Material", "generator": "Material.toJSON" },
      "uuid": "8E6F9A32-1952-4E12-A099-632637DBD732",
      "type": "MeshStandardMaterial",
      "color": 11141120,
      "roughness": 1,
      "metalness": 0.5,
      "emissive": 0,
      "map": "876D3309-43AD-4EEE-946F-A8AE8BA53C9E",
      "transparent": true, "depthFunc": 3, "depthTest": true, "depthWrite": true,
      "textures": [
        {
          "uuid": "876D3309-43AD-4EEE-946F-A8AE8BA53C9E",
          "name": "",
          "mapping": 300,
          "repeat": [1, 1],
          "offset": [0, 0],
          "center": [0, 0],
          "rotation": 0,
          "wrap": [1001, 1001],
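Whatever loader consumes this JSON has to resolve the "map" field, which is only a UUID, against the sibling "textures" array; in Three.js itself that wiring is what `MaterialLoader.setTextures` / `ObjectLoader` arrange, which is why a bare MaterialLoader leaves the map empty. The standalone resolver below only illustrates that indirection (it is not the Three.js implementation):

```javascript
// Resolve a material's texture reference: "map" stores a UUID that must
// be looked up in the "textures" array of the exported JSON.
function resolveTexture(materialJson) {
  const byUuid = {};
  for (const tex of materialJson.textures || []) byUuid[tex.uuid] = tex;
  return byUuid[materialJson.map]; // undefined if the UUID is missing
}

// Trimmed-down version of the JSON from the question.
const exported = {
  map: "876D3309-43AD-4EEE-946F-A8AE8BA53C9E",
  textures: [{ uuid: "876D3309-43AD-4EEE-946F-A8AE8BA53C9E", mapping: 300 }],
};
```

If the loader never performs this lookup (or never decodes the embedded "images" into textures first), the material loads but the map stays null.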

OpenGL texture tilted

我们两清 submitted on 2019-12-04 14:21:47
When I load a texture in OpenGL that has one (GL_ALPHA) or three (GL_RGB) components per pixel, the texture appears tilted. What makes this happen? As an additional detail, the width/height ratio seems to matter: for example, an image of 1366x768 (683/384) appears tilted, while an image of 1920x1080 (16/9) is mapped correctly. Answer: This is probably a padding/alignment issue. GL, by default, expects rows of pixels to be padded to a multiple of 4 bytes. A 1366-wide texture with 1-byte or 3-byte wide pixels will not naturally be 4-byte aligned. Possible fixes for this are: Tell GL how your
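The arithmetic behind that diagnosis can be checked directly. With the default GL_UNPACK_ALIGNMENT of 4, GL pads each source row up to a multiple of 4 bytes; a row of 1366 RGB pixels is 4098 bytes, which is not such a multiple, so each row's read start drifts and the image shears. The stride GL assumes is:

```cpp
// Row stride GL uses when unpacking client pixel data: the tightly
// packed row size rounded up to a multiple of `alignment` bytes.
int unpackRowStride(int width, int bytesPerPixel, int alignment) {
    int tight = width * bytesPerPixel;
    return ((tight + alignment - 1) / alignment) * alignment;
}
```

The one-line fix the truncated answer is leading up to is calling `glPixelStorei(GL_UNPACK_ALIGNMENT, 1)` before glTexImage2D, which makes the assumed stride equal to the tightly packed row size.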

Copy ffmpeg d3dva texture resource to shared rendering texture

安稳与你 submitted on 2019-12-04 13:50:43
I'm using ffmpeg to decode video via d3dva, based on this example: https://github.com/FFmpeg/FFmpeg/blob/master/doc/examples/hw_decode.c . I'm able to successfully decode video. What I need to do next is to render the decoded NV12 frame. I have created a DirectX rendering texture based on this example, https://github.com/balapradeepswork/D3D11NV12Rendering , and set it as shared.

    D3D11_TEXTURE2D_DESC texDesc;
    texDesc.Format = DXGI_FORMAT_NV12;  // Pixel format
    texDesc.Width = width;              // Width of the video frames
    texDesc.Height = height;            // Height of the video frames
    texDesc.ArraySize = 1;              // Number of
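For reference, a complete descriptor for a shared NV12 render texture could look like the sketch below. The fields after ArraySize are assumptions modeled on the linked D3D11NV12Rendering sample, not taken from the truncated question, and the fragment assumes a D3D11 device context:

```cpp
// Sketch: shared NV12 texture the decoded frame can be copied into.
D3D11_TEXTURE2D_DESC texDesc = {};
texDesc.Format = DXGI_FORMAT_NV12;              // pixel format of decoded frames
texDesc.Width = width;
texDesc.Height = height;
texDesc.ArraySize = 1;
texDesc.MipLevels = 1;
texDesc.SampleDesc.Count = 1;                   // no multisampling
texDesc.Usage = D3D11_USAGE_DEFAULT;
texDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
texDesc.CPUAccessFlags = 0;
texDesc.MiscFlags = D3D11_RESOURCE_MISC_SHARED; // make the texture shareable
```

With ffmpeg's d3d11va frames, the decoded frame exposes an ID3D11Texture2D plus a subresource index, so the copy typically goes through ID3D11DeviceContext::CopySubresourceRegion from that array slice into the texture described above.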

How to visualize a depth texture in OpenGL?

只愿长相守 submitted on 2019-12-04 13:47:44
Question: I'm working on a shadow-mapping algorithm, and I'd like to debug the depth map that it generates on its first pass. However, depth textures don't seem to render properly to the viewport. Is there an easy way to display a depth texture as a greyscale image, preferably without using a shader? Answer 1: You may need to change the depth texture's parameters to display it as greyscale levels:

    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_NONE );
    glTexParameteri( GL_TEXTURE_2D, GL_DEPTH
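Even when the raw values do show up, a perspective depth buffer tends to look almost uniformly white because its values are non-linear in eye-space distance. A common companion step (an addition here, not part of the quoted answer) is to linearize the stored value before mapping it to grey:

```cpp
// Convert a depth-buffer value d in [0,1] (standard perspective
// projection with the given near/far clip planes) back to linear
// eye-space depth in [nearZ, farZ].
float linearizeDepth(float d, float nearZ, float farZ) {
    float zNdc = 2.0f * d - 1.0f; // back to NDC [-1, 1]
    return 2.0f * nearZ * farZ /
           (farZ + nearZ - zNdc * (farZ - nearZ));
}
```

Dividing the result by farZ gives a [0, 1] grey value that spreads the usable range across the image instead of compressing everything near white.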