Question
In THREE you can specify a DataTexture with a given data type and format. My shader pipeline normalizes the raw values based on a few user-controlled uniforms.
In the case of a Float32Array, it is very simple:
data = new Float32Array(...)
texture = new THREE.DataTexture(data, columns, rows, THREE.LuminanceFormat, THREE.FloatType)
And, in the shader, the swizzled values come through non-normalized, i.e. as the raw stored values. However, if I use:
data = new Uint8Array(...)
texture = new THREE.DataTexture(data, columns, rows, THREE.LuminanceFormat, THREE.UnsignedByteType);
Then the texture is normalized between 0.0 and 1.0 as an input to the pipeline. Not what I was expecting. Is there a way to prevent this behavior?
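For illustration, the relevant part of my fragment shader boils down to something like this (the uniform names are just illustrative, not my real pipeline):
// With THREE.FloatType the sampled .r is the raw stored value;
// with THREE.UnsignedByteType it arrives as storedValue / 255.0.
var fragmentShader = [
    'uniform sampler2D u_data;',
    'uniform vec2 u_resolution;',
    'void main() {',
    '    float value = texture2D(u_data, gl_FragCoord.xy / u_resolution).r;',
    '    gl_FragColor = vec4(vec3(value), 1.0);',
    '}'
].join('\n');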
Here is an example jsfiddle demonstrating a quick test of what is unexpected (at least for me): http://jsfiddle.net/VsWb9/3796/
three.js r.71
Answer 1:
For future reference, this is not currently possible in WebGL. It requires the use of GL_RED_INTEGER and the unsupported usampler2D.
This comment, from a discussion of the internalformat of Texture, also describes the issue in GL with the internal formats:
For that matter, the format of GL_LUMINANCE says that you're passing either floating-point data or normalized integer data (the type says that it's normalized integer data). Of course, since there's no GL_LUMINANCE_INTEGER (which is how you say that you're passing integer data, to be used with integer internal formats), you can't really use luminance data like this.
Use GL_RED_INTEGER for the format and GL_R8UI for the internal format if you really want 8-bit unsigned integers in your texture. Note that integer texture support requires OpenGL 3.x-class hardware.
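For readers with access to a context that does expose these formats (e.g. WebGL 2, which is not what the r71-era fiddle above runs on), a raw upload would look roughly like this sketch; gl, data, columns and rows are assumed to already exist, and the constants are the standard WebGL 2 ones:
// Sketch only: requires a WebGL 2 context.
var tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
// Integer textures are not filterable, so NEAREST is mandatory.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
// Tightly packed single-channel rows.
gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1);
// R8UI internal format + RED_INTEGER format keeps the bytes as integers,
// so the sampler sees 0..255 instead of 0.0..1.0.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.R8UI, columns, rows, 0,
              gl.RED_INTEGER, gl.UNSIGNED_BYTE, data);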
That being said, you cannot use sampler2D with an integer texture. If you are using a texture that uses an unsigned integer texture format, you must use usampler2D.
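A matching fragment shader sketch (GLSL ES 3.00, again only meaningful where integer textures exist; names are illustrative) declares a usampler2D and reads back un-normalized integers:
// Sketch only: GLSL ES 3.00, paired with the R8UI texture above.
var intFragmentShader = [
    '#version 300 es',
    'precision highp float;',
    'uniform highp usampler2D u_data;  // integer sampler: texture() returns uvec4',
    'out vec4 outColor;',
    'void main() {',
    '    uint raw = texture(u_data, vec2(0.5)).r;  // raw 0..255, not normalized',
    '    outColor = vec4(vec3(float(raw) / 255.0), 1.0);',
    '}'
].join('\n');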
Source: https://stackoverflow.com/questions/30990593/prevent-datatexture-value-normalization-in-three