Question
Using Google's ARCore Depth API, the current depth information is stored in an RGB565 texture.
If I want to calculate the current depth in a shader, I use the following code (which is mostly taken from Google's Depth Lab):
#define ARCORE_DEPTH_SCALE 0.001 // mm to m
#define ARCORE_MAX_DEPTH_MM 8191.0
#define ARCORE_FLOAT_TO_5BITS 31 // (0.0, 1.0) -> (0, 31)
#define ARCORE_FLOAT_TO_6BITS 63 // (0.0, 1.0) -> (0, 63)
#define ARCORE_RGB565_RED_SHIFT 2048 // left shift 11 bits
#define ARCORE_RGB565_GREEN_SHIFT 32 // left shift 5 bits
sampler2D CurrentDepthTexture;

float ArCoreDepth_GetMeters(float2 uv)
{
    // The depth texture uses TextureFormat.RGB565.
    float4 rawDepth = tex2Dlod(CurrentDepthTexture, float4(uv, 0, 0));
    // Reconstruct the 16-bit depth in millimeters: (R << 11) | (G << 5) | B.
    float depth = (rawDepth.r * ARCORE_FLOAT_TO_5BITS * ARCORE_RGB565_RED_SHIFT)
                + (rawDepth.g * ARCORE_FLOAT_TO_6BITS * ARCORE_RGB565_GREEN_SHIFT)
                + (rawDepth.b * ARCORE_FLOAT_TO_5BITS);
    depth = min(depth, ARCORE_MAX_DEPTH_MM);
    depth *= ARCORE_DEPTH_SCALE;
    return depth;
}
However, if I want to store the current depth information as a grayscale PNG on the CPU side, how can I do this?
The result I get when using Texture2D.EncodeToPNG() doesn't seem to work with this color format.
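For reference, this is roughly what I am trying to do on the CPU side (a minimal sketch, assuming the depth texture is CPU-readable; the class name DepthPngExporter, the method SaveDepthPng and the normalization to 0..1 are my own choices for illustration, not anything from the ARCore SDK): decode each RGB565 pixel the same way the shader does, copy the normalized values into an RGB24 texture, and encode that instead.

using System.IO;
using UnityEngine;

public static class DepthPngExporter
{
    const float MaxDepthMm = 8191.0f; // same cap as ARCORE_MAX_DEPTH_MM

    public static void SaveDepthPng(Texture2D depthTexture, string path)
    {
        // GetPixels already unpacks RGB565 into normalized floats,
        // so the per-channel math below mirrors the shader.
        Color[] src = depthTexture.GetPixels();
        Color[] dst = new Color[src.Length];

        for (int i = 0; i < src.Length; i++)
        {
            float depthMm = src[i].r * 31f * 2048f   // 5 red bits, shifted left 11
                          + src[i].g * 63f * 32f     // 6 green bits, shifted left 5
                          + src[i].b * 31f;          // 5 blue bits
            depthMm = Mathf.Min(depthMm, MaxDepthMm);

            // Map 0..8191 mm to 0..1 so the PNG is a visible grayscale image.
            float v = depthMm / MaxDepthMm;
            dst[i] = new Color(v, v, v, 1f);
        }

        // EncodeToPNG doesn't work on the RGB565 source, so copy the decoded
        // values into a plain RGB24 texture and encode that instead.
        var gray = new Texture2D(depthTexture.width, depthTexture.height,
                                 TextureFormat.RGB24, false);
        gray.SetPixels(dst);
        gray.Apply();

        File.WriteAllBytes(path, gray.EncodeToPNG());
        Object.Destroy(gray);
    }
}

Normalizing by the 8191 mm maximum makes the PNG viewable but loses the absolute depth scale; I assume that keeping the raw millimeter values would require writing a 16-bit format instead.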
Source: https://stackoverflow.com/questions/65286213/how-to-convert-a-rgb565-texture-to-png-depth-image