How do you pack one 32-bit int into four 8-bit ints in GLSL / WebGL?

长情又很酷 2020-11-28 08:33

I'm looking to parallelize some complex math, and WebGL looks like the perfect way to do it. The problem is, you can only read 8-bit integers from textures. I would ideally like to work with 32-bit integers instead.

3 Answers
  •  粉色の甜心
    2020-11-28 08:59

    Everyone is absolutely correct in how to handle something like this in WebGL, but I wanted to share a trick for getting the values in and out.

    Assuming you want to do some comparison on two values that fit in 16 bits:

    // Generate a list of random 16-bit integers, stored as pairs
    let data16bit = new Uint16Array(1000000);
    for(let i = 0; i < data16bit.length; i += 2){
        data16bit[i]   = Math.random()*(2**16);
        data16bit[i+1] = Math.random()*(2**16);
    }
    // View that same buffer one byte at a time, ready for
    // uploading to a WebGL texture
    let texture = new Uint8Array(data16bit.buffer);
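
    One way to get that byte view onto the GPU is sketched below; it assumes you already have a WebGL context gl, and the 1000×500 size is just an example that gives 500,000 RGBA texels, exactly the 2,000,000 bytes created above:

    // Upload the byte view as an RGBA / UNSIGNED_BYTE texture, so each
    // texel carries two of the 16-bit values (in .rg and .ba)
    let width = 1000, height = 500;   // width*height*4 === data16bit.buffer.byteLength
    let tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    // No filtering or wrapping, so the raw bytes come back untouched
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                  gl.RGBA, gl.UNSIGNED_BYTE, texture);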
    

    Now when you get your values in your fragment shader, you can pick up the numbers for manipulation:

    vec4 here = texture2D(u_image, v_texCoord);
    // Read the "red" byte and the "green" byte together (as a single
    // 16-bit value), and likewise the "blue" byte and the "alpha" byte
    vec2 a = here.rg;
    vec2 b = here.ba;
    // now compare the two 16-bit values
    if(a == b){
        here.a = 1.0;   // GLSL wants float literals here, not ints
    }
    else{
        here.a = 0.0;
    }
    // return the boolean result in the alpha channel
    gl_FragColor = here;
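
    To read the comparison results back out on the JavaScript side, something like the following should work; it assumes the shader above rendered into an RGBA / UNSIGNED_BYTE target with the same dimensions as the input texture:

    // Pull the rendered bytes back; alpha is 255 wherever the shader
    // wrote 1.0, i.e. wherever the two 16-bit values matched
    let w = 1000, h = 500;   // same dimensions as the uploaded texture
    let result = new Uint8Array(w * h * 4);
    gl.readPixels(0, 0, w, h, gl.RGBA, gl.UNSIGNED_BYTE, result);
    let firstPairMatches = (result[3] === 255);   // alpha byte of the first texel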
    

    The point is just a reminder that you can treat the same block of JavaScript memory as different element sizes (Uint16Array and Uint8Array) rather than breaking it up yourself with bit shifting.
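
    The same view trick covers the 32-bit case from the original question; here is a minimal sketch (the variable names are only illustrative):

    // One 32-bit int maps onto exactly one RGBA texel (4 bytes)
    let ints32 = new Uint32Array([0x01020304]);
    let bytes  = new Uint8Array(ints32.buffer);   // [4, 3, 2, 1] on little-endian hardware
    // The equivalent manual unpacking with shifts and masks:
    let v = ints32[0];
    let manual = [v & 0xFF, (v >>> 8) & 0xFF, (v >>> 16) & 0xFF, (v >>> 24) & 0xFF];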

    Update

    In response to requests for more detail: the code above is very nearly a direct cut/paste from this code and explanation.

    The exact usage can be found in the corresponding samples on GitLab (two parts of the same file).
