Image2D in compute shader

Submitted by 馋奶兔 on 2019-12-04 17:54:38

OK, I found the solution. This is how you use an image2D to read and write data in a compute shader:

Create texture:

// Allocate a 513x513 GL_RGBA32F texture to be used as an image2D.
// No pixel data is uploaded (last argument is 0), but the transfer
// format/type (GL_RGBA, GL_FLOAT) must still be legal.
glGenTextures(1, &HeightMap);
glBindTexture(GL_TEXTURE_2D, HeightMap);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, 513, 513, 0, GL_RGBA, GL_FLOAT, 0);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glGenerateMipmap(GL_TEXTURE_2D);
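The dispatch step below also binds a shader storage buffer called VerticesBuffer; its creation isn't part of the original snippet. A minimal allocation could look like this (the size and the GL_DYNAMIC_COPY usage hint are assumptions for this single-vec4 test; size it for your real vertex count):

GLuint VerticesBuffer;
glGenBuffers(1, &VerticesBuffer);
glBindBuffer(GL_SHADER_STORAGE_BUFFER, VerticesBuffer);
// Room for one vec4 written by the compute shader
glBufferData(GL_SHADER_STORAGE_BUFFER, 4 * sizeof(GLfloat), NULL, GL_DYNAMIC_COPY);
glBindBuffer(GL_SHADER_STORAGE_BUFFER, 0);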

Dispatch compute shader:

// Bind the vertex SSBO to binding point 1 (matches "binding=1" in the shader)
glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 1, VerticesBuffer);

// Bind level 0 of the heightmap texture to image unit 0 for read/write access
glBindImageTexture(0, HeightMap, 0, GL_FALSE, 0, GL_READ_WRITE, GL_RGBA32F);
glUseProgram(ComputeProgram);

// Point the image2D uniform at image unit 0
glUniform1i(glGetUniformLocation(ComputeProgram, "HeightMap"), 0);

glDispatchCompute(1, 1, 1);
glMemoryBarrier(GL_ALL_BARRIER_BITS);

Example compute shader:

#version 430 core

layout(std430, binding = 1) buffer VertBuffer
{
    vec4 Positions[];
};

layout(local_size_x = 1, local_size_y = 1, local_size_z = 1) in;
layout(rgba32f) uniform image2D HeightMap;

void main(void)
{
    ivec2 pos = ivec2(0, 0);
    // Write a texel, then read it back and copy it into the SSBO
    imageStore(HeightMap, pos, vec4(10, 0, 0, 1));
    Positions[0].xyzw = imageLoad(HeightMap, pos).rgba;
}
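To confirm the store/load round trip worked, one option (not part of the original answer) is to read the SSBO back on the CPU after the memory barrier:

GLfloat result[4];
glBindBuffer(GL_SHADER_STORAGE_BUFFER, VerticesBuffer);
// After glMemoryBarrier, Positions[0] should hold (10, 0, 0, 1)
glGetBufferSubData(GL_SHADER_STORAGE_BUFFER, 0, sizeof(result), result);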
The problem in the original code was this call:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, 513, 513, 0, GL_RGBA32F, GL_UNSIGNED_BYTE, 0);

Always check your OpenGL errors. This line fails because GL_RGBA32F is not a legal pixel transfer format; the pixel transfer format only specifies which components you're passing, so it should be GL_RGBA.

Yes, I know you're not actually transferring pixel data. But OpenGL requires that the pixel transfer parameters be legitimate even if you're not actually doing a pixel transfer.

Also, the type (the next parameter) should be GL_FLOAT, not GL_UNSIGNED_BYTE.
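Putting both fixes together, the allocation call becomes the same one used in the answer above:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, 513, 513, 0, GL_RGBA, GL_FLOAT, 0);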
