How can I bind a pixel array of integer colors to a texture using the Android NDK?

生来就可爱ヽ(ⅴ<●) Submitted on 2019-12-23 03:47:18

Question


I'm trying to port my Android Java OpenGL code to the NDK, and I need an IntBuffer equivalent.

Basically what I do in Java to load up an arbitrary integer RGBA pixel color array into a texture is:

    // pixel array
    pixelIntArray = new int[width * height];

    bb = ByteBuffer.allocateDirect(pixelIntArray.length * 4);
    bb.order(ByteOrder.nativeOrder());

    // native buffer
    pixelBuffer = bb.asIntBuffer();

    // push integer array of pixels into buffer
    pixelBuffer.put(pixelIntArray);
    pixelBuffer.position(0);

    // bind buffer to texture
    gl.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_RGBA, width, height, 0,
                GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, pixelBuffer);

I need the equivalent in C so that I can push a texture to a quad using a buffer.

Currently I'm just binding the texture directly to my pixelIntArray in C, and the texture comes out distorted.

Basically, I need to be able to bind a series of colors in an integer pixel array to a texture through a buffer, similar to what Java's NIO buffers provide.


Answer 1:


I think this may be how to solve it:


Initialize


    int length = width * height;

    // packed 32-bit pixels, and the byte buffer handed to OpenGL
    int *pixels = (int *) malloc(sizeof(int) * length);
    unsigned char *buffer = (unsigned char *) malloc(length * 4);

Copy to buffer


    int i, j;

    // unpack each pixel into four bytes, least-significant byte first
    for (i = 0; i < length; i++) {

        j = 4 * i;

        buffer[j]     = (unsigned char) (pixels[i] & 0xFF);
        buffer[j + 1] = (unsigned char) (pixels[i] >> 8 & 0xFF);
        buffer[j + 2] = (unsigned char) (pixels[i] >> 16 & 0xFF);
        buffer[j + 3] = (unsigned char) (pixels[i] >> 24 & 0xFF);

    }

Source


From: http://www.c-sharpcorner.com/Forums/Thread/32972/



Source: https://stackoverflow.com/questions/14431693/how-can-i-bind-a-pixel-array-of-integer-colors-to-a-texture-using-the-android-nd
