Converting from signed char to unsigned char and back again?

清酒与你 2020-11-28 19:22

I'm working with JNI and have an array of type jbyte, where jbyte is represented as a signed char, i.e. ranging from -128 to 127. The jbytes represent image pixels. For image processing, the pixel values are conceptually unsigned, in the range 0 to 255. How do I convert from signed char to unsigned char and back again?
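
A minimal sketch of the setup (the native method name and variable names here are only illustrative, not my real code):

    #include <jni.h>

    /* Illustrative native method: receives the pixel bytes from Java,
     * where jbyte is a signed char in the range -128..127. */
    JNIEXPORT void JNICALL
    Java_ImageProc_process(JNIEnv *env, jobject self, jbyteArray pixels)
    {
        jsize len = (*env)->GetArrayLength(env, pixels);
        jbyte *data = (*env)->GetByteArrayElements(env, pixels, NULL);

        /* data[i] is in -128..127 here, but each byte conceptually
         * holds a 0..255 pixel value */

        (*env)->ReleaseByteArrayElements(env, pixels, data, JNI_ABORT);
        (void)len; (void)self;
    }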

5 Answers

  • 一生所求 2020-11-28 19:59

    There are two ways to interpret the input data: either -128 is the lowest value and 127 is the highest (i.e. truly signed data), or 0 is the lowest value, 127 is somewhere in the middle, the next "higher" number is -128, and -1 is the "highest" value (that is, the most significant bit has already been misinterpreted as a sign bit in two's-complement notation).

    Assuming you mean the latter, the formally correct way is

        signed char in = ...;                            /* value as delivered through the jbyte array */
        unsigned char out = (in < 0) ? (in + 256) : in;  /* -128..-1 maps to 128..255, 0..127 stays put */
    

    which at least gcc properly recognizes as a no-op.
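
    As a minimal, self-contained round-trip sketch (the value -96 and the reverse mapping are my own illustration of the same arithmetic, not part of the question):

        #include <stdio.h>

        int main(void)
        {
            signed char in = -96;                      /* e.g. one jbyte pixel */

            /* signed -> unsigned: -128..-1 maps to 128..255, 0..127 stays as-is */
            unsigned char out = (in < 0) ? (unsigned char)(in + 256) : (unsigned char)in;

            /* unsigned -> signed: 128..255 maps back to -128..-1 */
            signed char back = (out > 127) ? (signed char)(out - 256) : (signed char)out;

            printf("in=%d out=%d back=%d\n", in, out, back);   /* prints in=-96 out=160 back=-96 */
            return 0;
        }

    In both directions the conversion is just a reinterpretation of the same byte, which is why gcc can treat the expression as a no-op.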
