Why isn't GLsizei defined as unsigned?


Question


I was looking up the typedef of GLsizei for the OpenGL ES 1.1 implementation on iOS, and was surprised to find that it is defined as a plain int. Some quick googling showed that this is the norm, including in desktop OpenGL.

I was expecting that it would be defined as an unsigned int or size_t. Why is it defined as just a vanilla int?
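
For reference, the relevant typedefs in the iOS OpenGL ES headers look roughly like this (illustrative only, the exact contents vary by platform and SDK version):

/* Sketch of a typical <OpenGLES/ES1/gl.h>, illustrative rather than verbatim */
typedef int             GLint;
typedef int             GLsizei;   /* sizes and counts: a plain signed int */
typedef unsigned int    GLuint;
typedef unsigned int    GLenum;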


Answer 1:


It seems unlikely to be a problem unless you have data structures bigger than 2 GB kicking around: a 32-bit signed GLsizei tops out at 2^31 - 1, so only sizes that would need the upper half of the unsigned range lose out.
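
A quick sketch of the range difference that trade-off implies, assuming a 32-bit int (which is what the iOS and desktop GL headers use):

#include <limits.h>
#include <stdio.h>

int main(void) {
    /* Assuming 32-bit int: a signed GLsizei gives up half of the unsigned range. */
    printf("max GLsizei (signed int): %d\n", INT_MAX);   /* 2147483647, about 2 GB */
    printf("max if it were unsigned:  %u\n", UINT_MAX);  /* 4294967295, about 4 GB */
    return 0;
}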

Here's someone's answer: http://oss.sgi.com/archives/ogl-sample/2005-07/msg00003.html

Quote:

(1) Arithmetic on unsigned values in C doesn't always yield intuitively
correct results (e.g. width1-width2 is positive when width1<width2).
Compilers offer varying degrees of diagnosis when unsigned ints appear
to be misused.  Making sizei a signed type eliminates many sources of
semantic error and some irrelevant diagnostics from the compilers.  (At
the cost of reducing the range of sizei, of course, but for the places
sizei is used that's rarely a problem.)

(2) Some languages that support OpenGL bindings lack (lacked? not sure
about present versions of Fortran) unsigned types, so by sticking to
signed types as much as possible there would be fewer problems using
OpenGL in those languages.

Both explanations seem plausible - I've run into (1) on a number of occasions myself when stupidly using NSUInteger for a loop counter (hint: don't do that, especially when counting backwards down to zero).
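
To make point (1) and the loop-counter trap concrete, here is a minimal self-contained sketch in plain C; the width1/width2 names come from the quote above, and an unsigned int stands in for NSUInteger:

#include <stdio.h>

int main(void) {
    /* (1) Subtracting unsigned widths: the result wraps instead of going negative. */
    unsigned int width1 = 100, width2 = 200;
    unsigned int udiff = width1 - width2;                /* wraps to 4294967196 */
    printf("unsigned width1 - width2 = %u\n", udiff);

    int swidth1 = 100, swidth2 = 200;
    printf("signed   width1 - width2 = %d\n", swidth1 - swidth2);   /* -100, as expected */

    /* (2) Counting backwards with an unsigned counter: i >= 0 is always true,
     * so this loop never terminates (left commented out for that reason). */
    /*
    for (unsigned int i = 9; i >= 0; --i)
        printf("%u\n", i);    // after i == 0, --i wraps around to UINT_MAX
    */
    return 0;
}

Run it and the first line prints a huge wrapped value rather than a negative difference, which is exactly the class of silent bug the signed choice avoids.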



Source: https://stackoverflow.com/questions/8996743/why-isnt-glsizei-defined-as-unsigned
