I have a short question. Why does OpenGL come with its own datatypes for standard types like int, unsigned int, char, and so on? And do I have to use them instead of the built-in datatypes?
I'm not an OpenGL expert, but frameworks and platforms such as OpenGL, Qt, etc. usually define their own datatypes so that the meaning and size of the underlying type stay the same across different operating systems and compilers. Often this is achieved with preprocessor macros or conditional typedefs, but in the case of GLuint it appears to be a plain typedef in gl.h:
typedef unsigned int GLuint;
So the answer is yes: you should use the framework's datatypes so that your code stays portable within that framework across operating systems.
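As a small sketch of what that looks like in practice (assuming a desktop build where glGenBuffers is available, e.g. via the platform headers or a loader such as GLEW), declaring your variables with the GL typedefs keeps their widths in sync with what the API functions expect, whatever the platform's ABI defines unsigned int to be:

#include <GL/gl.h>

/* Hypothetical helper for illustration: by taking GLsizei and GLuint
   parameters, its signature always matches the declaration
   glGenBuffers(GLsizei n, GLuint *buffers), no matter how the
   platform's gl.h maps those typedefs onto built-in types. */
void create_buffers(GLsizei count, GLuint *ids)
{
    glGenBuffers(count, ids);
}

If you used unsigned int directly and a platform's headers happened to map GLuint onto a different built-in type, you would get warnings or subtle mismatches; sticking to the GL typedefs avoids that entirely.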