I have a short question. Why does OpenGL come with its own datatypes for standard types like int, unsigned int, char, and so on? And do I have to use them instead of the built-in types?
For example, the OpenGL equivalent to unsigned int is GLuint.
No, it isn't, and that's exactly why you should use OpenGL's data types when interfacing with OpenGL.
GLuint is not "equivalent" to unsigned int. GLuint is required to be 32 bits in size; it is always 32 bits. unsigned int might be 32 bits. It might be 64 bits. You don't know, and C isn't going to tell you (outside of sizeof).
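You can see the difference at compile time. Here's a minimal sketch, assuming a standard OpenGL header is available on your system; the size check on GLuint must pass on any conforming platform, while the printed size of unsigned int depends on your compiler and target:

/* sizes.c - compare a fixed-size GL type with a built-in C type */
#include <GL/gl.h>
#include <stdio.h>

/* The OpenGL spec fixes GLuint at exactly 32 bits, so this holds everywhere. */
_Static_assert(sizeof(GLuint) == 4, "GLuint must be 32 bits");

int main(void)
{
    /* The C standard only guarantees unsigned int is at least 16 bits wide. */
    printf("sizeof(unsigned int) = %zu\n", sizeof(unsigned int));
    printf("sizeof(GLuint)       = %zu\n", sizeof(GLuint));
    return 0;
}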
These data types are defined per platform, and they may be defined differently on different platforms. You use them because, however they are defined, they always come out to the sizes that the OpenGL API expects and requires.
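This matters most at the API boundary, where OpenGL writes through pointers you hand it. A hedged example, assuming a context and loader (e.g. GLAD or GLEW) that exposes glGenBuffers:

#include <GL/gl.h>

void make_buffer(void)
{
    /* glGenBuffers is declared to take GLuint*. On a platform where
       unsigned int were a different size, passing an unsigned int*
       here would write the wrong number of bytes. Using GLuint is
       correct everywhere. */
    GLuint buffer;
    glGenBuffers(1, &buffer);
}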