I've been looking at the command line generated by Visual Studio, and for one of my projects it defines two symbols: _UNICODE and UNICODE. Why are there two, and do I need to define both?
In a nutshell: UNICODE is used by the Windows headers, whereas _UNICODE is used by the C-runtime/MFC headers.
Raymond Chen explains it here: TEXT vs. _TEXT vs. _T, and UNICODE vs. _UNICODE:
The plain versions without the underscore affect the character set the Windows header files treat as default. So if you define UNICODE, then GetWindowText will map to GetWindowTextW instead of GetWindowTextA, for example. Similarly, the TEXT macro will map to L"..." instead of "...".

The versions with the underscore affect the character set the C runtime header files treat as default. So if you define _UNICODE, then _tcslen will map to wcslen instead of strlen, for example. Similarly, the _TEXT macro will map to L"..." instead of "...".
Looking into Windows SDK you will find things like this:
#ifdef _UNICODE
#ifndef UNICODE
#define UNICODE
#endif
#endif
Compiler vendors have to prefix the identifiers in their header files with an underscore to prevent them from colliding with your identifiers. So <tchar.h>, a compiler header file, uses _UNICODE. The Windows SDK header files are compiler-agnostic, and stone-cold old; they use UNICODE without the underscore. You'll have to define both.
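For what it's worth, Visual Studio's "Use Unicode Character Set" project setting passes both defines for you; the hand-written equivalent on the MSVC command line would look something like this (main.cpp is a placeholder file name):

```shell
cl /D_UNICODE /DUNICODE /EHsc main.cpp
```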