Why is #define bad and what is the proper substitute?

Backend · open · 6 answers · 1149 views
情深已故 · 2020-11-30 12:37
#define dItemName        L"CellPhone"
6 Answers
  •  陌清茗 (OP)
     2020-11-30 13:24

    Amusingly, I could not find a single question pointing out all the disadvantages, even though the subject has certainly been discussed before.

    First of all, note that in C (not C++) this is the way to declare a constant. This also explains why so many C++ developers still use it: when they come from a C background, or have been taught by / learned from people with a C background, they tend to reproduce this C-ish habit.

    In C++, however, we have superior facilities.

    #define does not define a constant, it defines a macro

    1. A macro knows no scope
    2. A macro is not type safe

    A macro knows no scope:

    They are preprocessing facilities: the preprocessor is not aware of the rules of the underlying language (whether asm, C or C++) and will always expand the symbols it has in stock with no regard for scope.

    For this reason, it is usually recommended to use a specific set of symbols to set macros apart. People generally use ALL_CAPS symbols, though you need to remember that:

    • they should not contain two consecutive underscores
    • they should not begin with an underscore

    in order to be compliant with the C++ standard.

    A macro is not type safe:

    As I said, the preprocessor ignores the rules of the underlying language, so the following does not strike it as strange:

    #define FOO "foo"
    
    int main(int argc, char* argv[])
    {
      if (FOO) { /* always taken: the literal decays to a non-null pointer */ }
    
      return 0;
    }
    

    On the other hand, using a proper type would prevent this unintentional mistake:

    std::string const Foo = "foo";  // now "if (Foo)" fails to compile: no conversion to bool
    

    Conclusion?

    You can use a #define if you wish; you will just be doing extra work that the compiler could do for you, but that's your call. Personally: I am lazy :)
