Determine optimization level in preprocessor?

Submitted by 十年热恋 on 2019-12-18 04:31:46

Question


-Og is a relatively new optimization option that is intended to improve the debugging experience while applying optimizations. If a user selects -Og, then I'd like my source files to activate alternate code paths to enhance the debugging experience. GCC offers the __OPTIMIZE__ preprocessor macro, but it's only set to 1 when optimizations are in effect.

Is there a way to learn the optimization level, like -O1, -O3 or -Og, for use with the preprocessor?


Answer 1:


I believe it is not possible to know directly the optimization level used to compile the software, as this is not among the predefined preprocessor symbols.

You could rely on -DNDEBUG (no debug), which is conventionally used to disable assertions in release code, and enable your "debug" code path when it is absent.

However, I believe a better approach is to have a project-wide set of symbols and let the user choose explicitly what to use:

  • MYPROJECT_DNDEBUG
  • MYPROJECT_OPTIMIZE
  • MYPROJECT_OPTIMIZE_AGGRESSIVELY

This makes debugging, and tracking differences in behavior between release and debug builds, much easier, since you can incrementally turn the different behaviors on and off.




Answer 2:


I don't know if this is a clever hack, but it is a hack.

$ gcc -Xpreprocessor -dM -E - < /dev/null > 1
$ gcc -Xpreprocessor -dM -O -E - < /dev/null > 2
$ diff 1 2
53a54
> #define __OPTIMIZE__ 1
68a70
> #define _FORTIFY_SOURCE 2
154d155
< #define __NO_INLINE__ 1

clang didn't produce the _FORTIFY_SOURCE define.



Source: https://stackoverflow.com/questions/31718637/determine-optimization-level-in-preprocessor
