Is there a way to disable implicit casts from UInt32 to char?

Backend · open · 4 answers · 1727 views

Asked by 隐瞒了意图╮ on 2021-01-18 03:03

I am working on code that takes as input a ton of ASCII text defined by a specific protocol. The original author interpreted "string(1)" datatypes in the original protocol

4 Answers
  •  渐次进展
    2021-01-18 03:49

    As a "fix once" solution, I think James Michael Hare's FxCop suggestion is the easiest.

    To keep it from becoming a problem again in the future, it may be a good idea to refactor to a custom datatype instead of char, so you can define exactly which operations are available.
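
    The original question likely concerns .NET given the FxCop mention, but the wrapper-type idea the answer describes can be sketched in C++ as well. The type name `ProtocolChar` below is hypothetical; the point is that an `explicit` constructor plus a deleted overload for the unsigned integer type rejects the unwanted conversion at compile time, while leaving the operations you do define available:

    ```cpp
    #include <cassert>
    #include <cstdint>

    // Hypothetical wrapper type: only the operations defined here are available.
    struct ProtocolChar {
        // explicit: blocks implicit conversion from char (and anything
        // that would first convert to char).
        explicit ProtocolChar(char c) : value(c) {}

        // Deleting the uint32_t overload rejects construction from
        // unsigned 32-bit integers entirely, even with an explicit cast.
        ProtocolChar(std::uint32_t) = delete;

        char get() const { return value; }

        bool operator==(const ProtocolChar& other) const {
            return value == other.value;
        }

    private:
        char value;
    };

    int main() {
        ProtocolChar a{'A'};
        assert(a.get() == 'A');
        assert(ProtocolChar{'B'} == ProtocolChar{'B'});

        // ProtocolChar b = 'A';  // error: constructor is explicit
        // ProtocolChar c{65u};   // error: uint32_t overload is deleted

        return 0;
    }
    ```

    A sketch like this turns the silent narrowing conversion into a hard compile error, which is the "define the exact operations you want" approach the answer recommends.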
