Why do C#'s && and || operators work the way they do?

余生分开走 · 2020-12-06 14:25

Here is the tl;dr:

I come from a C++ background. && is supposed to check that the left side is true and the right side is true. What does & have to do with it?

7 Answers
  •  醉梦人生
    2020-12-06 14:47

    As I understand it, you would prefer a && b to be defined as something like ((bool)a) && ((bool)b), instead of what C# uses.

    But I think this kind of operator overloading was introduced to support tri-state bools such as bool? and DBBool.

    Let's define a few examples for such a type:

    With no short circuiting possible:

    null && true == null
    null && false == false
    null || true == true
    null || false == null
    

    With short circuiting possible:

    false && null == false
    true || null == true
    

    The basic idea here is to treat null as an unknown value: return null if the result is undetermined, and return a definite bool if the result doesn't change no matter what you substitute for the null argument.
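    This three-valued logic is not hypothetical: C# really does define lifted & and | operators on bool? with exactly these truth tables, while && and || are not permitted on bool? operands (a null operand can't supply the definite bool that short-circuiting requires). A minimal sketch:

```csharp
using System;

class NullableBoolDemo
{
    static void Main()
    {
        bool? u = null; // treat null as "unknown"

        // The lifted & and | on bool? implement the truth tables above.
        // (Printing via ?.ToString() ?? "null" so the null result is visible.)
        Console.WriteLine((u & true)?.ToString() ?? "null");  // null  - depends on the unknown
        Console.WriteLine((u & false)?.ToString() ?? "null"); // False - false regardless of u
        Console.WriteLine((u | true)?.ToString() ?? "null");  // True  - true regardless of u
        Console.WriteLine((u | false)?.ToString() ?? "null"); // null  - depends on the unknown
    }
}
```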

    Now you want to define a short-circuiting logical AND and OR on this type. If you do that using C#'s operator true and operator false, both of which return false for a null argument, you get the desired behavior. With a C-like behavior you don't.
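    The mechanism is that C# compiles x && y as roughly "operator false(x) ? x : (x & y)", so a user-defined type gets short-circuiting by declaring operator true, operator false, and operator &. A sketch with a hypothetical tri-state type (the name TriBool and its sbyte encoding are illustrative, not a BCL type):

```csharp
using System;

struct TriBool
{
    // Encoding assumption for this sketch: -1 = false, 0 = unknown, +1 = true.
    private readonly sbyte value;
    private TriBool(sbyte v) { value = v; }

    public static readonly TriBool True = new TriBool(1);
    public static readonly TriBool False = new TriBool(-1);
    public static readonly TriBool Unknown = new TriBool(0);

    // x && y is evaluated as: operator false(x) ? x : (x & y).
    // Both operators return false for Unknown, so Unknown never short-circuits.
    public static bool operator true(TriBool x) { return x.value > 0; }
    public static bool operator false(TriBool x) { return x.value < 0; }

    // With this encoding, three-valued AND is min and OR is max.
    public static TriBool operator &(TriBool x, TriBool y)
        => new TriBool(Math.Min(x.value, y.value));
    public static TriBool operator |(TriBool x, TriBool y)
        => new TriBool(Math.Max(x.value, y.value));

    public override string ToString()
        => value > 0 ? "True" : value < 0 ? "False" : "Unknown";
}

class Demo
{
    static TriBool Loud(TriBool v)
    {
        Console.WriteLine("right side evaluated");
        return v;
    }

    static void Main()
    {
        // operator false(False) is true, so && returns False
        // without evaluating the right-hand side at all.
        TriBool r1 = TriBool.False && Loud(TriBool.True);
        Console.WriteLine(r1); // False, and "right side evaluated" was NOT printed

        // operator false(Unknown) is false, so && falls back to Unknown & y,
        // which does evaluate the right-hand side.
        TriBool r2 = TriBool.Unknown && Loud(TriBool.True);
        Console.WriteLine(r2); // Unknown, after "right side evaluated"
    }
}
```

    Note how the null/Unknown truth tables above fall out of min/max, and the short-circuit cases fall out of operator true/false.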

    The C# designers probably didn't care about logical and/or on integers, as in your example. Integers are not boolean values and as such should not offer logical operators. That bool and int are the same thing is one of C's historical properties that a new language doesn't need to mirror. The distinction between bitwise and logical operators on ints only exists in C because of C's inability to distinguish booleans from integers; it is unnecessary in languages that do distinguish these types.

    Calling & a bitwise operation is misleading in C#. The essence of && vs & isn't logical vs bitwise AND; that isn't determined by which operator you use, but by which types you use it on. On logical types (bool, bool?, DBBool) both operators are logical, while on integer types & is bitwise and && doesn't make sense, since you can't short-circuit on integers. The essence of && vs & is that the first short-circuits and the second doesn't.
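    The type-driven behavior can be seen directly: on bool operands & is a logical AND that always evaluates both sides, && skips the right side when the left is false, and on int operands & is bitwise. A sketch with a side-effect counter to make evaluation order visible:

```csharp
using System;

class EagerVsShortCircuit
{
    public static int calls;
    public static bool Track(bool v) { calls++; return v; }

    static void Main()
    {
        calls = 0;
        bool a = Track(false) & Track(true);  // & on bool: logical AND, both sides run
        Console.WriteLine(calls);             // 2

        calls = 0;
        bool b = Track(false) && Track(true); // &&: short-circuits, right side skipped
        Console.WriteLine(calls);             // 1

        int bits = 0b1100 & 0b1010;           // & on int: bitwise AND
        Console.WriteLine(Convert.ToString(bits, 2)); // 1000
    }
}
```

    (Writing `0b1100 && 0b1010` would simply not compile, which is the point of the paragraph above: && is only defined where short-circuiting makes sense.)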

    And for the cases where the operators are defined at all, this coincides with the C interpretation. And since && isn't defined on integers (because that doesn't make sense under the C# interpretation of &&), your question of how && is evaluated on integers simply doesn't arise.
