Why does !{}[true] evaluate to true in JavaScript?

一个人的身影 2020-12-12 17:55

{}[true] is [true] and ![true] should be false.

So why does !{}[true] evaluate to true?

10 Answers
  • 2020-12-12 18:28

    I believe that's because plain {}[true] is parsed as an empty statement block (not an object literal) followed by an array containing true, which is true.

    On the other hand, applying the ! operator makes the parser interpret {} as an object literal, so {}[true] becomes a property access that returns undefined, and !{}[true] is indeed true (as !undefined is true).
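
    To make the two parses concrete, here is a minimal sketch; eval is used only so the completion value of the statement form can be observed, and some developer consoles wrap their input differently, so their output may vary:

    // Statement position: {} is parsed as an empty block, so only the
    // array literal [true] contributes a value.
    console.log(eval('{}[true]'));    // [ true ]

    // Parentheses force expression context: {} is now an object literal,
    // and looking up the "true" property finds nothing.
    console.log(eval('({}[true])'));  // undefined

    // The ! operator also forces expression context, and !undefined is true.
    console.log(!{}[true]);           // true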

  • 2020-12-12 18:28

    Because

    {}[true]
    

    evaluates to undefined, and !undefined is true.

    From @schlingel:

    true is used as the key and {} as a hash map. There is no property with the key true, so it returns undefined. Not undefined is true, as expected.
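
    A small sketch of that lookup (the object with a "true" key is hypothetical, just to show that the boolean key is coerced to the string "true"):

    // Property keys are coerced to strings, so obj[true] looks up obj["true"].
    var obj = { "true": 42 };      // hypothetical object that does have such a key
    console.log(obj[true]);        // 42
    console.log(({})[true]);       // undefined (the empty object has no "true" property)
    console.log(!({})[true]);      // true (!undefined is true)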

    Console session (Node.js [0.10.17]):

    > {}[true]
    undefined
    > !{}[true]
    true
    > [true]
    [ true ]
    > ![true]
    false
    >
    

    However, in the Google Chrome console:

    > !{}[true]
    true
    

    So, no inconsistencies. You're probably using an old version of the JavaScript VM. For those who need further evidence:

    [screenshot: Chrome console evaluating !{}[true] to true]

    UPDATE

    With Firefox, it also evaluates to true:

    [screenshot: Firefox console evaluating !{}[true] to true]

  • 2020-12-12 18:30

    Because {}[true] does not return true but undefined, and undefined is falsy (so !undefined is true):

    http://jsfiddle.net/67GEu/

    'use strict';
    var b = {}[true];
    alert(b); // undefined
    b = !{}[true];
    alert(b); // true
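
    As a rough illustration of that last step, undefined is one of JavaScript's falsy values, so negating it gives true:

    var value = ({})[true];       // undefined: no "true" property on the empty object
    console.log(Boolean(value));  // false, because undefined is falsy
    console.log(!value);          // true, negating a falsy value gives true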
    
  • 2020-12-12 18:30

    The reason for the confusion is down to a misunderstanding of your first assertion:

    {}[true] is [true]

    What you're seeing when you run it is the result of an ambiguity. JavaScript has a defined set of rules for handling ambiguities like this, and in this case it breaks what you see as a single statement down into two separate statements.

    So JavaScript sees the above code as two separate statements: first, there is a {}, and then there is an entirely separate [true]. The second statement is what is giving you the result [true]. The first statement, {}, is effectively ignored.

    You can prove this by trying the following:

    ({}[true])
    

    i.e. wrapping the whole thing in brackets to force the interpreter to read it as a single expression.

    Now you'll see that the actual value of your statement is undefined. (This will also help us understand the next part later.)

    Now we know that the initial part of your question is a red herring, so let's move onto the final part of the question:

    So why does !{}[true] evaluate to true?

    Here, we have the same statement, but with a ! prefixed to it.

    In this case, JavaScript's rules tell it to evaluate the entire thing as a single expression.

    Refer back to what happened when we wrapped the earlier statement in brackets: we got undefined. This time, we are effectively doing the same thing, but putting a ! in front of it. So your code can be simplified to !undefined, which is true.
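
    A rough step-by-step sketch of that simplification (the variable names are just for illustration):

    var lookup = ({})[true];  // the object literal has no "true" property
    console.log(lookup);      // undefined
    console.log(!lookup);     // true, equivalent to !undefined
    console.log(!{}[true]);   // true, the original expression parsed the same way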

    Hopefully that explains it a bit.

    It is a complex beast, but the lesson to learn here is to use brackets around your statements when evaluating them in the console, to avoid spurious results like this.
