What are the differences between this line:
var a = parseInt("1", 10); // a === 1
and this line:
var a = +"1"; // a === 1
I believe the table in thg435's answer is comprehensive, but we can summarize the differences with the following patterns (illustrated by the snippet after the list):
- Unary plus converts true to 1, but the string "true" to NaN.
- parseInt is more liberal for strings that are not pure digits: parseInt('123abc') === 123, whereas + reports NaN.
- Number (and therefore unary plus) accepts valid decimal numbers, whereas parseInt simply drops everything after the decimal point. In this respect parseInt mimics C behavior, but it is perhaps not ideal for evaluating user input.
- parseInt, being a badly designed parser, accepts octal and hexadecimal input. Unary plus only takes hexadecimal.
- Falsy values convert to Number the way that would make sense in C: null and false are both zero. "" going to 0 does not quite follow this convention, but it makes enough sense to me.
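A quick sketch of these differences, with the expected results in the comments (assuming a modern, ES5-or-later engine):

// Booleans: unary plus converts true itself; parseInt sees the string "true"
+true;                  // 1
+"true";                // NaN
parseInt(true, 10);     // NaN ("true" is not a digit string)

// Trailing garbage: parseInt stops at the first non-digit, + rejects the whole string
parseInt("123abc", 10); // 123
+"123abc";              // NaN

// Decimals: + keeps the fraction, parseInt drops everything after the decimal point
+"1.5";                 // 1.5
parseInt("1.5", 10);    // 1

// Radix prefixes: both accept hexadecimal; parseInt also takes an explicit radix
parseInt("0x1A");       // 26
+"0x1A";                // 26
parseInt("10", 8);      // 8 (octal via radix; some older engines also auto-detected a leading 0)

// Falsy values: + maps them to 0 (C-style), parseInt gives NaN
+null;                  // 0
+false;                 // 0
+"";                    // 0
parseInt(null, 10);     // NaN
parseInt("", 10);       // NaN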
Therefore I think that if you are validating user input, unary plus has the correct behavior for everything except that it accepts decimals (though in my real-life cases I am more interested in catching an email address entered as a userId, a value omitted entirely, and so on), whereas parseInt is too liberal.
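As a sketch of that idea (parseUserId is a hypothetical helper, not something from the question), a strict integer check could build on unary plus and reject the decimal and empty cases explicitly:

function parseUserId(input) {
    // Reject omitted values and blank strings, which + would coerce to 0
    if (input == null || (typeof input === "string" && input.trim() === "")) {
        return NaN;
    }
    var n = +input;  // NaN for "42abc", "email@example.com", anything not a clean number
    // Only accept finite whole numbers, closing the "accepts decimals" gap
    return (isFinite(n) && Math.floor(n) === n) ? n : NaN;
}

parseUserId("42");    // 42
parseUserId("42abc"); // NaN (parseInt would happily return 42)
parseUserId("4.2");   // NaN (bare unary plus would return 4.2)
parseUserId("");      // NaN (bare unary plus would return 0)

Math.floor(n) === n is one way to enforce whole numbers; Number.isInteger(n) is the ES2015 equivalent.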