In C#, the following type-inference works:
var s = "abcd";
But why can't the type be inferred when the variable is a constant?
While I disagree with Mr. Lippert's reasoning, there is a good reason not to allow implicit typing of named constants: consider what the following code would mean if constants did not have to specify their type explicitly:
const var ScaleFactor = 2500000000; // Type 'Int64'
...
int thisValue = getNextInt();
total += thisValue * ScaleFactor;
Now suppose that the scale factor needs to be notched down by 20%. What would be the effect of changing the value to 2000000000? The problem of an Int64 silently becoming an Int32 would exist even if the value were specified inline (e.g. when changing total += thisValue * 2500000000; to total += thisValue * 2000000000;), but there the change would at least be adjacent to the code that requires the value to be an Int64. By contrast, a const declaration would likely be far removed from the code it affects, so there would be no visible way of knowing whether code somewhere might rely upon the constant being a long type.
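To make the hazard concrete, here is a minimal sketch (the names ScaleDemo, LongScale, and IntScale are hypothetical, chosen just for illustration). With an explicit const long, the multiplication is promoted to Int64 arithmetic; if inference instead picked int for a value that happens to fit in Int32, the same expression would be int * int and would silently wrap in an unchecked context:

```csharp
using System;

class ScaleDemo
{
    // Exceeds Int32.MaxValue, so this constant must be (or would be inferred as) long.
    const long LongScale = 2500000000;

    // Fits in Int32, so hypothetical inference would pick int here.
    const int IntScale = 2000000000;

    static void Main()
    {
        int thisValue = 3;

        // int * long promotes to long arithmetic: no overflow.
        long scaledLong = thisValue * LongScale;
        Console.WriteLine(scaledLong);   // 7500000000

        // int * int stays int: in an unchecked context the result
        // silently wraps instead of being 6000000000.
        long scaledInt = thisValue * IntScale;
        Console.WriteLine(scaledInt);    // 1705032704
    }
}
```

Nothing at the call site changed between the two multiplications; only the constant's type did, which is exactly the kind of distant, invisible change the answer warns about.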