There is something I cannot understand in C#: you can cast an out-of-range int to an enum and the compiler does not flinch. Imagine this:
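Something along these lines (a minimal sketch; the Colour enum here is an assumption for illustration):

```csharp
enum Colour { Red = 1, Green = 2, Blue = 3 }

class Program
{
    static void Main()
    {
        // 17 matches no member of Colour, yet this compiles and runs without error
        Colour c = (Colour)17;
        System.Console.WriteLine(c); // prints "17"
    }
}
```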
Guessing about 'why' is always dangerous, but consider this:
enum Direction { North = 1, East = 2, South = 4, West = 8 }
Direction ne = Direction.North | Direction.East;
int value = (int) ne; // value == 3
string text = ne.ToString(); // text == "3"
When the [Flags] attribute is applied to the enum, that last line changes to
string text = ne.ToString(); // text == "North, East"
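Put together, a complete runnable version of the sketch above:

```csharp
using System;

[Flags]
enum Direction { North = 1, East = 2, South = 4, West = 8 }

class Program
{
    static void Main()
    {
        Direction ne = Direction.North | Direction.East;
        Console.WriteLine((int)ne);       // 3
        Console.WriteLine(ne.ToString()); // "North, East" because of [Flags]
    }
}
```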
You don't need to deal with exceptions. The precondition for the method is that callers should use the enum, not cast any willy-nilly int to said enum. That would be madness. Isn't the point of enums not to use the ints?
Any dev who would cast 17 to the Colour enum would need 17 kicks up their backside as far as I'm concerned.
I certainly see Cesar's point, and I remember this initially confused me too. In my opinion enums, in their current implementation, are indeed a little too low level and leaky. It seems to me that there are two possible solutions to the problem.
1) Only allow arbitrary values to be stored in an enum if its definition had the FlagsAttribute. This way, we can continue to use them for a bitmask when appropriate (and explicitly declared), but when used simply as placeholders for constants we would get the value checked at runtime.
2) Introduce a separate primitive type called, say, bitmask, that would allow any ulong value. Again, we restrict standard enums to declared values only. This would have the added benefit of allowing the compiler to assign the bit values for you. So this:
[Flags]
enum MyBitmask
{
    FirstValue = 1, SecondValue = 2, ThirdValue = 4, FourthValue = 8
}
would be equivalent to this:
bitmask MyBitmask
{
    FirstValue, SecondValue, ThirdValue, FourthValue
}
After all, the values for any bitmask are completely predictable, right? As a programmer, I am more than happy to have this detail abstracted away.
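There is no such bitmask type, of course, but in current C# you can at least avoid writing the powers of two by hand, using shift expressions (a sketch):

```csharp
using System;

[Flags]
enum MyBitmask
{
    FirstValue  = 1 << 0,  // 1
    SecondValue = 1 << 1,  // 2
    ThirdValue  = 1 << 2,  // 4
    FourthValue = 1 << 3,  // 8
}

class Program
{
    static void Main()
    {
        Console.WriteLine((int)MyBitmask.FourthValue);                        // 8
        Console.WriteLine(MyBitmask.FirstValue | MyBitmask.ThirdValue);       // "FirstValue, ThirdValue"
    }
}
```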
Still, too late now, I guess we're stuck with the current implementation forever. :/
if (!Enum.IsDefined (typeof (Colour), 17))
{
    // Do something
}
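For instance (a sketch, assuming a Colour enum and a hypothetical SetColour method), the check can guard a public entry point:

```csharp
using System;

enum Colour { Red = 1, Green = 2, Blue = 3 }

class Widget
{
    // Hypothetical guard: reject ints cast to values Colour does not define
    public void SetColour (Colour colour)
    {
        if (!Enum.IsDefined (typeof (Colour), colour))
        {
            throw new ArgumentOutOfRangeException (nameof (colour));
        }
        // ... use colour safely ...
    }
}
```

Bear in mind that Enum.IsDefined works via reflection, so it is often kept out of hot paths.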
Short version:
Don't do this.
Trying to convert between ints and enums while only allowing valid values (maybe with a default fallback) requires helper methods. At that point you don't have an enum -- you really have a class.
Doubly so if the ints are important -- like Bryan Rowe said.
Not sure about the why, but I recently found this "feature" incredibly useful. I wrote something like this the other day:
// a simple enum
public enum TransmissionStatus
{
    Success = 0,
    Failure = 1,
    Error = 2,
}

// a consumer of the enum
public class MyClass
{
    public void ProcessTransmissionStatus (TransmissionStatus status)
    {
        ...
        // an exhaustive switch statement, but only
        // as long as the enum remains the same
        switch (status)
        {
            case TransmissionStatus.Success: ... break;
            case TransmissionStatus.Failure: ... break;
            case TransmissionStatus.Error: ... break;
            // should never be reached, unless the enum is
            // extended - which is entirely possible!
            // remember, code defensively! future proof!
            default:
                throw new NotSupportedException ();
        }
        ...
    }
}
The question is, how do I test that last case clause? It is completely reasonable to assume someone may extend TransmissionStatus and not update its consumers, like poor little MyClass above. Yet, I would still like to verify its behaviour in this scenario. One way is to use casting, such as:
[Test]
[ExpectedException (typeof (NotSupportedException))]
public void Test_ProcessTransmissionStatus_ExtendedEnum ()
{
    MyClass myClass = new MyClass ();
    myClass.ProcessTransmissionStatus ((TransmissionStatus) 10);
}