I believe that the usage of preprocessor directives like #if UsingNetwork is bad OO practice - other coworkers do not.
I think that, when using an IoC container, such directives shouldn't be necessary: different implementations can be configured and injected into the caller instead. Is there a statement on this from a well-known guru that I could point my coworkers to?
IMO it is important to differentiate between #if and #define. Both can be useful and both can be overused. My experience is that #define is more likely to be overused than #if.
I spent 10+ years doing C and C++ programming. In the projects I worked on (commercially available software for DOS / Unix / Macintosh / Windows) we used #if and #define primarily to deal with code portability issues.
I spent enough time working with C++ / MFC to learn to detest #define when it is overused - which I believe to be the case in MFC circa 1996.
I then spent 7+ years working on Java projects. I cannot say that I missed the preprocessor (although I most certainly did miss things like enumerated types and templates / generics which Java did not have at the time).
I've been working in C# since 2003. We have made heavy use of #if and [Conditional("DEBUG")] for our debug builds - but #if is just a more convenient and slightly more efficient way of doing the same things we did in Java.
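For illustration, here is a minimal sketch of that pattern (the class, method and message names are placeholders): a call to a method marked [Conditional("DEBUG")] is removed by the compiler - including its argument expressions - whenever the DEBUG symbol is not defined, which is what makes it slightly more efficient than checking a runtime flag.

using System;
using System.Diagnostics;

static class DebugLog
{
    // Calls to this method (and their argument expressions) are
    // compiled away entirely unless the DEBUG symbol is defined.
    [Conditional("DEBUG")]
    public static void Trace(string message)
    {
        Console.WriteLine("TRACE: " + message);
    }
}

class Engine
{
    public void Recalculate()
    {
        DebugLog.Trace("Recalculate started at " + DateTime.Now);
        // ... real work ...
    }
}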
Moving forward, we have started to prepare our core engine for Silverlight. While everything we are doing could be done without #if, it is less work with #if, which means we can spend more time adding features that our customers are asking for. For example, we have a value class which encapsulates a system color for storage in our core engine. Below are the first few lines of code. Because of the similarity between System.Drawing.Color and System.Windows.Media.Color, the conditional using alias at the top gets us a lot of functionality in normal .NET and in Silverlight without duplicating code:
using System;
using System.Collections.Generic;
using System.Text;
using System.Diagnostics;
#if SILVERLIGHT
using SystemColor = System.Windows.Media.Color;
#else
using SystemColor = System.Drawing.Color;
#endif
namespace SpreadsheetGear.Drawing
{
    /// <summary>
    /// Represents a Color in the SpreadsheetGear API and provides implicit conversion operators to and from System.Drawing.Color and / or System.Windows.Media.Color.
    /// </summary>
    public struct Color
    {
        public override string ToString()
        {
            //return string.Format("Color({0}, {1}, {2})", R, G, B);
            return _color.ToString();
        }

        public override bool Equals(object obj)
        {
            return (obj is Color && (this == (Color)obj))
                || (obj is SystemColor && (_color == (SystemColor)obj));
        }
        ...
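The remaining members are elided in the answer, but the XML comment above hints at why the alias pays off: a conversion operator written once against SystemColor compiles against System.Drawing.Color in regular .NET and against System.Windows.Media.Color in Silverlight. A hypothetical sketch of such an operator (not the actual SpreadsheetGear code):

#if SILVERLIGHT
using SystemColor = System.Windows.Media.Color;
#else
using SystemColor = System.Drawing.Color;
#endif

namespace Sketch.Drawing
{
    public struct Color
    {
        private readonly SystemColor _color;

        public Color(SystemColor color) { _color = color; }

        // One definition serves both platforms, because SystemColor
        // resolves to the platform-specific type at compile time.
        public static implicit operator SystemColor(Color color)
        {
            return color._color;
        }

        public static implicit operator Color(SystemColor color)
        {
            return new Color(color);
        }
    }
}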
The bottom line for me is that many language features can be overused, but that is not a good enough reason to leave those features out or to make strict rules prohibiting their use. I must say that moving to C# after programming in Java for so long helped me appreciate this, because Microsoft (Anders Hejlsberg) has been more willing to provide features which might not appeal to a college professor, but which make me more productive in my job - and ultimately enable me to build a better widget in the limited time anybody with a ship date has.
The support for preprocessing in C# is minimal... verging on useless. Is that Evil?
Does the preprocessor have anything to do with OO anyway? Surely it's for build configuration.
For instance, I have a lite version and a pro version of my app. I might want to exclude some code from the lite version without having to resort to maintaining two very similar copies of the source.
And I might not want to ship a lite version which is just the pro version with different runtime flags.
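A minimal sketch of what that could look like (the PRO_VERSION symbol and the feature names are invented; the symbol would be defined only in the pro build configuration):

public class Document { }

public class ReportService
{
    public void Export(Document doc)
    {
        ExportPdf(doc); // available in every edition

#if PRO_VERSION
        // Compiled into the pro build only - the lite binary does not
        // even contain this code, unlike a runtime feature flag.
        ExportExcel(doc);
#endif
    }

    private void ExportPdf(Document doc) { /* ... */ }

#if PRO_VERSION
    private void ExportExcel(Document doc) { /* ... */ }
#endif
}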
Tony
"The preprocessor is the incarnation of evil, and the cause of all pain on earth" -Robert (OO Guru)
One problem with preprocessor #ifdefs is that they effectively double the number of compiled versions that, in theory, you should test thoroughly so that you can say your delivered code is correct.
#ifdef DEBUG
//...
#else
//...
#endif
Ok, now I can produce the "Debug" version and the "Release" version. This is fine by me; I always do it, because I have assertions and debug traces which are only executed in the debug version.
If someone comes along and writes (real-life example):
#ifdef MANUALLY_MANAGED_MEMORY
...
and propagates that pet optimization to four or five different classes, then suddenly you have FOUR possible ways to compile your code.
If you add just one more piece of #ifdef-dependent code, you'll have EIGHT possible versions to generate - each independent symbol doubles the count - and, what's more disturbing, FOUR of them will be possible release versions.
Of course runtime if()s, like loops and whatever, create branches that you have to test - but I find it much more difficult to guarantee that every compile-time variation of the configuration remains correct.
This is the reason why I think, as a policy, all #ifdefs except the one for the Debug/Release distinction should be temporary - i.e. you're doing an experiment in development code, and you'll decide soon whether it stays one way or the other.
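To make the trade-off concrete, here is a minimal C# sketch (class and flag names invented) of turning such a compile-time switch into a runtime one: both paths now exist in a single compiled binary, so one build can be tested for both behaviors.

public class BufferPool
{
    private readonly bool _manuallyManaged;

    public BufferPool(bool manuallyManaged)
    {
        _manuallyManaged = manuallyManaged;
    }

    public byte[] Rent(int size)
    {
        // A runtime branch instead of #ifdef MANUALLY_MANAGED_MEMORY:
        // both paths are compiled into every build and can be
        // unit-tested without recompiling.
        if (_manuallyManaged)
            return RentFromPool(size);

        return new byte[size];
    }

    private byte[] RentFromPool(int size)
    {
        // pooled allocation elided for brevity
        return new byte[size];
    }
}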
I have no guru statement regarding the usage of preprocessor directives in mind and cannot add a reference to a famous one. But I want to give you a link to a simple sample found in Microsoft's MSDN documentation.
#define A
#undef B

class C
{
#if A
    void F() {}
#else
    void G() {}
#endif

#if B
    void H() {}
#else
    void I() {}
#endif
}
This will result in the simple

class C
{
    void F() {}
    void I() {}
}
and I think it is not very easy to read, because you have to look at the top to see which symbols are defined at exactly this point. It gets even more complex if the symbols are defined elsewhere, e.g. in the project's build settings.
For me it looks much simpler to create different implementations and inject them into the caller instead of switching defines to create "new" class definitions (... and because of this I understand why you compare the usage of preprocessor definitions with the usage of IoC). Besides the poor readability of code using preprocessor instructions, I rarely use preprocessor definitions, because they increase the complexity of testing your code: they result in multiple code paths (but this is also a problem when multiple implementations are injected by an external IoC container).
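A minimal sketch of that alternative, with invented interface and class names: instead of an #if UsingNetwork block inside the caller, the caller depends on an interface, and the composition root (or an IoC container) decides which implementation to inject.

public interface IDataService
{
    string Load(string key);
}

public class LocalDataService : IDataService
{
    public string Load(string key) { return "local:" + key; }
}

public class NetworkDataService : IDataService
{
    public string Load(string key) { return "network:" + key; }
}

public class Consumer
{
    private readonly IDataService _service;

    // The implementation is chosen once by configuration at runtime,
    // not by a compile-time symbol.
    public Consumer(IDataService service)
    {
        _service = service;
    }

    public string Run() { return _service.Load("settings"); }
}

The composition root picks the implementation once - e.g. new Consumer(new NetworkDataService()) - and the rest of the code never needs to know which one it got.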
Microsoft itself has used a lot of preprocessor definitions within the Win32 API, and you might know/remember the ugly switching between the char and wchar_t variants of its method calls.
Maybe you should not say "Don't use it." Tell them "how to use it" and "when to use it" instead. I think everyone will agree with you if you come up with good (more understandable) alternatives and can describe the risks of using preprocessor defines/macros.
No need for a guru... just be a guru. ;-)
Bjarne Stroustrup provides his answer (in general, not specific to IoC) here:
So, what's wrong with using macros?
(excerpt)
Macros do not obey the C++ scope and type rules. This is often the cause of subtle and not-so-subtle problems. Consequently, C++ provides alternatives that fit better with the rest of C++, such as inline functions, templates, and namespaces.