Optimization is tricky. Consider the following examples:
- Deciding to implement two servers, each doing its own job, instead of a single server that does both.
- Deciding to go with one DBMS over another, for performance reasons.
- Deciding to use a specific, non-portable API when there is a standard one (e.g., using Hibernate-specific functionality when all you really need is standard JPA), for performance reasons.
- Coding something in assembly for performance reasons.
- Unrolling loops for performance reasons (see the sketch after this list).
- Writing a very fast but obscure piece of code.
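To make the last two items concrete, here is a minimal sketch of hand-unrolling in Java. The class and method names (`UnrollDemo`, `sumSimple`, `sumUnrolled`) and the array size are made up for illustration; note that modern JIT compilers often unroll loops for you, which is part of why doing it by hand is usually premature.

```java
public class UnrollDemo {
    // The clear version: one addition per iteration.
    static long sumSimple(int[] a) {
        long sum = 0;
        for (int i = 0; i < a.length; i++) {
            sum += a[i];
        }
        return sum;
    }

    // The hand-unrolled version: four additions per iteration, plus a
    // cleanup loop for the leftover elements. Faster in theory, but
    // harder to read and easier to get wrong.
    static long sumUnrolled(int[] a) {
        long sum = 0;
        int i = 0;
        for (; i + 3 < a.length; i += 4) {
            sum += a[i] + a[i + 1] + a[i + 2] + a[i + 3];
        }
        for (; i < a.length; i++) { // remaining 0-3 elements
            sum += a[i];
        }
        return sum;
    }

    public static void main(String[] args) {
        int[] data = new int[1_000_003];
        java.util.Arrays.fill(data, 1);
        System.out.println(sumSimple(data));   // 1000003
        System.out.println(sumUnrolled(data)); // same result, murkier code
    }
}
```

Both methods return the same answer; the unrolled one is simply the "very fast but obscure" variety of code.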
My bottom line here is simple: optimization is a broad term. When people warn against premature optimization, they don't mean you should do the first thing that comes to mind without considering the complete picture. They are saying you should:
- Follow the 80/20 rule - don't try to handle ALL the possible cases, just the most probable ones.
- Don't over-design stuff without any good reason.
- Don't sacrifice clear, simple, easily maintainable code unless there is a real, immediate performance problem.
It really all boils down to your experience. If you are an expert in image processing and someone asks you to do something you have done ten times before, you will probably push all your known optimizations in from the beginning, and that is fine. Premature optimization is optimizing something when you don't yet know it needs optimizing. The reason to avoid it is simple: it's risky, it wastes your time, and the result is less maintainable. So unless you're experienced and have been down that road before, don't optimize if you don't know there's a problem.
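And here is what "knowing there's a problem" can look like at its simplest, reusing the hypothetical `UnrollDemo` class from the earlier sketch. `System.nanoTime` is a crude instrument (a proper harness like JMH controls for JIT and GC effects), but even a crude measurement beats assuming the obscure version is worth it.

```java
public class MeasureFirst {
    public static void main(String[] args) {
        int[] data = new int[10_000_000];
        java.util.Arrays.fill(data, 1);

        // Warm up so the JIT compiles both methods before we time them.
        for (int i = 0; i < 10; i++) {
            UnrollDemo.sumSimple(data);
            UnrollDemo.sumUnrolled(data);
        }

        long t0 = System.nanoTime();
        long s1 = UnrollDemo.sumSimple(data);
        long t1 = System.nanoTime();
        long s2 = UnrollDemo.sumUnrolled(data);
        long t2 = System.nanoTime();

        System.out.printf("simple:   %d in %d us%n", s1, (t1 - t0) / 1_000);
        System.out.printf("unrolled: %d in %d us%n", s2, (t2 - t1) / 1_000);
    }
}
```

If the numbers come back essentially equal - and with a decent JIT they often will - the unrolled version bought you nothing but obscurity.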