We’ve found that the unit tests we’ve written for our C#/C++ code have really paid off. But we still have thousands of lines of business logic in stored procedures, which …
I'm in the exact same situation as the original poster. It comes down to performance versus testability. My bias is towards testability (make it work, make it right, make it fast), which suggests keeping business logic out of the database. Databases not only lack the testing frameworks, code-factoring constructs, and code analysis and navigation tools found in languages like Java; highly factored database code is also slow, where highly factored Java code is not.
However, I do recognize the power of database set processing. When used appropriately, SQL can do incredibly powerful things with very little code (a sketch of what I mean follows). So I'm OK with some set-based logic living in the database, even though I will still do everything I can to unit test it.
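As a minimal sketch of that set-based power, using hypothetical Orders/Customers tables: one UPDATE does what would otherwise take a cursor loop touching one row at a time.

```csharp
// Hypothetical example: a single set-based statement replaces a
// row-by-row cursor loop. Table and column names are made up.
using System.Data.SqlClient;

public static class DiscountUpdater
{
    // Applies a 10% discount to every open order belonging to a
    // "gold" customer, in one set-based round trip.
    public static int ApplyGoldDiscounts(string connectionString)
    {
        const string sql = @"
            UPDATE o
            SET    o.Total = o.Total * 0.90
            FROM   Orders o
            JOIN   Customers c ON c.CustomerId = o.CustomerId
            WHERE  c.Tier = 'gold' AND o.Status = 'open';";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            return command.ExecuteNonQuery(); // rows affected
        }
    }
}
```

Three lines of SQL do the real work; the procedural equivalent would be a loop with a fetch, a computation, and an update per row.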
On a related note, very long, procedural database code is often a symptom of something else, and I think such code can be converted to testable code without incurring a performance hit. The theory is that such code usually represents batch processes that periodically churn through large amounts of data. If those batch processes were converted into smaller chunks of real-time business logic that run whenever the input data changes, that logic could run on the middle tier (where it can be tested) without a performance hit, since the work is done in small chunks as the data arrives. A rough sketch of the idea follows. As a side effect, this also eliminates the long feedback loops of batch-process error handling. Of course this approach won't work in all cases, but it may work in some. And if your system contains tons of such untestable batch-processing database code, the road to salvation may be long and arduous. YMMV.
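Here is a minimal sketch of the batch-to-real-time conversion, with hypothetical names throughout: instead of a nightly stored procedure recalculating every account balance, a small handler runs each time one transaction is posted, and the pure business rule is isolated where a unit test can reach it.

```csharp
// Hypothetical example: a small real-time chunk of what used to be
// a batch job. All names (IAccountStore, BalanceUpdater) are made up.
public interface IAccountStore
{
    decimal GetBalance(int accountId);
    void SaveBalance(int accountId, decimal balance);
}

public class BalanceUpdater
{
    private readonly IAccountStore _store;

    public BalanceUpdater(IAccountStore store) => _store = store;

    // Called once per posted transaction, rather than once nightly
    // for every account in the system.
    public void OnTransactionPosted(int accountId, decimal amount)
    {
        var current = _store.GetBalance(accountId);
        _store.SaveBalance(accountId, ComputeNewBalance(current, amount));
    }

    // The pure business rule: trivially unit-testable, no database.
    public static decimal ComputeNewBalance(decimal current, decimal amount)
        => current + amount;
}
```

In a test you pass a fake IAccountStore, or skip the store entirely and assert on ComputeNewBalance directly, which is exactly the kind of testing the stored-procedure version makes hard.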