I'm curious as to whether or not there is a real difference between the money datatype and something like decimal(19,4) (which is what money uses internally, apparently).
All the previous answers make valid points, but some don't answer the question precisely.
The question is: Why would someone prefer money when we already know it is a less precise data type and can cause errors if used in complex calculations?
You use money when you won't be doing complex calculations and can trade that precision away for other needs.
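To make that trade-off concrete, here is a minimal sketch of the classic failure case (MONEY arithmetic keeps only 4 decimal places in intermediate results, so a divide-then-multiply drifts):

DECLARE @mon1 MONEY = 100, @mon2 MONEY = 339, @mon3 MONEY = 10000;
DECLARE @num1 DECIMAL(19,4) = 100, @num2 DECIMAL(19,4) = 339, @num3 DECIMAL(19,4) = 10000;

-- The intermediate @mon1/@mon2 is truncated to 4 decimal places (0.2949),
-- so the MONEY result comes out as 2949.0000 instead of about 2949.8525.
SELECT @mon1 / @mon2 * @mon3 AS money_result,
       @num1 / @num2 * @num3 AS decimal_result;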
For example, when you don't need those calculations but do need to import data from valid currency text strings. This automatic conversion works only with the MONEY data type:
SELECT CONVERT(MONEY, '$1,000.68')
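The same string fails to convert directly to DECIMAL, which rejects the currency symbol and the thousands separator:

-- Fails with a "varchar to numeric" conversion error:
-- SELECT CONVERT(DECIMAL(19,4), '$1,000.68');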
I know you can write your own import routine. But sometimes you don't want to recreate an import routine that handles locale-specific currency formats from around the world (a hand-rolled version is sketched below).
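For one known, fixed format, such a routine can be as small as stripping the symbol and separators — a sketch that only covers US-style strings and breaks for other locales (e.g. '1.000,68'):

-- Strip the '$' and ',' by hand, then convert to DECIMAL:
SELECT CONVERT(DECIMAL(19,4),
       REPLACE(REPLACE('$1,000.68', ',', ''), '$', ''));  -- 1000.6800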
Another example: when you don't need those calculations (you just need to store a value) and want to save a byte (MONEY takes 8 bytes while DECIMAL(19,4) takes 9). In some workloads (fast CPU, plenty of RAM, slow I/O), such as just reading a huge amount of data, this can be faster too.
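You can check the storage sizes directly with DATALENGTH (a quick sketch; the variable names are arbitrary):

DECLARE @m MONEY = 1000.68;
DECLARE @d DECIMAL(19,4) = 1000.68;

-- DATALENGTH reports the bytes each value occupies: 8 for MONEY, 9 for DECIMAL(19,4).
SELECT DATALENGTH(@m) AS money_bytes,
       DATALENGTH(@d) AS decimal_bytes;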