I'm curious as to whether or not there is a real difference between the money datatype and something like decimal(19,4) (which is what money uses internally).
Everything is dangerous if you don't know what you are doing.
Even high-precision decimal types can't save the day:
declare @num1 numeric(38,22)
declare @num2 numeric(38,22)
set @num1 = .0000006
set @num2 = 1.0
select @num1 * @num2 * 1000000
-- Returns 1.000000, but the correct answer is 0.600000
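The reason is SQL Server's type rules for intermediate results: multiplying two numeric(38,22) values would call for scale 44, but precision is capped at 38, so the product's scale is truncated (down to 6 here), and 0.0000006 has already been rounded to 0.000001 before the final multiplication by 1000000. A minimal sketch to confirm this, using SQL_VARIANT_PROPERTY to inspect the product's type:

declare @a numeric(38,22)
declare @b numeric(38,22)
set @a = 0.0000006
set @b = 1.0
-- The naive result type would be numeric(77,44); SQL Server caps
-- precision at 38 and sacrifices scale to preserve the integral part
select sql_variant_property(@a * @b, 'Precision') as product_precision, -- 38
       sql_variant_property(@a * @b, 'Scale') as product_scale          -- 6, not 44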
The money types are integers
The text representations of smallmoney and decimal(10,4) may look alike, but that doesn't make them interchangeable. Do you cringe when you see dates stored as varchar(10)? This is the same thing.
Behind the scenes, money/smallmoney are just a bigint/int. The decimal point in the text representation of money is visual fluff, just like the dashes in a yyyy-mm-dd date. SQL Server doesn't actually store those internally.
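You can see this directly. A quick sketch (relying on SQL Server's explicit money-to-binary conversion, which exposes the raw storage bytes): a money value casts to the same binary(8) as the bigint holding the value multiplied by 10,000.

declare @m money
set @m = 1.0
select cast(@m as binary(8))                     -- 0x0000000000002710
select cast(cast(10000 as bigint) as binary(8))  -- 0x0000000000002710, identical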
Regarding decimal vs money, pick whatever is appropriate for your needs. The money types exist because storing accounting values as integer multiples of 1/10000th of a unit is very common. Also, if you are dealing with actual money and calculations beyond simple addition and subtraction, you shouldn't be doing that at the database level! Do it at the application level with a library that supports banker's rounding (round-half-to-even, the IEEE 754 default).
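To see why that advice matters, here is a sketch of the kind of error money arithmetic can produce: dividing two money values is itself carried out as money, so the intermediate quotient keeps only four decimal places.

declare @mon1 money
declare @mon2 money
declare @mon3 money
declare @mon4 money
set @mon1 = 100
set @mon2 = 339
set @mon3 = 10000
set @mon4 = @mon1 / @mon2 * @mon3
select @mon4
-- Returns 2949.0000; the exact answer is 2949.8525..., so rounding the
-- intermediate quotient 100/339 to four decimal places cost nearly a full unit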