A decade or two ago, it was worthwhile to write numerical code that avoided multiplies and divides and used addition and subtraction instead. A good example is forward differencing, which evaluates a polynomial at evenly spaced points using only additions after a one-time setup.
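As a minimal sketch of the forward-differencing idea (my own illustrative example, assuming a quadratic p(x) = a*x^2 + b*x + c evaluated at x0, x0+h, x0+2h, ...): after computing the initial value and the first and second forward differences, each subsequent point costs just two additions.

```python
def forward_difference_eval(a, b, c, x0, h, n):
    """Evaluate p(x) = a*x**2 + b*x + c at x0, x0+h, ..., x0+(n-1)*h
    using only additions inside the loop (forward differencing)."""
    p = a * x0 * x0 + b * x0 + c            # p(x0): multiplies happen once, at setup
    d1 = a * (2 * x0 * h + h * h) + b * h   # first difference: p(x0+h) - p(x0)
    d2 = 2 * a * h * h                      # second difference: constant for a quadratic
    out = []
    for _ in range(n):
        out.append(p)
        p += d1     # advance the polynomial value
        d1 += d2    # advance the first difference
    return out

# Matches direct evaluation of 2x^2 - 3x + 1 at 0, 0.5, 1.0, 1.5, 2.0:
print(forward_difference_eval(2.0, -3.0, 1.0, 0.0, 0.5, 5))
# → [1.0, 0.0, 0.0, 1.0, 3.0]
```

On old hardware this traded one multiply per point for two adds, which was a win; on a modern pipelined FPU the direct evaluation is often just as fast.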
I can't find a definitive reference, but extensive experimentation tells me that float multiplication nowadays is about the same speed as addition and subtraction, while division is slower -- though not "many times" slower, either. The only way to get the intuition you want is to run your own experiments. Generate the random operands (millions of them) in advance, read them before you start timing, and measure with the CPU's performance counters, with as few other processes running as you can manage.
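A rough sketch of that experimental setup in Python (illustrative of the methodology only -- interpreter overhead swamps the actual FPU timing difference, so for real numbers you'd redo this in compiled code and read the hardware performance counters, e.g. via `perf stat`):

```python
import random
import time

def bench(op, xs, ys):
    """Time one pass of a binary float op over pre-generated operands."""
    acc = 0.0
    start = time.perf_counter()          # timing starts AFTER operand generation
    for x, y in zip(xs, ys):
        acc += op(x, y)                  # accumulate so the work can't be skipped
    return time.perf_counter() - start

N = 1_000_000
# Generate the random numbers in advance, outside the timed region,
# and keep them away from zero so division is safe.
xs = [random.uniform(1.0, 2.0) for _ in range(N)]
ys = [random.uniform(1.0, 2.0) for _ in range(N)]

add_t = bench(lambda a, b: a + b, xs, ys)
mul_t = bench(lambda a, b: a * b, xs, ys)
div_t = bench(lambda a, b: a / b, xs, ys)
print(f"add: {add_t:.3f}s  mul: {mul_t:.3f}s  div: {div_t:.3f}s")
```

The structural points are the ones from the answer: operands are generated and touched before timing begins, the same data feeds every operator, and each loop does identical work apart from the operation under test.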