When I used to program embedded systems and early 8/16-bit PCs (6502, 68K, 8086), I had a pretty good handle on exactly how long (in nanoseconds or microseconds) each instruction took.
It took almost 11 years, but I have an estimate. Your loop is about 10 ops * 100 million iterations, so approximately 1 billion ops. On a 2.3 GHz machine, dividing ops by clock rate gives roughly 10^9 / (2.3 × 10^9) ≈ 0.4 seconds. When I tested it, I actually got 1.2 seconds. So it's within one order of magnitude.
Just take your core frequency, estimate the number of ops, and divide the op count by the frequency. It's a very rough estimate, but whenever I've tested it empirically I've never been more than an order of magnitude off. Just make sure your op estimates are reasonable. A sketch of that sanity check follows below.
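Here is a minimal sketch in C of the estimate-then-measure check. The 10-ops-per-iteration figure, the 2.3 GHz clock, and the stand-in loop body are my own assumptions for illustration, not your actual code:

```c
/* Back-of-envelope check: estimate = ops / clock rate, then measure.
   Assumed numbers: ~10 simple ops per iteration, 2.3 GHz core. */
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <time.h>

#define ITERATIONS   100000000ULL  /* 100 million iterations           */
#define OPS_PER_ITER 10.0          /* rough guess at ops per iteration */
#define CLOCK_HZ     2.3e9         /* assumed core frequency, 2.3 GHz  */

int main(void) {
    /* The rule of thumb: total ops divided by clock frequency. */
    double estimate = (ITERATIONS * OPS_PER_ITER) / CLOCK_HZ;
    printf("estimated: %.2f s\n", estimate);

    /* Stand-in loop with a handful of cheap integer ops;
       acc is volatile so the loop isn't optimized away. */
    volatile unsigned long long acc = 0;
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (unsigned long long i = 0; i < ITERATIONS; i++) {
        acc += i ^ (i >> 3);
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double measured = (t1.tv_sec - t0.tv_sec)
                    + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf("measured:  %.2f s (acc=%llu)\n", measured, acc);
    return 0;
}
```

If the measured time lands within roughly a factor of ten of the estimate, the op count was reasonable; a bigger gap usually means the loop body is doing more (or less) work per iteration than you guessed.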