When I used to program embedded systems and early 8/16-bit PCs (6502, 68K, 8086) I had a pretty good handle on exactly how long (in nanoseconds or microseconds) each instruction took to execute.
It is nearly impossible to provide the kind of accurate timing information you are expecting in a way that will be USEFUL to you.
The following concepts affect instruction timing; some can vary from moment to moment:

- Pipelining and superscalar execution (several instructions in flight at once)
- Out-of-order and speculative execution
- Branch prediction, and the cost of mispredictions
- The cache hierarchy and main-memory latency: the "same" load can take a few cycles or a few hundred
- TLB misses and other virtual-memory effects
- Dynamic frequency scaling (turbo boost, thermal throttling)
- Competition from other processes, interrupts, and simultaneous multithreading
Consult a book on modern computer architecture if you need further explanation of any of the above concepts.
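
To get a feel for how much these effects matter, here is a minimal C sketch (the buffer size, stride, and iteration count are arbitrary values picked purely for illustration) that does the same number of additions twice: once walking memory sequentially, where the caches and prefetcher help, and once with a large stride that defeats them. On most machines the second run is several times slower even though the instruction mix is essentially identical.

```c
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Wall-clock time in seconds. */
static double now(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

/* Add up 'touches' elements of buf, stepping by 'stride' and wrapping. */
static long sum_with_stride(const long *buf, size_t len, size_t stride,
                            size_t touches)
{
    long total = 0;
    size_t idx = 0;
    for (size_t i = 0; i < touches; i++) {
        total += buf[idx];
        idx += stride;
        if (idx >= len)
            idx -= len;
    }
    return total;
}

int main(void)
{
    const size_t len = 1u << 24;     /* 16M longs: far larger than any cache */
    const size_t touches = 1u << 24; /* identical amount of work in both runs */

    long *buf = malloc(len * sizeof *buf);
    if (!buf)
        return 1;
    for (size_t i = 0; i < len; i++)
        buf[i] = 1;                  /* contents don't matter, only the access pattern */

    double t0 = now();
    long a = sum_with_stride(buf, len, 1, touches);    /* cache-friendly walk */
    double t1 = now();
    long b = sum_with_stride(buf, len, 1021, touches); /* cache-hostile walk */
    double t2 = now();

    /* Printing the sums keeps the compiler from discarding the loops. */
    printf("sequential: %.3f s   strided: %.3f s   (sums %ld, %ld)\n",
           t1 - t0, t2 - t1, a, b);
    free(buf);
    return 0;
}
```

Whether the hardware happens to have your data in cache at the moment you ask for it is exactly the kind of thing that changes from run to run.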
The best way to measure the speed of your code is (surprise!) to measure the speed of your code, running the same workload and under the same conditions as you expect it to face "in the real world".
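
As a concrete starting point, a timing harness might look like the sketch below; process_batch() is a placeholder name invented here to stand in for your real workload, and the warm-up and repetition counts are arbitrary. Swap in your actual code and your actual input data, run it on the machine and under the load you care about, and look at several repetitions rather than a single number.

```c
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <time.h>

/* Placeholder workload: replace with the real code, running on the real
 * input data, that you actually want to measure. */
static void process_batch(void)
{
    volatile double x = 0.0;
    for (int i = 0; i < 1000000; i++)
        x += i * 0.5;
}

int main(void)
{
    const int warmup = 3;  /* let caches, branch predictors, etc. settle */
    const int runs = 10;   /* repeat to smooth out moment-to-moment noise */
    struct timespec start, end;

    for (int i = 0; i < warmup; i++)
        process_batch();

    clock_gettime(CLOCK_MONOTONIC, &start);
    for (int i = 0; i < runs; i++)
        process_batch();
    clock_gettime(CLOCK_MONOTONIC, &end);

    double secs = (end.tv_sec - start.tv_sec)
                + (end.tv_nsec - start.tv_nsec) / 1e9;
    printf("average per run: %.6f s over %d runs\n", secs / runs, runs);
    return 0;
}
```

Repeating the measurement and averaging (or better, looking at the spread) is what absorbs the moment-to-moment variation listed above; a single cold run mostly measures whatever state the machine happened to be in at that instant.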