What would be the best and most accurate way to determine how long it took to process a routine, such as a procedure or function?
I ask because I am currently trying to optimize a few functions.
It is natural to think that measuring is how you find out what to optimize, but there's a better way.
If something takes a large enough fraction of the time (call it F) to be worth optimizing, then F is also the probability that any single random pause catches it in the act. Pause the program several times under a debugger and read the call stack each time, and you will see precisely what it is spending that time on, down to the exact lines of code.
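To make the statistics concrete, here is a small sketch (Python; the fraction F = 0.3 and the pause counts are illustrative assumptions, not numbers from the answer) computing how likely a handful of pauses is to catch a hot spot:

```python
# A back-of-the-envelope check of the claim above, assuming the hot
# spot costs fraction F of total time, so each random pause is an
# independent Bernoulli(F) chance of landing in it.
from math import comb

def p_at_least(k: int, n: int, f: float) -> float:
    """Probability that at least k of n random pauses hit the hot spot."""
    return sum(comb(n, i) * f**i * (1 - f) ** (n - i) for i in range(k, n + 1))

# Illustrative numbers: a 30% hot spot shows up on two or more of
# ten pauses about 85% of the time.
print(f"{p_at_least(2, 10, 0.30):.1%}")  # -> 85.1%
```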
Here's an example.
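As a rough automated stand-in for pausing by hand, the sketch below (Python, POSIX-only since it relies on SIGALRM; the workload and pause intervals are assumptions) fires a timer at random moments and dumps the stack, which is the same information a debugger pause gives you:

```python
# A minimal sketch of random pausing, assuming a POSIX system
# (signal.setitimer and SIGALRM are Unix-only). With a native program
# you would simply hit "pause" in a debugger and read the call stack.
import random
import signal
import traceback

def dump_stack(signum, frame):
    print("--- paused at ---")
    traceback.print_stack(frame)  # the lines being executed right now
    # schedule the next random pause
    signal.setitimer(signal.ITIMER_REAL, random.uniform(0.2, 0.8))

signal.signal(signal.SIGALRM, dump_stack)
signal.setitimer(signal.ITIMER_REAL, random.uniform(0.2, 0.8))

def busy(n):          # hypothetical hot spot standing in for your program
    total = 0
    for i in range(n):
        total += i * i  # most stack dumps will land on this line
    return total

busy(50_000_000)      # runs for a few seconds, long enough for several pauses
```

The tooling is beside the point: any way of stopping the program and reading the stack works, which is why a plain pause in the debugger is enough.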
Fix it, and then do an overall measurement to see how much you saved, which should be about F. Rinse and repeat.
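That overall measurement can be as simple as a high-resolution timer around the whole run; a minimal sketch in Python (the workload and the F = 0.3 in the comment are assumptions), with the arithmetic for the expected saving:

```python
# A minimal sketch of the before/after overall measurement, assuming
# Python; time.perf_counter is a monotonic, high-resolution clock.
import time

def run_workload():   # hypothetical program being tuned
    return sum(i * i for i in range(20_000_000))

start = time.perf_counter()
run_workload()
total = time.perf_counter() - start
print(f"overall: {total:.3f} s")

# If the fix removed a hot spot costing fraction F of the old total,
# the new total should be about (1 - F) * old, i.e. a 1 / (1 - F)
# speedup: with the illustrative F = 0.3, roughly 1.43x faster.
```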