I have a gnarly piece of code whose time efficiency I would like to measure. Since estimating this complexity from the code itself is hard, I want to place it in a loop and time it.
First things first: I don't know of an accepted, "scientific" way to scale repetitions and problem size so as to get accurate time-vs-size plots more quickly, so I cannot say anything on that point.
Other than that, for a better estimate of the time complexity I would suggest measuring the average execution time for a fixed input size and comparing it with the average execution time measured in the previous cycle. After that, increase the size of the input data and repeat the measurement.
This is similar to one of the techniques used in numerical analysis to estimate the error of a numerical method: you simply adapt it to estimate the average error in the measured execution time of your algorithm's implementation.
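For the per-cycle measurement, something along these lines should do. This is a minimal Python sketch: `gnarly_function` is a hypothetical placeholder for your own code, and `timeit` is just one convenient way to average over repetitions.

```python
import timeit

def average_time(func, data, repetitions=10):
    """Mean wall-clock time of a single call to func(data),
    averaged over the given number of repetitions."""
    total = timeit.timeit(lambda: func(data), number=repetitions)
    return total / repetitions

# Example: time a hypothetical gnarly_function on one fixed input size.
# avg = average_time(gnarly_function, make_input(10_000))
```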
So, to cut it short:

1. Pick a starting input size and measure the average execution time over several repetitions.
2. Increase the input size and measure the average execution time again.
3. Compare the new average with the one from the previous cycle, and repeat until the trend is clear enough to read off the growth rate (see the sketch below).
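Put together, the whole procedure might look roughly like this. It is a hedged sketch, not a recipe: `gnarly_function` and `make_input` are hypothetical placeholders for your code and input generator, the doubling of the size and the number of cycles are arbitrary choices, and the exponent estimate `log2(ratio)` only makes sense if the running time is roughly polynomial in the input size.

```python
import math
import timeit

def average_time(func, data, repetitions=10):
    """Mean wall-clock time of one call to func(data)."""
    return timeit.timeit(lambda: func(data), number=repetitions) / repetitions

def estimate_growth(func, make_input, start_size=1_000, cycles=8):
    """Double the input size each cycle and compare successive averages.

    If the running time behaves like C * n**p, the ratio of two
    successive averages is about 2**p, so log2(ratio) estimates p.
    """
    size = start_size
    previous = average_time(func, make_input(size))
    for _ in range(cycles):
        size *= 2
        current = average_time(func, make_input(size))
        ratio = current / previous
        print(f"n={size:>10}  avg={current:.6f}s  "
              f"ratio={ratio:.2f}  ~n^{math.log2(ratio):.2f}")
        previous = current

# Example (hypothetical): estimate_growth(gnarly_function, make_input)
```

The "compare with the previous cycle" step is the ratio computation: once the reported exponent settles down from one cycle to the next, you have a reasonable empirical estimate of the growth rate.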
Let me know if something is unclear.