I am using an IPython 6.2.1 integration with Eclipse/PyDev on Ubuntu. Python version is 3.5.2.
I often see people timing a script like
>>> %timeit for _ in range(1000): True
10000 loops, best of 3: 37.8 µs per loop
When I perform the same operation, my output is
>>> %timeit for _ in range(1000): True
20.8 µs ± 353 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)
Imho, "best of 3" is the better measure, so I would like to change my output.
I read both the IPython and the Python timeit documentation; neither even mentions that the output could differ from "best of 3". Is this a question of the Linux/Eclipse/PyDev implementation, or is there a way to change the output of the timeit module?
P.S.: The same happens in the Eclipse console when I use timeit, so IPython is probably irrelevant here.
>>>timeit '"-".join(str(n) for n in range(100))'
11 ns ± 0.0667 ns per loop (mean ± std. dev. of 7 runs, 100000000 loops each)
Unutbu pointed out that you can achieve the desired behaviour from within a program: running the first script here with timeit.main() indeed returns the best of 3. But I would prefer a version that I can run interactively in the Eclipse console.
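For an interactive stand-in (my own sketch, not from the thread), the plain timeit module can reproduce a "best of 3" figure directly, since that number is just the minimum over several repeated measurements:

```python
import timeit

# "best of 3" is the minimum of 3 repeated measurements;
# each measurement runs the statement `number` times.
number = 10000
runs = timeit.repeat('for _ in range(1000): True', repeat=3, number=number)
best = min(runs) / number  # per-loop time of the best run

print('%d loops, best of 3: %.3g usec per loop' % (number, best * 1e6))
```

Taking the minimum is exactly what Python 2's timeit CLI reported as "best of 3"; the mean ± std. dev. that newer IPython prints is computed from the same kind of repeated runs.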
The display is generated by a TimeitResult object (see the timeit?? code listing). With the -o option I get that object, which I can examine:
In [95]: %timeit -o np.einsum('bdc,ac->ab', a, b, optimize=False)
170 µs ± 27.5 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)
Out[95]: <TimeitResult : 170 µs ± 27.5 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)>
In [96]: res = _
In [97]: print(res)
170 µs ± 27.5 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)
In [98]: vars(res)
Out[98]:
{'_precision': 3,
'all_runs': [1.6981208299985155,
1.6976570829865523,
1.6978941220149864,
1.6976559630129486,
1.6976608499826398,
1.697147795028286,
1.6977746890042908],
'best': 0.0001697147795028286,
'compile_time': 0.0,
'loops': 10000,
'repeat': 7,
'timings': [0.00016981208299985155,
0.00016976570829865524,
0.00016978941220149863,
0.00016976559630129485,
0.00016976608499826398,
0.0001697147795028286,
0.0001697774689004291],
'worst': 0.00016981208299985155}
It looks like the information needed to generate the best of 3 display is still there, but the formatter is gone. It might be found in an older IPython version.
@property
def average(self):
return math.fsum(self.timings) / len(self.timings)
Code is in IPython/core/magics/execution.py
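Since the result object still stores timings, loops, and repeat, the old-style line can be rebuilt by hand. A minimal sketch (format_best_of is my own helper, not an IPython API), fed with numbers similar to the dump above:

```python
def format_best_of(timings, loops, precision=3):
    """Rebuild a "best of N" line from the per-loop timings
    that a %timeit -o result stores in its .timings attribute."""
    best = min(timings)  # best run = smallest per-loop time
    return '%d loops, best of %d: %.*g usec per loop' % (
        loops, len(timings), precision, best * 1e6)

# With a %timeit -o result: print(format_best_of(res.timings, res.loops))
print(format_best_of([0.00016981, 0.00016977, 0.00016971], 10000))
# -> 10000 loops, best of 3: 170 usec per loop
```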
The mean ± std. dev. output was added to IPython on 5 Oct 2016; see pull request #9984 (https://github.com/ipython/ipython/pull/9984), titled "Added mean + stdv to the %timeit magic - New tests pending". The "best of 3" output, on the other hand, comes from the Python timeit module's main() function, and that is essentially the same in Python 2 and 3.
Source: https://stackoverflow.com/questions/48733545/ipython-timeit-magic-changing-output-from-mean-std-to-best-of-3