Question
I'm not advocating that this would ever be a good idea, but I've found that you can crash Python (2.7 and 3.2 checked) by running eval on a large enough input string:
def kill_python(N):
    S = '+'.join(str(n) for n in xrange(N))  # use range() instead on Python 3
    return eval(S)
On my computer S can be generated just fine, but for values of approximately N > 74900, Python fails with Segmentation fault (core dumped). Is there a limit to the length of string (or parse tree) that the interpreter can handle?
Note: I don't need to do this; to me it's a deeper question reflecting my ignorance of what goes on inside the box. I'd like to understand why Python fails here, and why it fails so catastrophically (why not throw an exception?).
Answer 1:
This issue is caused by a stack overflow in the CPython compiler. An easy way to reproduce the same issue is
>>> code = compile("1" + "+1" * 1000000, "", "eval")
Segmentation fault
which proves that the segfault is happening at the compile stage, not during evaluation. (Of course this is also easy to confirm with gdb.)
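A quick way to confirm this without killing your own session is to run the compile step in a throwaway child process and check whether it died from a signal. This is just a sketch (it assumes a Unix-like platform, where a negative returncode means the child was killed by a signal, and Python 3.5+ for subprocess.run):
import subprocess
import sys

def compile_crashes(n):
    # Compile the same "1+1+...+1" chain in a child process; if the
    # compiler segfaults, it takes the child down, not us.
    prog = 'compile("1" + "+1" * {}, "", "eval")'.format(n)
    result = subprocess.run([sys.executable, "-c", prog])
    # On POSIX, a negative return code means killed by a signal
    # (-11 is SIGSEGV).
    return result.returncode < 0

print(compile_crashes(1000))     # False: a short chain compiles fine
print(compile_crashes(1000000))  # True on the CPython versions discussed here
Bisecting between a passing and a failing n is then an easy way to measure the exact threshold for a given build and stack size.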
[Side note: For smaller expressions, the compiler would apply constant folding here anyway, so the only thing happening during the execution of the code is to load the result:
>>> code = compile("1" + "+1" * 1000, "", "eval")
>>> eval(code)
1001
>>> import dis
>>> dis.dis(code)
  1           0 LOAD_CONST            1000 (1001)
              3 RETURN_VALUE
End of side note.]
This issue is a known defect. The Python developers collected several ways to crash the Python interpreter in the directory Lib/test/crashers of the source distribution. The one corresponding to this issue is Lib/test/crashers/compiler_recursion.py.
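Since the root cause is the C stack running out during the compiler's recursion over the deeply nested AST, one workaround is to do the compilation in a thread that has a larger stack. A hedged sketch, assuming your platform honours threading.stack_size, and with a stack size that is a guess rather than a tuned value:
import threading

def compile_big(n):
    # "1" + "+1" * n parses into a chain of n nested BinOp nodes,
    # which the compiler walks recursively.
    code = compile("1" + "+1" * n, "", "eval")
    print("compiled a chain of", n + 1, "terms:", eval(code))

# Request a much larger C stack for the worker thread; this must be
# set before the thread is started.
threading.stack_size(256 * 1024 * 1024)
t = threading.Thread(target=compile_big, args=(100000,))
t.start()
t.join()
This is a workaround rather than a fix: on these versions the compiler has no depth check, so a long enough expression will overflow any stack you give it.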
Source: https://stackoverflow.com/questions/11635211/why-is-there-a-length-limit-to-pythons-eval