I am running an embedded application on an ARM9 board where the total flash size is only 180 MB. I am able to run GDB, but when I do

(gdb) generate-core-file

I get these warnings:
warning: Memory read failed for corefile section, 1048576 bytes at 0x4156c000.
warning: Memory read failed for corefile section, 1048576 bytes at 0x50c00000.
Saved corefile core.5546
The program is running. Quit anyway (and detach it)? (y or n) [answered Y; input not from terminal]
Tamper Detected
**********OUTSIDE ifelse 0*********
length validation is failed
I also set ulimit -c 50000, but the core dump still exceeds this limit: when I check the file size with ls -l, it is over 300 MB. How should I limit the size of the core dump in this case?
GDB does not respect 'ulimit -c'; only the kernel does. That limit (RLIMIT_CORE) is applied by the kernel when it writes a core dump after a crash; GDB's generate-core-file reads the target's memory and writes the file itself, so the limit never comes into play.
It's not clear whether you are running GDB on the target board or on a development host (using gdbserver on the target). You should probably use the latter, which lets you collect a full core dump on the host, where storage is less scarce.
Truncated core dumps are a pain anyway, as they often will not contain exactly the info you need to debug the problem.
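A remote setup along those lines might look like the following sketch; the port, host IP address, process ID, cross-GDB name, and binary name are all placeholders for your own values:

```
# on the ARM9 target: attach gdbserver to the already-running process
gdbserver --attach :2345 5546

# on the development host: connect a cross-GDB and write the core file there
arm-none-linux-gnueabi-gdb ./myapp
(gdb) target remote 192.168.1.10:2345
(gdb) generate-core-file core.myapp
```

The core file then lands on the host filesystem, so the 180 MB flash on the target is never touched.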
In your shell rc-file (this is csh/tcsh syntax; in bash, use the ulimit builtin instead):

limit coredumpsize 50000 # or whatever limit size you like

That should set the limit for everything started from that shell (though note that GDB's generate-core-file ignores it, as explained in the other answer).
Note: if you set it to 0, you can make sure your home directory is not cluttered with core dump files.
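The bash equivalent (a sketch, assuming bash is your login shell) goes in ~/.bashrc and uses the ulimit builtin:

```shell
# bash equivalent of the csh/tcsh `limit coredumpsize` line;
# note that `ulimit -c` counts in blocks, not bytes (see `help ulimit`)
ulimit -S -c 50000   # cap kernel-written core files
# or, to suppress core files entirely:
# ulimit -S -c 0
```

The -S flag sets only the soft limit, so it can be raised again later in the same session without root privileges.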
When did you run ulimit -c? It must be run before starting the program for which you want the core dump, and in the same shell session, since the limit is inherited by the processes that shell starts.
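A quick way to sanity-check both points (same session, set before launch) is to confirm that a child process inherits the limit:

```shell
# set the soft core-file limit in the current shell...
ulimit -S -c 0
# ...then confirm that any child process started afterwards inherits it:
sh -c 'ulimit -c'    # prints 0
```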
Source: https://stackoverflow.com/questions/7737465/how-to-limit-the-size-of-core-dump-file-when-generating-it-using-gdb