I have a process that is logging messages to a file.
I want to implement another process (in Python) that parses these logs as they are written to the file, filters the lines of interest, and takes appropriate actions.
Thanks everyone for the answers. I found this as well. http://www.dabeaz.com/generators/follow.py
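That script wraps the same idea in a generator. A minimal sketch in that spirit (not a verbatim copy of follow.py) looks like this:

import time

def follow(f):
    # start at the end of the file, like tail -f
    f.seek(0, 2)
    while True:
        line = f.readline()
        if not line:
            time.sleep(0.1)    # no new data yet; wait and retry
            continue
        yield line

# usage sketch:
# for line in follow(open('some.log')):
#     process(line)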
C programs usually seek to the current position to clear any "end of file" flags. But as @9000 correctly pointed out, Python apparently takes care of this, so you can keep reading from the same file even after it has reached end of file.
You might have to take care of incomplete lines, though. If your application writes its log in pieces, you want to make sure you handle whole lines rather than the individual pieces. The following code accomplishes that:
import time

f = open('some.log', 'r')
while True:
    line = ''
    # keep reading until we have a complete line ending in '\n'
    while len(line) == 0 or line[-1] != '\n':
        tail = f.readline()
        if tail == '':
            time.sleep(0.1)            # avoid busy waiting
            # f.seek(0, io.SEEK_CUR)   # appears to be unnecessary
            continue
        line += tail
    process(line)                      # your handler for a complete line
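Here process is just a placeholder for whatever you want to do with a complete line. A minimal hypothetical version (the ERROR filter is only an example, not something from the question) could be:

def process(line):
    # hypothetical filter: only react to lines that mention ERROR
    if 'ERROR' in line:
        print('matched:', line.strip())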
No need to run tail -f. Plain Python files should work:
import time

with open('/tmp/track-this') as f:
    while True:
        line = f.readline()
        if line:
            print(line, end='')    # the line already ends with '\n'
        else:
            time.sleep(0.1)        # avoid busy waiting while no new data arrives
This thing works almost exactly like tail -f. Check it by running in another terminal:
echo "more" >> /tmp/track-this
# alt-tab here to the terminal with Python and see 'more' printed
echo "even more" >> /tmp/track-this
Don't forget to create /tmp/track-this (e.g. with touch) before you run the Python snippet.
Parsing and taking appropriate actions are up to you. Long-running actions should probably be handled in separate threads or processes, as in the sketch below.
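As a rough illustration of that last point, here is a minimal sketch that hands each new line to a thread pool so slow handlers don't stall the read loop (the path, the handler, and the pool size are assumptions, not taken from the answers above):

import time
from concurrent.futures import ThreadPoolExecutor

def slow_action(line):
    # hypothetical long-running handler (placeholder)
    time.sleep(1)
    print('handled:', line.strip())

with ThreadPoolExecutor(max_workers=4) as pool, open('/tmp/track-this') as f:
    while True:
        line = f.readline()
        if line:
            pool.submit(slow_action, line)   # don't block the tailing loop
        else:
            time.sleep(0.1)                  # wait for new data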
The stop condition is also up to you, but a plain ^C works.