I have ~200 short text files (~50 KB each) that all have a similar format. I want to find a line in each of those files that contains a certain string and then write that line plus
Writing line by line can be slow when working with large data. You can accelerate the read/write operations by reading/writing a bunch of lines at once.
from itertools import islice  # the function is islice, not slice
f1 = open('Test.txt')
f2 = open('Output.txt', 'a')
bunch = 500  # number of lines to read/write at a time
lines = list(islice(f1, bunch))  # read up to `bunch` lines in one go
f2.writelines(lines)  # write them all at once
f1.close()
f2.close()
If your lines are very long, holding 500 of them in a list at once may use too much memory, depending on your system. In that case, reduce the bunch size and repeat the read/write step as many times as needed to process the whole file.
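For the full task in your question (scanning ~200 files and keeping only the lines that contain a given string), you can wrap the same batched approach in a loop over the files. This is just a sketch under assumptions: search_string, the 'data/*.txt' glob pattern, and the 'Output.txt' filename are placeholders you would replace with your own values.

from itertools import islice
import glob

search_string = 'ERROR'  # placeholder: the string you are looking for
bunch = 500              # lines per chunk; lower this if your lines are very long

with open('Output.txt', 'a') as f2:
    for path in glob.glob('data/*.txt'):  # placeholder: pattern matching your ~200 files
        with open(path) as f1:
            while True:
                lines = list(islice(f1, bunch))
                if not lines:
                    break  # end of this file
                # keep only the lines that contain the search string
                f2.writelines(line for line in lines if search_string in line)

The with statements close each file automatically, so the explicit close() calls from the snippet above are no longer needed.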