I have a python script that continuously stores tweets related to tracked keywords to a file. However, the script tends to crash repeatedly due to an error appended below.
One option would be to use the `multiprocessing` module: run the stream writer in a child process, give it a time limit, and terminate/restart it when the limit expires. I'd argue for it for two reasons: a crash in the writer can't take down the main script, and you get a save point at a known interval.
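Stripped of the Twitter-specific parts, the pattern is just `Process` + `join(timeout)` + `terminate`. A minimal, self-contained Python 3 sketch (`stream_worker` is a stand-in for the real tweet writer):

```python
import multiprocessing
import time

def stream_worker(path):
    # stand-in for the real stream writer: runs until killed
    while True:
        time.sleep(1)

if __name__ == '__main__':
    p = multiprocessing.Process(target=stream_worker, args=("log.txt",))
    p.start()
    p.join(3)            # wait up to 3 seconds (the real script uses 1800)
    if p.is_alive():     # still running -> time limit hit, so kill it
        p.terminate()
        p.join()         # reap the terminated process
```

`join(timeout)` returns when either the process exits or the timeout elapses, so the `is_alive()` check is what distinguishes the two cases.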
I have taken a different approach entirely, but that is partly because I am saving my tweets at regular (or supposedly regular) intervals. @Eugeune Yan, I think the try/except is a simple and elegant way to deal with the problem. Although (and hopefully someone will have a comment on this) with that method you don't really know when or why it failed; I'm not sure that really matters, and it would be easy to write a few lines to record it.
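For the "you don't know when it failed" concern, a few lines really do cover it. A hedged Python 3 sketch (`run_stream` and `collect` are hypothetical names, not part of the original script) that restarts the stream and records a timestamped traceback for every crash:

```python
import time
import traceback

def run_stream(collect, max_restarts=5, backoff=0):
    # Keep restarting `collect` (a stand-in for the tweet-streaming call),
    # recording when and why each crash happened.
    failures = []
    for attempt in range(max_restarts):
        try:
            collect()
            return failures              # clean exit: hand back the crash history
        except Exception:
            failures.append((time.strftime("%Y-%m-%d %H:%M:%S"),
                             traceback.format_exc()))
            time.sleep(backoff)          # optional pause before reconnecting
    return failures                      # gave up after max_restarts crashes
```

Dumping `failures` to a file (or swapping the list for the `logging` module) would give a permanent record of exactly when the stream died.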
```python
import tiipWriter            # Twitter & text-file writer I wrote with Tweepy
from add import ThatGuy      # utility to supply log file names that won't overwrite old ones
import multiprocessing

if __name__ == '__main__':
    n = 60                   # number of time increments the script needs to run
    log_dir = "C:\\Temp\\stufffolder\\twiitlog"
    logs = []                # renamed from `list`/`dir` so builtins aren't shadowed
    print "preloading logs"
    ThatGuy(n, log_dir, logs)  # finds any existing logs in the folder and one-ups them
    for a in logs:
        print "Collecting Tweets....."
        # run the twitter/text-file writer in its own process
        p = multiprocessing.Process(target=tiipWriter.tiipWriter, args=(a,))
        p.start()
        p.join(1800)         # seconds the process is allowed to run
        if p.is_alive():     # still streaming after the timeout: save and kill it
            print "\n Saving Twitter Stream log @ " + str(a)
            p.terminate()
            p.join()
        f = open(a, 'r')     # separate name, so the loop variable keeps the path
        f.close()
        if f.closed:
            print "File successfully closed"
    print "jamaica"          # cuz why not
```