I'm having some memory issues while using a Python script to issue a large Solr query. I'm using the solrpy library to interface with the Solr server.
I've implemented hruske's advice of "fork and die": I use os.fork() to run the memory-intensive section of code in a child process, then let the child exit so the OS reclaims all the memory it allocated. The parent process calls os.waitpid() on the child so that only one process is executing at a given time.
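For anyone curious what that looks like, here's a minimal sketch of the pattern (the `run_memory_intensive_query` function is just a hypothetical stand-in for my solrpy call, not actual library API):

    import os

    def run_memory_intensive_query():
        # Hypothetical placeholder for the solrpy query that bloats memory.
        pass

    pid = os.fork()
    if pid == 0:
        # Child process: do the memory-heavy work, then exit so the OS
        # reclaims everything the child allocated.
        try:
            run_memory_intensive_query()
        finally:
            # os._exit() terminates the child immediately without running
            # cleanup handlers; sys.exit() would raise SystemExit instead.
            os._exit(0)
    else:
        # Parent process: block until the child finishes so only one
        # process is doing the query at a time.
        os.waitpid(pid, 0)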
If anyone sees any pitfalls with this solution, please chime in.