I'm writing a program. It works fine, but when it loads the database (a 100 MB text file) into a list, its memory usage grows to 700-800 MB.
Code used to load the file:
You should iterate over the file object directly instead of reading the whole file at once; that way only one line is in memory at a time. You can then process the database list in chunks rather than holding everything together. For example:
MAX = 10000  # chunk size; tune to taste

results = []
database = []
for line in open("database/db.hdb"):
    fields = line.rstrip('\n').split(':')
    database.append(fields)
    if len(database) >= MAX:
        # process the chunk collected so far into one result
        results.append(process_database(database))
        database = []
# don't forget the final, partial chunk
if database:
    results.append(process_database(database))
# now combine the individual results into one
combine_results(results)
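If you prefer, the chunking logic can be pulled out into a reusable generator so the processing loop stays simple. This is just a sketch of the same idea; `read_chunks`, the path, and the chunk size are names I've made up for illustration:

```python
from itertools import islice

def read_chunks(path, chunk_size):
    """Yield lists of split lines, at most chunk_size lines per list.

    Only one chunk is ever held in memory at a time.
    """
    with open(path) as f:
        while True:
            chunk = [line.rstrip('\n').split(':')
                     for line in islice(f, chunk_size)]
            if not chunk:
                return
            yield chunk
```

Then the main loop becomes:

    results = [process_database(chunk)
               for chunk in read_chunks("database/db.hdb", 10000)]
    combine_results(results)
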