How do I decrease the memory used by a large list in python

别跟我提以往 2021-01-15 01:33

I'm writing a program. It works fine, but when it loads the database (a 100MB text file) into a list, its memory usage grows to 700-800MB.

Code used to load the file:
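The loading code is roughly of this shape (only a sketch; the path "database/db.hdb" and the ':' field separator are assumptions taken from the answer below, not stated in the question):

    database = []
    for line in open("database/db.hdb"):      # path assumed from the answer below
        database.append(line.split(':'))      # keeps every field of every line in memory at once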

2 Answers
  •  情话喂你
    2021-01-15 02:09

    You should use the file object as an iterator, so lines are read one at a time instead of the whole file being loaded into memory. You could then process the database list in chunks rather than all at once. For example:

    results = []
    database = []
    with open("database/db.hdb") as db_file:
        for line in db_file:
            fields = line.split(':')
            database.append(fields)
            # process the database in chunks of MAX rows rather than all at once
            if len(database) >= MAX:
                # do something with the chunk collected so far to get one result
                results.append(process_database(database))
                database = []
    if database:
        # handle the final chunk, which may hold fewer than MAX rows
        results.append(process_database(database))
    # do something now with the individual results to make one combined result
    combine_results(results)
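
    A variant of the same idea is to wrap the chunking in a generator, so the rest of the program never holds more than one chunk in memory at a time. This is only a sketch under the same assumptions: the chunk size 10000 is just an example value, and process_database / combine_results are the same placeholders as above.

    def read_chunks(path, chunk_size):
        # yield the file's lines, split on ':', in lists of at most chunk_size rows,
        # so only one chunk is resident in memory at any time
        chunk = []
        with open(path) as f:
            for line in f:
                chunk.append(line.split(':'))
                if len(chunk) >= chunk_size:
                    yield chunk
                    chunk = []
        if chunk:
            yield chunk

    results = [process_database(chunk) for chunk in read_chunks("database/db.hdb", 10000)]
    combine_results(results)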
    
