Split huge (95 MB) JSON array into smaller chunks?

不知归路 2021-01-18 06:04

I exported some data from my database in the form of JSON, which is essentially just one [list] with a bunch (900K) of {objects} inside it.

Trying to import it on my …

4 Answers
  •  情歌与酒
    2021-01-18 06:40

    I turned phihag's and mark's work into a tiny script (gist), also copied below:

    #!/usr/bin/env python3
    # based on http://stackoverflow.com/questions/7052947/split-95mb-json-array-into-smaller-chunks
    # usage: python json-split filename.json
    # produces multiple output files (filename.json_0.json, filename.json_1.json, ...)
    # of about 1.49 MB each

    import json
    import sys

    with open(sys.argv[1], 'r') as infile:
        o = json.load(infile)  # loads the entire array into memory
        chunkSize = 4550       # number of objects per output file
        # range (Python 3; replaces Python 2's xrange) steps through the array in chunkSize slices
        for i in range(0, len(o), chunkSize):
            with open(sys.argv[1] + '_' + str(i // chunkSize) + '.json', 'w') as outfile:
                json.dump(o[i:i + chunkSize], outfile)
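
    One caveat with the script above: json.load pulls the whole 95 MB array into memory at once. If that is a problem, below is a minimal streaming sketch (my addition, not part of the original answer) that assumes the third-party ijson package is installed; it walks the top-level array one object at a time and writes a file every chunkSize objects, so memory use stays at roughly one chunk:

    #!/usr/bin/env python3
    # Streaming variant (a sketch, assuming ijson is available: pip install ijson).
    # Parses the top-level JSON array incrementally instead of via json.load,
    # keeping at most one chunk of objects in memory at a time.
    # usage: python json-split-stream.py filename.json

    import json
    import sys

    import ijson  # third-party streaming JSON parser

    chunkSize = 4550  # objects per output file, matching the script above

    def write_chunk(path, index, chunk):
        # mirrors the original naming scheme: filename.json_0.json, _1.json, ...
        with open(path + '_' + str(index) + '.json', 'w') as outfile:
            # ijson yields Decimal for non-integer numbers; default=float
            # converts them so the stdlib json module can serialize them
            json.dump(chunk, outfile, default=float)

    with open(sys.argv[1], 'rb') as infile:
        chunk, index = [], 0
        # the 'item' prefix yields each element of the top-level array in turn
        for obj in ijson.items(infile, 'item'):
            chunk.append(obj)
            if len(chunk) == chunkSize:
                write_chunk(sys.argv[1], index, chunk)
                chunk, index = [], index + 1
        if chunk:  # flush the final partial chunk
            write_chunk(sys.argv[1], index, chunk)

    The trade-off is an extra dependency and somewhat slower parsing; for a one-off split of a file this size, the original script is usually fine.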
    
