Efficient way of reading large txt file in python

Backend · Unresolved · 3 answers · 1984 views

隐瞒了意图╮ 2021-01-27 06:35

I'm trying to open a txt file with 4605227 rows (305 MB).

The way I have done this before is:

import numpy as np

data = np.loadtxt('file.txt', delimiter='\t', dtype=str)

3 Answers
  •  庸人自扰
    2021-01-27 07:23

    Rather than reading it in with numpy, you could read it directly into a Pandas DataFrame, e.g. using the pandas.read_csv function:

    import pandas as pd

    df = pd.read_csv('file.txt', delimiter='\t', usecols=["a", "b", "c", "d", "e", "f", "g", "h", "i"])
    
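    If the file is too large to hold in memory all at once, pandas.read_csv can also process it in pieces via the chunksize parameter. A minimal sketch (the column names and data below are made up; an in-memory string stands in for the large file):

    import io
    import pandas as pd

    # Stand-in for a large tab-separated file on disk.
    text = "a\tb\n1\t2\n3\t4\n5\t6\n"

    total = 0
    # chunksize=2 yields DataFrames of at most 2 rows each,
    # so only a small slice of the file is in memory at a time.
    for chunk in pd.read_csv(io.StringIO(text), delimiter="\t", chunksize=2):
        total += chunk["a"].sum()

    print(total)  # 1 + 3 + 5 = 9

    The same loop works with a real file path in place of the StringIO object; aggregate or filter each chunk and keep only the reduced result.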
