Python: How to progressively load one large parquet file from the hard disk for tensorflow-autoencoder batch training on a local machine?

Happy的楠姐 2021-01-04 02:03

I have a parquet file of size 2 GB containing 700 million rows. I would like to progressively load the parquet file from the hard disk and feed the rows to a tensorflow-autoencoder in batches for training.
