Spark to read a big file as an InputStream

Submitted by 拥有回忆 on 2019-12-08 06:19:27

Question


I know Spark's built-in textFile method can partition a huge file, read it in chunks, and distribute it as an RDD. However, I am reading the file from a custom encrypted filesystem that Spark does not support natively. One approach I can think of is to read an InputStream instead, load a batch of lines at a time, and distribute each batch to the executors, continuing until the whole file is consumed, so that no executor blows up with an out-of-memory error. Is it possible to do this in Spark?
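A minimal sketch of that idea in Scala, assuming a hypothetical decryptStream(path) helper that opens a java.io.InputStream over the encrypted filesystem (this helper is not part of Spark; substitute whatever your filesystem client provides). Lines are read in fixed-size batches on the driver, each batch is parallelized out to the executors, and the partial RDDs are unioned together:

import java.io.{BufferedReader, InputStream, InputStreamReader}
import scala.collection.mutable.ArrayBuffer
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.SparkSession

// Hypothetical helper: opens an InputStream over the custom encrypted
// filesystem. Not a Spark API -- replace with your filesystem client.
def decryptStream(path: String): InputStream = ???

val spark = SparkSession.builder().appName("chunked-load").getOrCreate()
val sc = spark.sparkContext

val batchSize = 100000 // lines per batch; tune to available driver memory
val reader = new BufferedReader(new InputStreamReader(decryptStream("/data/big.enc")))

var rdd: RDD[String] = sc.emptyRDD[String]
val batch = new ArrayBuffer[String](batchSize)
var line = reader.readLine()
while (line != null) {
  batch += line
  if (batch.size == batchSize) {
    rdd = rdd.union(sc.parallelize(batch.toSeq)) // ship this batch to executors
    batch.clear()
  }
  line = reader.readLine()
}
if (batch.nonEmpty) rdd = rdd.union(sc.parallelize(batch.toSeq))
reader.close()

println(rdd.count()) // force evaluation
spark.stop()

One caveat: because Spark evaluates lazily, each sc.parallelize batch stays referenced in driver memory until an action runs, so for a truly huge file it is safer to process or persist each batch eagerly (for example, write it out to a store Spark can read) rather than unioning everything first.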


Answer 1:


You can try lines.take(n) for different values of n to find the limit of your cluster, or use the streaming file source, which reads the input incrementally:

spark.readStream.option("sep", ";").schema(schema).csv("filepath.csv") // a streaming CSV source requires an explicit schema
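A fuller sketch of that streaming approach, assuming the encrypted data can first be decrypted into a directory Spark can read (the paths and schema below are placeholders, not from the original answer). The maxFilesPerTrigger option bounds how much data each micro-batch ingests, which keeps executor memory in check:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types._

val spark = SparkSession.builder().appName("stream-csv").getOrCreate()

// Streaming CSV sources require an explicit schema.
val schema = new StructType()
  .add("id", LongType)
  .add("value", StringType)

val lines = spark.readStream
  .option("sep", ";")
  .option("maxFilesPerTrigger", 1) // ingest one file per micro-batch
  .schema(schema)
  .csv("/data/decrypted/") // hypothetical directory of decrypted files

val query = lines.writeStream
  .format("parquet") // persist each micro-batch as it arrives
  .option("path", "/data/out/")
  .option("checkpointLocation", "/data/chk/")
  .start()

query.awaitTermination()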


Source: https://stackoverflow.com/questions/43023884/spark-to-read-a-big-file-as-inputstream
