Apache Spark: pyspark crash for large dataset

清歌不尽 2021-01-01 20:48

I am new to Spark. I have an input file with training data of size 4000x1800. When I try to train on this data (written in Python), I get the following error:

  1. 14/11/15 22:39:13

5 Answers
  •  梦谈多话
    2021-01-01 21:11

    1. One possibility is that an exception is being thrown inside parsePoint. Wrap the code in a try/except block and print out the exception.
    2. Check your --driver-memory parameter and increase it.
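    The first suggestion can be sketched as follows. This is a minimal, hypothetical example: the name `parsePoint` comes from the question, but its body and the line format (space-separated floats, label first) are assumptions, since the asker's code is not shown.

    ```python
    # Hypothetical parser: assumes space-separated floats with the label first.
    # The real parsePoint from the question may differ.
    def parse_point(line):
        values = [float(x) for x in line.split()]
        return (values[0], values[1:])  # (label, features)

    def safe_parse_point(line):
        """Wrap the parser so one bad row prints its exception
        instead of crashing the whole Spark job."""
        try:
            return parse_point(line)
        except Exception as exc:
            print(f"parsePoint failed on line {line!r}: {exc}")
            return None

    # Plain-Python demo; in Spark you would use
    # data.map(safe_parse_point).filter(lambda p: p is not None)
    lines = ["1 0.5 0.25", "not a number", "0 1.0 2.0"]
    points = [p for p in (safe_parse_point(l) for l in lines) if p is not None]
    ```

    With the bad row filtered out, the printed exception tells you which input line (and which parsing step) is actually failing.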
