java.lang.OutOfMemoryError: Unable to acquire 100 bytes of memory, got 0

眼角桃花 2020-12-15 06:00

I'm invoking PySpark with Spark 2.0 in local mode with the following command:

pyspark --executor-memory 4g --driver-memory 4g

The input da

5 answers
  •  误落风尘
    2020-12-15 06:20

    In my case the driver was smaller than the workers. The issue was resolved by making the driver larger.
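    A minimal sketch of this fix, assuming the same local-mode setup as the question: in local mode the executors run inside the driver JVM, so `--executor-memory` has little effect and the driver allocation is what actually limits memory. The `8g` value below is illustrative, not taken from the original post.

    ```shell
    # In local mode all work runs in the driver JVM, so raise the
    # driver allocation rather than the executor one.
    # 8g is an illustrative value; size it to your dataset.
    pyspark --driver-memory 8g

    # Equivalent persistent setting in conf/spark-defaults.conf:
    # spark.driver.memory  8g
    ```

    Note that `spark.driver.memory` must be set before the JVM starts (on the command line or in `spark-defaults.conf`); setting it from inside an already-running session has no effect.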
