How to set Spark MemoryStore size when running in IntelliJ Scala Console?

Submitted by ♀尐吖头ヾ on 2019-12-08 13:50:48

Question


I am running Spark code as a script in the IntelliJ IDEA (CE 2017.1) Scala Console on 64-bit Linux (Fedora 25). I set up the SparkContext at the start:

import org.apache.spark.{SparkConf, SparkContext}
val conf = new SparkConf().
  setAppName("RandomForest").
  setMaster("local[*]").
  set("spark.local.dir", "/spark-tmp").
  set("spark.driver.memory", "4g").
  set("spark.executor.memory", "4g")

val sc = new SparkContext(conf)

But the running SparkContext always starts with the same lines:

17/03/27 20:12:21 INFO SparkContext: Running Spark version 2.1.0

17/03/27 20:12:21 INFO MemoryStore: MemoryStore started with capacity 871.6 MB

17/03/27 20:12:21 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.65:38119 with 871.8 MB RAM, BlockManagerId(driver, 192.168.1.65, 38119, None)

And the Executors tab in the Spark web UI shows the same amount. Exporting _JAVA_OPTIONS="-Xms2g -Xmx4g" from the terminal before starting IntelliJ also had no effect here.


Answer 1:


The only way to increase the Spark MemoryStore, and consequently the Storage Memory shown in the Executors tab of the web UI, was to add -Xms2g -Xmx4g to the VM options directly in the IntelliJ Scala Console settings before starting the console.
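The likely reason the programmatic setting had no effect: in local mode the driver JVM (the IntelliJ Scala Console process) is already running by the time the SparkConf is built, so spark.driver.memory is read too late, and Spark simply sizes its unified memory region from the JVM's existing max heap. A minimal sketch of that sizing, assuming Spark 2.1's UnifiedMemoryManager defaults (300 MB reserved system memory, spark.memory.fraction = 0.6):

```scala
// Sketch of how Spark 2.1's unified memory manager derives the MemoryStore
// capacity from the JVM heap. Illustration only, not Spark's actual code;
// assumes the defaults: 300 MB reserved, spark.memory.fraction = 0.6.
object MemoryStoreCapacity {
  val ReservedMb     = 300L  // reserved system memory, in MB
  val MemoryFraction = 0.6   // default spark.memory.fraction

  // Unified (storage + execution) memory in MB, given the JVM max heap in MB
  def capacityMb(maxHeapMb: Long): Double =
    (maxHeapMb - ReservedMb) * MemoryFraction

  def main(args: Array[String]): Unit = {
    // With -Xmx4g, Runtime.getRuntime.maxMemory typically reports about
    // 3641 MB (one survivor space is excluded), which matches the log line:
    println(f"capacity with -Xmx4g heap ≈ ${capacityMb(3641)}%.1f MB")
    // And a default heap of ~1753 MB matches the original 871.8 MB figure:
    println(f"capacity with default heap ≈ ${capacityMb(1753)}%.1f MB")
  }
}
```

This also explains why exporting _JAVA_OPTIONS in a terminal did nothing: the console JVM is launched by IntelliJ, not by that shell, so only the VM options in the console's run settings reach the driver process.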

Now the info line prints:

17/03/27 20:12:21 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB

17/03/27 20:12:21 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.65:41997 with 2004.6 MB RAM, BlockManagerId(driver, 192.168.1.65, 41997, None)

and the Spark web UI Executors tab Storage Memory shows 2.1 GB.



Source: https://stackoverflow.com/questions/43054268/how-to-set-spark-memorystore-size-when-running-in-intellij-scala-console
