How to change memory per node for an Apache Spark worker

别跟我提以往 2020-12-02 18:50

I am configuring an Apache Spark cluster.

When I run the cluster with 1 master and 3 slaves, I see this on the master monitor page:

Memory
2.0 GB (51         
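
For context: in a standalone deployment, the memory a worker advertises on this page is capped by SPARK_WORKER_MEMORY in conf/spark-env.sh on each worker machine, while each application claims its share through spark.executor.memory. A minimal sketch, assuming standalone mode and an illustrative 2g value:

    # conf/spark-env.sh on every worker node (workers pick this up on restart)
    export SPARK_WORKER_MEMORY=2g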


        
5 Answers
  •  不知归路
    2020-12-02 19:10

    In my case, I use an IPython notebook server to connect to Spark, and I want to increase the memory for the executors.

    This is what I do:

    from pyspark import SparkContext
    from pyspark.conf import SparkConf
    
    # CLUSTER_URL is a placeholder for the master URL, e.g. "spark://<master-host>:7077"
    conf = SparkConf()
    conf.setMaster(CLUSTER_URL) \
        .setAppName('ipython-notebook') \
        .set("spark.executor.memory", "2g")  # memory requested for each executor
    
    sc = SparkContext(conf=conf)
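
    To confirm the setting was picked up, it can be read back from the running context (this sketch assumes the sc created just above):

    # Sanity check: read spark.executor.memory back from the live SparkContext
    print(sc.getConf().get("spark.executor.memory"))  # expected: '2g'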
    
