What is the right way to edit spark-env.sh before running spark-shell?

2020-12-30 15:41

I am running Spark on my local Windows machine, and I am able to start spark-shell successfully.

I want to edit the spark-env.sh file residing in the conf/ folder. What is the right way to edit it before running spark-shell?

2 Answers
  • 2020-12-30 15:57

    You have to use export to set any configuration variable in a *.sh file. So in spark-env.sh, use the following, for example:

    export SPARK_MASTER_IP=192.165.5.1
    export SPARK_EXECUTOR_MEMORY=2g
    #OR export SPARK_EXECUTOR_MEMORY=2G
    

    There is no need to put double quotes around the values.

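    As a fuller illustration, a minimal conf/spark-env.sh might look like the sketch below. The template that ships with Spark (conf/spark-env.sh.template) can be copied as a starting point; the IP address and memory sizes here are placeholder values, not recommendations:

    # conf/spark-env.sh -- minimal sketch; the address and sizes are placeholders
    export SPARK_MASTER_IP=192.165.5.1      # address the standalone master binds to
    export SPARK_EXECUTOR_MEMORY=2g         # memory allocated per executor
    export SPARK_WORKER_MEMORY=4g           # memory a standalone worker may allocate to executors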
  • 2020-12-30 16:02

    spark-env.sh is a regular bash script intended for Unix, so on a Windows installation it will never get picked up.

    On Windows, you'll need to have a spark-env.cmd file in the conf directory and use the following syntax instead:

    set SPARK_EXECUTOR_MEMORY=2G
    

    On Unix, the file will be called spark-env.sh and you will need to prepend each of your properties with export (e.g. export SPARK_EXECUTOR_MEMORY=2G).

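    For completeness, a minimal conf\spark-env.cmd on Windows might look like the sketch below; the memory sizes are placeholder values to adjust for your machine:

    rem conf\spark-env.cmd -- minimal sketch; the values are placeholders
    set SPARK_EXECUTOR_MEMORY=2G
    set SPARK_DRIVER_MEMORY=1G

    The launch scripts read this file each time spark-shell or spark-submit starts, so the next run picks the values up.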