Spark 1.4.1 Configuration and Source Code Reading
1. Create the config files from the templates:

       cd /opt/spark-1.4.1-bin-hadoop2.6/conf
       cp spark-env.sh.template spark-env.sh
       cp slaves.template slaves

2. Set Spark's own environment variables:

       vi spark-env.sh
       export JAVA_HOME=/opt/jdk1.7.0_75
       export SCALA_HOME=/opt/scala-2.11.6
       export HADOOP_CONF_DIR=/opt/hadoop-2.6.0/etc/hadoop
       # automatically clean temporary files in Spark's work directory, checking every half hour
       export SPARK_WORKER_DIR="/home/hadoop/spark/worker/"
       export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true -Dspark.worker.cleanup.interval=1800"

   Then edit the worker list, filling in the hostname of each node, one per line (a sample file is sketched after these steps):

       vi slaves

3. Add Spark to the system-wide environment:

       vi /etc/profile
       export SPARK_HOME=/opt/spark-1.4.1-bin-hadoop2.6
       export PATH=$SPARK_HOME/bin:$PATH

   Run source /etc/profile (or log in again) for the change to take effect.

4. Start the cluster (a quick verification follows below):

       cd ../sbin/
       ./start-all.sh
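As referenced in step 2, a minimal slaves file simply lists one worker hostname per line. The hostnames below are hypothetical placeholders for the cluster's actual nodes:

       # conf/slaves (hostnames here are examples only)
       spark-worker-01
       spark-worker-02
       spark-worker-03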
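A note on the cleanup settings in step 2: spark.worker.cleanup.interval (in seconds) only controls how often the worker scans its work directory; how old an application's data must be before deletion is governed by spark.worker.cleanup.appDataTtl, which defaults to 7 days. If finished application data should be removed sooner, the variable can be extended as below; the one-day TTL (86400 seconds) is an illustrative choice, not a recommendation:

       export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true -Dspark.worker.cleanup.interval=1800 -Dspark.worker.cleanup.appDataTtl=86400"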
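Once start-all.sh finishes, running jps on the master node should show a Master process, and each slave should show a Worker; the standalone master also serves a web UI on port 8080 by default. A quick smoke test from spark-shell, assuming the master's hostname is master (substitute the real one):

       ./bin/spark-shell --master spark://master:7077
       scala> sc.parallelize(1 to 100).sum()   // expect res0: Double = 5050.0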