Running Spark driver program in Docker container - no connection back from executor to the driver?

北恋 2020-12-30 13:17

UPDATE: The problem is resolved. The Docker image is here: docker-spark-submit

I run spark-submit with a fat jar inside a Docker container. My stand…

3 Answers
  •  我在风中等你
    2020-12-30 13:39

    My setup, with Docker and macOS:

    • Run Spark 1.6.3 master + worker inside the same Docker container
    • Run the Java app from macOS (via an IDE)

    Docker-compose opens ports:

    ports:
    - 7077:7077
    - 20002:20002
    - 6060:6060
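    # 7077 = Spark master, 20002 = spark.driver.port, 6060 = spark.blockManager.port (see Java config below)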
    

    Java config (for dev purposes):

            esSparkConf.setMaster("spark://127.0.0.1:7077");
            esSparkConf.setAppName("datahub_dev");
    
            esSparkConf.setIfMissing("spark.driver.port", "20002");
            esSparkConf.setIfMissing("spark.driver.host", "MAC_OS_LAN_IP");
            esSparkConf.setIfMissing("spark.driver.bindAddress", "0.0.0.0");
            esSparkConf.setIfMissing("spark.blockManager.port", "6060");
    

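    For completeness, here is a minimal sketch of how a configuration like this might be used to build a context and verify that the executors can reach the driver. It assumes Spark's Java API (SparkConf / JavaSparkContext); the class name and the trivial count job are illustrative additions, not part of the original answer:

        import java.util.Arrays;

        import org.apache.spark.SparkConf;
        import org.apache.spark.api.java.JavaSparkContext;

        public class SparkDriverPortCheck {
            public static void main(String[] args) {
                SparkConf esSparkConf = new SparkConf()
                        .setMaster("spark://127.0.0.1:7077")   // master port published by the Docker container
                        .setAppName("datahub_dev");

                // Fixed ports so executors inside Docker can connect back to the driver on the host
                esSparkConf.setIfMissing("spark.driver.port", "20002");
                esSparkConf.setIfMissing("spark.driver.host", "MAC_OS_LAN_IP"); // replace with the host's LAN IP
                esSparkConf.setIfMissing("spark.driver.bindAddress", "0.0.0.0");
                esSparkConf.setIfMissing("spark.blockManager.port", "6060");

                JavaSparkContext sc = new JavaSparkContext(esSparkConf);

                // A trivial job: if executors cannot reach the driver, this is where the app hangs or fails
                long count = sc.parallelize(Arrays.asList(1, 2, 3, 4)).count();
                System.out.println("count = " + count);

                sc.stop();
            }
        }

    If the driver ports are not published, or spark.driver.host points at an address the executors cannot route to, the count() stage is typically where the job stalls, which matches the "no connection back from executor to the driver" symptom in the question.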