How to pre-package external libraries when using Spark on a Mesos cluster

無奈伤痛 2020-12-10 05:30

According to the Spark on Mesos docs, one needs to set spark.executor.uri to point to a Spark distribution:

val conf = new SparkConf()
  .set("spark.executor.uri", "<URL of a pre-built Spark distribution>")  // placeholder URI
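
For illustration only, here is a fuller sketch of how that setting might be combined with the application's own jars; the Mesos master URL, the tarball location and the jar name below are placeholders, not values from the question:

import org.apache.spark.{SparkConf, SparkContext}

// All host names, paths and jar names below are placeholders.
val conf = new SparkConf()
  .setMaster("mesos://<mesos-master-host>:5050")
  .setAppName("my-app")
  // Pre-built Spark distribution that every Mesos executor downloads and unpacks
  .set("spark.executor.uri", "hdfs://<namenode>/dist/spark-1.4.1-bin-hadoop2.6.tgz")
  // Application jar(s), e.g. an assembly that bundles the external libraries
  .setJars(Seq("target/my-app-assembly.jar"))

val sc = new SparkContext(conf)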


        
4 Answers
  •  被撕碎了的回忆
    2020-12-10 05:56

    After discovering the Spark JobServer project, I decided that it is the most suitable option for my use case.

    It supports dynamic context creation via a REST API, as well as adding JARs to a newly created context manually or programmatically. It is also capable of running low-latency synchronous jobs, which is exactly what I need.
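
    To make that workflow concrete, here is a minimal sketch of a JobServer job, based on my reading of the JobServer 0.6.x API; the object name, jar name and REST paths in the comments are illustrative, not taken verbatim from the project:

    import com.typesafe.config.Config
    import org.apache.spark.SparkContext
    import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

    // The jar containing this object is uploaded to the JobServer once; the job
    // can then be run synchronously against an already created context.
    object CountJob extends SparkJob {
      // Reject a request early if its configuration is unusable; accept everything here.
      override def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid

      // The return value is serialized into the HTTP response of a synchronous call.
      override def runJob(sc: SparkContext, config: Config): Any =
        sc.parallelize(1 to 1000).count()
    }

    // Rough REST usage (see the JobServer README for the exact parameters):
    //   POST /jars/myapp                       -- upload the assembly jar
    //   POST /contexts/my-context              -- create a long-running context
    //   POST /jobs?appName=myapp&classPath=CountJob&context=my-context&sync=true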

    I created a Dockerfile so you can try it out with the most recent (supported) versions of Spark (1.4.1) and Spark JobServer (0.6.0), with built-in Mesos support (0.24.1):

    • https://github.com/tobilg/docker-spark-jobserver
    • https://hub.docker.com/r/tobilg/spark-jobserver/

    References:

    • https://github.com/spark-jobserver/spark-jobserver#features
    • https://github.com/spark-jobserver/spark-jobserver#context-configuration
