Question
Using Spark 1.5 Streaming with an Actor receiver.
val conf = new SparkConf()
  .setMaster("local[4]")
  .setAppName("ModelTest")
val ssc = new StreamingContext(conf, Seconds(2))

val models = ssc.actorStream[Model](Props(...), "ModelReceiver")

models.foreachRDD { rdd => ... }

ssc.start()
ssc.awaitTermination()
// NEVER GETS HERE!
When the generated actor is shut down, the code never progresses beyond ssc.awaitTermination(). If I kill SBT with Ctrl+C, a println placed after the ssc.awaitTermination() line does complete.
How should Spark be terminated?
Answer 1:
You are correct that Spark Streaming will await termination, as the method name suggests. To kill a streaming application, send a SIGTERM to the driver process, for example with the kill command.
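As a side note not from the original answer: a minimal sketch, assuming Spark 1.5 where the spark.streaming.stopGracefullyOnShutdown configuration is available, that asks Spark's own shutdown hook to stop the StreamingContext gracefully when the driver receives that SIGTERM.

// Hedged sketch (not part of the original answer): with this setting, the
// shutdown hook Spark installs stops the StreamingContext gracefully when
// the driver process receives SIGTERM.
val conf = new SparkConf()
  .setMaster("local[4]")
  .setAppName("ModelTest")
  .set("spark.streaming.stopGracefullyOnShutdown", "true")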
As described in the Spark Standalone documentation, you can also kill a driver running on a Standalone cluster from the command line:
./bin/spark-class org.apache.spark.deploy.Client kill <master url> <driver ID>
You can define code that should run while the process is shutting down by calling sys.ShutdownHookThread:
sys.ShutdownHookThread {
  log.info("Stopping Spark Streaming...")
  ssc.stop(stopSparkContext = true, stopGracefully = true)
  log.info("Shutting down the application...")
}
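For context, here is a minimal sketch (not from the original answer) of how the hook fits into the asker's driver program; the log calls are omitted and the receiver setup is left as a placeholder, since Model and the actor Props come from the question.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object ModelTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local[4]")
      .setAppName("ModelTest")
    val ssc = new StreamingContext(conf, Seconds(2))

    // Register the hook before starting the context, so that a SIGTERM
    // (e.g. `kill <pid>`) stops both the streaming job and the SparkContext.
    sys.ShutdownHookThread {
      ssc.stop(stopSparkContext = true, stopGracefully = true)
    }

    // ... ssc.actorStream[Model](Props(...), "ModelReceiver") and
    //     models.foreachRDD { ... } would be set up here ...

    ssc.start()
    ssc.awaitTermination() // returns once the hook has stopped the context
  }
}

With the hook in place, awaitTermination() returns after the graceful stop completes instead of blocking indefinitely.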
Source: https://stackoverflow.com/questions/32810478/spark-streaming-with-actor-never-terminates