R Shiny and Spark: how to free Spark resources?


TL;DR: SparkSession and SparkContext are not lightweight resources that can be started on demand.

Putting aside all security considerations related to starting a Spark session directly from a user-facing application, maintaining a SparkSession inside the server function (starting the session on entry, stopping it on exit) is simply not a viable option.

The server function is executed every time a new user session starts, effectively restarting the whole Spark application and rendering the project unusable. And this is only the tip of the iceberg: since Spark reuses existing sessions (only one context is allowed per JVM), multi-user access could lead to random failures if a reused session has been stopped by another server call.
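To make the failure mode concrete, here is a minimal sketch of the non-viable pattern described above; the connection parameters (master = "local") are illustrative assumptions, not taken from the question:

library(shiny)
library(sparklyr)

server <- function(input, output, session) {
  # Every new client session pays the full JVM / context startup cost here.
  sc <- spark_connect(master = "local")
  # Stopping on session end can kill a context another session is still reusing.
  session$onSessionEnded(function() spark_disconnect(sc))
  # ... reactive logic using sc ...
}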

One possible solution is to register an onSessionEnded callback that calls spark_disconnect, but I am pretty sure it will be useful only in a single-user environment.
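A minimal sketch of that idea, assuming a connection sc opened once outside the server function (the master = "local" setting is an illustrative assumption):

library(shiny)
library(sparklyr)

sc <- spark_connect(master = "local")

server <- function(input, output, session) {
  # Disconnect when this user's session ends. With a shared connection
  # this is safe only for a single user: the first session to end
  # disconnects the context out from under everyone else.
  session$onSessionEnded(function() {
    spark_disconnect(sc)
  })
}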

Another possible approach is to use a global connection and wrap runApp with a function that calls spark_disconnect_all on exit:

runApp <- function() {
  # Register the cleanup handler before the blocking call, so it also
  # runs if shiny::runApp() exits with an error or an interrupt.
  on.exit({
    spark_disconnect_all()
  })
  shiny::runApp()
}
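The wrapper above assumes the "global connection" part is handled elsewhere; a minimal sketch, e.g. in the app's global.R, with master = "local" again as an illustrative assumption:

library(sparklyr)

# Opened once at app startup and shared by all sessions; freed by
# spark_disconnect_all() in the wrapper above.
sc <- spark_connect(master = "local")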

although in practice the resource manager should free the resources when the driver disassociates, even without stopping the session explicitly.
