Share SparkContext between Java and R Apps under the same Master

Backend · Unresolved · 1 answer · 1207 views
悲&欢浪女 asked 2021-01-06 12:22

So here is the setup.

Currently I have two Spark applications initialized. I need to pass data between them (preferably through a shared SparkContext/SQLContext, so that, for example, a table registered in one application can be queried from the other).

1 Answer
  • answered 2021-01-06 12:45

    As far as I know, it is not possible given your current configuration. Tables created with registerTempTable are bound to the specific SQLContext that was used to create the corresponding DataFrame. Even if your Java and SparkR applications use the same master, their drivers run in separate JVMs and cannot share a single SQLContext.
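
    To make the scoping concrete, here is a minimal Scala sketch against the Spark 1.x API (SQLContext / registerTempTable, as in the question). It creates two SQLContexts on top of one SparkContext and shows that a temp table registered through one context is not visible from the other; the names and data are illustrative only:

    ```scala
    // Sketch, Spark 1.x API: temp tables are scoped to the SQLContext
    // that registered them, not to the SparkContext or the master.
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    object TempTableScope {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("temp-table-scope").setMaster("local[*]"))

        val sqlA = new SQLContext(sc)
        val sqlB = new SQLContext(sc) // second context, same SparkContext

        val df = sqlA.createDataFrame(Seq((1, "a"), (2, "b"))).toDF("id", "v")
        df.registerTempTable("shared_t")

        sqlA.sql("SELECT * FROM shared_t").show() // works: same SQLContext

        // sqlB.sql("SELECT * FROM shared_t") throws an analysis error
        // ("Table not found"), because sqlB never registered the table.
        sc.stop()
      }
    }
    ```

    Two separate driver JVMs (your Java app and your SparkR app) are an even stronger version of the sqlB case: they cannot even share the SparkContext, so there is no object through which the registration could propagate.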

    There are tools, like Apache Zeppelin, which take a different approach: a single SQLContext (and SparkContext) is exposed to the individual backends. This way you can register a table using, for example, Scala and read it from Python. There is a fork of Zeppelin which provides some support for SparkR and R; you can check how it starts and interacts with the R backend.
