How to share Spark RDD between 2 Spark contexts?
I have an RMI cluster. Each RMI server has a Spark context. Is there any way to share an RDD between different Spark contexts?

As already stated by Daniel Darabos, it is not possible. Every distributed object in Spark is bound to the specific context that was used to create it (a SparkContext in the case of an RDD, a SQLContext in the case of a DataFrame or Dataset). If you want to share objects between applications, you have to use a shared context (see for example spark-jobserver, Livy, or Apache Zeppelin). Since an RDD or DataFrame is just a small local object, there is really not much to share anyway; sharing the underlying data is a different problem entirely.
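To illustrate that last point: two independent applications cannot exchange an RDD handle, but they can exchange the data it describes through shared storage. Below is a minimal sketch of that pattern, assuming a shared filesystem such as HDFS; the path `hdfs:///shared/events.parquet` and the object names are hypothetical, not part of any particular API beyond standard Spark.

```scala
import org.apache.spark.sql.SparkSession

// Application A: has its own SparkSession (and therefore its own SparkContext).
// It persists the *data* (not the RDD/DataFrame handle) to shared storage.
object WriterApp {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("writer").getOrCreate()
    import spark.implicits._

    val events = Seq(("user1", 3), ("user2", 7)).toDF("user", "clicks")
    // Hypothetical shared location (HDFS, S3, etc.)
    events.write.mode("overwrite").parquet("hdfs:///shared/events.parquet")
    spark.stop()
  }
}

// Application B: a completely separate SparkContext.
// It cannot reuse Application A's RDD, but it can read the persisted data.
object ReaderApp {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("reader").getOrCreate()
    val events = spark.read.parquet("hdfs:///shared/events.parquet")
    events.show()
    spark.stop()
  }
}
```

If you need to share live computations rather than persisted data, that is exactly what the shared-context tools mentioned above (spark-jobserver, Livy, Zeppelin) are for: multiple clients submit work to one long-running context instead of each creating their own.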