How to use Cassandra Context in Spark 2.0

Posted by 孤人 on 2019-12-02 08:17:48

Question


In previous versions of Spark such as 1.6.1, I created a Cassandra context from the Spark context:

import org.apache.spark.{ Logging, SparkContext, SparkConf }
import org.apache.spark.sql.cassandra.CassandraSQLContext

// Configuration
val conf: SparkConf = new SparkConf(true)
  .set("spark.cassandra.connection.host", CassandraHost)
  .setAppName(getClass.getSimpleName)
lazy val sc = new SparkContext(conf)
val cassandraSqlCtx: CassandraSQLContext = new CassandraSQLContext(sc)

// Query using the Cassandra context
cassandraSqlCtx.sql("select id from table ")

But in Spark 2.0, the Spark context has been replaced by the Spark session. How can I use the Cassandra context?


Answer 1:


Short Answer: You don't. It has been deprecated and removed.

Long Answer: You don't want to. The HiveContext provides everything except the catalogue and supports a much wider range of SQL (HQL). In Spark 2.0 this just means you will need to manually register Cassandra tables using createOrReplaceTempView until an ExternalCatalogue is implemented.
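The snippets below assume a SparkSession that already has the Cassandra connector configured. A minimal sketch of building one might look like this (the host value "127.0.0.1" is just a placeholder, not part of the original answer):

import org.apache.spark.sql.SparkSession

// Minimal sketch: a SparkSession with the Cassandra connector configured.
// "127.0.0.1" is a placeholder for the actual Cassandra contact point.
val spark: SparkSession = SparkSession.builder()
  .appName(getClass.getSimpleName)
  .config("spark.cassandra.connection.host", "127.0.0.1")
  .getOrCreate()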

In SQL this looks like:

spark.sql("""CREATE TEMPORARY TABLE words
     |USING org.apache.spark.sql.cassandra
     |OPTIONS (
     |  table "words",
     |  keyspace "test")""".stripMargin)

In the raw DataFrame API it looks like:

spark
 .read
 .format("org.apache.spark.sql.cassandra")
 .options(Map("keyspace" -> "test", "table" -> "words"))
 .load
 .createOrReplaceTempView("words")

Both of these commands will register the table "words" for SQL queries.
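Once registered, the view can be queried through the session like any other table. For example (a minimal sketch; the actual columns depend on the schema of the underlying Cassandra table):

// Query the temporary view "words" just like any other SQL table.
// Selecting * avoids assuming specific column names.
spark.sql("SELECT * FROM words").show()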



Source: https://stackoverflow.com/questions/39423131/how-to-use-cassandra-context-in-spark-2-0
