How to use a Cassandra context in Spark 2.0
Question: In a previous version of Spark, such as 1.6.1, I create a `CassandraSQLContext` from the `SparkContext`:

```scala
import org.apache.spark.{Logging, SparkContext, SparkConf}
import org.apache.spark.sql.cassandra.CassandraSQLContext

// config
val conf: SparkConf = new SparkConf(true)
  .set("spark.cassandra.connection.host", CassandraHost)
  .setAppName(getClass.getSimpleName)

lazy val sc = new SparkContext(conf)

val cassandraSqlCtx: CassandraSQLContext = new CassandraSQLContext(sc)
// Query using the Cassandra context
```

How can I do the same in Spark 2.0?
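In Spark 2.0 the entry point is `SparkSession` rather than `SparkContext`, and the spark-cassandra-connector 2.x line exposes Cassandra tables through the standard DataFrame reader instead of `CassandraSQLContext`. A minimal sketch of the equivalent setup, assuming the spark-cassandra-connector 2.x artifact is on the classpath; the host `127.0.0.1` and the keyspace/table names `test`/`kv` are placeholders, not values from the question:

```scala
import org.apache.spark.sql.SparkSession

// Build a SparkSession carrying the Cassandra connection settings.
// This replaces the SparkConf + SparkContext + CassandraSQLContext chain from 1.6.x.
val spark = SparkSession.builder()
  .appName(getClass.getSimpleName)
  .config("spark.cassandra.connection.host", "127.0.0.1") // placeholder host
  .getOrCreate()

// Read a Cassandra table as a DataFrame via the connector's data source.
// "test" and "kv" are hypothetical keyspace and table names.
val df = spark.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "test", "table" -> "kv"))
  .load()

// Register the DataFrame and query it with plain Spark SQL,
// much like CassandraSQLContext allowed in 1.6.x.
df.createOrReplaceTempView("kv")
spark.sql("SELECT * FROM kv").show()
```

Running this requires a live Spark and Cassandra environment, so it is a sketch of the API shape rather than a standalone program.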