How to load Spark Cassandra Connector in the shell?


I am trying to use Spark Cassandra Connector in Spark 1.1.0.

I have successfully built the jar file from the master branch on GitHub and have gotten the included demos to work.

6 Answers

    I got it. Below is what I did:

    $ git clone https://github.com/datastax/spark-cassandra-connector.git
    $ cd spark-cassandra-connector
    $ sbt/sbt assembly
    $ $SPARK_HOME/bin/spark-shell --jars ~/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/connector-assembly-1.2.0-SNAPSHOT.jar 
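
    If your copy of spark-shell accepts the --conf flag (an assumption about your Spark build; check spark-shell --help), the master URL and the Cassandra host can also be passed at launch, so the SparkContext that spark-shell creates is already configured. The host names here are the same placeholders used below:

    $ $SPARK_HOME/bin/spark-shell \
        --master "spark://spark host:7077" \
        --jars ~/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/connector-assembly-1.2.0-SNAPSHOT.jar \
        --conf spark.cassandra.connection.host="my cassandra host"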
    

    At the Scala prompt:

    scala> sc.stop  // stop the SparkContext that spark-shell created, so it can be replaced
    scala> import com.datastax.spark.connector._
    scala> import org.apache.spark.SparkContext
    scala> import org.apache.spark.SparkContext._
    scala> import org.apache.spark.SparkConf
    scala> val conf = new SparkConf(true).set("spark.cassandra.connection.host", "my cassandra host")
    scala> val sc = new SparkContext("spark://spark host:7077", "test", conf)  // new context that knows the Cassandra host
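
    As a quick sanity check that the new context can reach Cassandra, the cassandraTable method brought in by the connector import can read a table. The keyspace and table names below ("test" / "kv") are placeholders for whatever already exists in your cluster:

    scala> val rdd = sc.cassandraTable("test", "kv")
    scala> rdd.count
    scala> rdd.first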
    
