Unable to connect Spark to Cassandra DB in RStudio

Submitted by 回眸只為那壹抹淺笑 on 2020-01-26 00:39:05

Question


I've spent the last week trying to figure out how to use sparklyr to get Spark to connect to Cassandra on our local cluster, and I've hit a wall — any help would be greatly appreciated. I'm the only one trying to use R/RStudio to make this connection (everyone else uses Java on NetBeans and Maven), and I'm not sure what I need to do to make this work.

The stack I'm using is:

- Ubuntu 16.04 (in a VM)
- sparklyr: 0.5.3
- Spark: 2.0.0
- Scala: 2.11
- Cassandra: 3.7

Relevant config.yml settings:

# cassandra settings
spark.cassandra.connection.host: <cluster_address>
spark.cassandra.auth.username: <user_name>
spark.cassandra.auth.password: <password>

sparklyr.defaultPackages:
- com.databricks:spark-csv_2.11:1.3.0
- com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-M1
- com.datastax.cassandra:cassandra-driver-core:3.0.2

Sys.setenv is set to point at the local installs of Java and Spark, and the config is set to use the yml file. The Spark connection is initiated with:

sc <- spark_connect(master = "spark://<cluster_address>", config = spark_config(file = "config.yml"))

The Spark session is initiated with:

sparkSession <- sparklyr::invoke_static(sc, "org.apache.spark.sql.SparkSession", "builder") %>% 
    sparklyr::invoke("config", "spark.cassandra.connection.host", "<cluster_address>") %>% 
    sparklyr::invoke("getOrCreate")

It all seems fine up to here (the sc connection and sparkSession), but attempting to access a Cassandra table (table_1 in keyspace_1), which I know exists:

cass_df <- invoke(sparkSession, "read") %>% 
    invoke("format", "org.apache.spark.sql.cassandra") %>% 
    invoke("option", "keyspace", "keyspace_1") %>% 
    invoke("option", "table", "table_1") %>% 
    invoke("load")

throws the following error:

Error: java.lang.IllegalArgumentException: Cannot build a cluster without contact points
at com.datastax.driver.core.Cluster.checkNotEmpty(Cluster.java:123)
at com.datastax.driver.core.Cluster.(Cluster.java:116)
at com.datastax.driver.core.Cluster.buildFrom(Cluster.java:182)
at com.datastax.driver.core.Cluster$Builder.build(Cluster.java:1274)
at com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:92) . . .

Answer 1:


Finally solved it, thanks to a useful tip. I was using the Spark master address (with its port number) as the Cassandra connection host when initialising the SparkSession, rather than the address of the cluster node where Cassandra was actually located. With the plain cluster address it works! Thanks @user7337271.
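For reference, a minimal sketch of the corrected setup (the host names below are assumed placeholders — substitute your own). The point is that the Spark master URL keeps its port, while `spark.cassandra.connection.host` gets only the Cassandra node's address, with no port attached:

```r
library(sparklyr)

config <- spark_config(file = "config.yml")

# The Cassandra contact point must be a bare host address -- no Spark port.
# "cassandra-host.example.com" is a placeholder, not a real host.
config$spark.cassandra.connection.host <- "cassandra-host.example.com"

# The port belongs only in the Spark master URL (7077 is the default
# port for a standalone Spark master).
sc <- spark_connect(master = "spark://spark-master.example.com:7077",
                    config = config)
```

Passing the master URL (host:port) as the Cassandra contact point is what produces the "Cannot build a cluster without contact points" error, since the driver cannot resolve it to a Cassandra node.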



Source: https://stackoverflow.com/questions/41877859/unable-to-connect-spark-to-cassandra-db-in-rstudio
