Question
I am trying to run my first Spark job (a Scala job that accesses Cassandra), and it fails with the following error:
java.io.IOException: Failed to open native connection to Cassandra at {<ip>}:9042
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:164)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
...........
............
Caused by: java.lang.IllegalArgumentException: Contact points contain multiple data centers:
at com.datastax.spark.connector.cql.LocalNodeFirstLoadBalancingPolicy.init(LocalNodeFirstLoadBalancingPolicy.scala:47)
at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1099)
at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:271)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:157)
What am I doing wrong here?
I am using:
- Spark 1.5.2
- Apache Cassandra 2.1.10
- spark-cassandra-connector 1.3.1 / 1.5.0-M2 (tried both connectors)
- Scala version 2.10.4
Answer 1:
--> According to the author, a fix for this is a work in progress. See the comments below this answer.
I found this in the connector's source code (the LocalNodeFirstLoadBalancingPolicy from the stack trace); I hope it will help you:
override def init(cluster: Cluster, hosts: JCollection[Host]) {
  nodes = hosts.toSet
  // use explicitly set DC if available, otherwise see if all contact points have same DC
  // if so, use that DC; if not, throw an error
  dcToUse = localDC match {
    case Some(local) => local
    case None =>
      val dcList = dcs(nodesInTheSameDC(contactPoints, hosts.toSet))
      if (dcList.size == 1)
        dcList.head
      else
        throw new IllegalArgumentException(s"Contact points contain multiple data centers: ${dcList.mkString(", ")}")
  }
  clusterMetadata = cluster.getMetadata
}
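To see why the exception fires, the decision the connector makes in init() can be mirrored in plain Java: use the explicitly configured local DC if one is set; otherwise all contact points must resolve to a single DC. This is a hypothetical standalone helper for illustration only, not part of the connector's API:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Optional;
import java.util.Set;

public class DcSelection {

    // Mirrors the connector's logic: an explicitly set local DC wins;
    // otherwise every contact point must belong to the same data center.
    static String chooseDc(Optional<String> localDc, Set<String> contactPointDcs) {
        if (localDc.isPresent()) {
            return localDc.get();
        }
        if (contactPointDcs.size() == 1) {
            return contactPointDcs.iterator().next();
        }
        throw new IllegalArgumentException(
                "Contact points contain multiple data centers: "
                        + String.join(", ", contactPointDcs));
    }

    public static void main(String[] args) {
        // All contact points in one DC: that DC is used.
        System.out.println(chooseDc(Optional.empty(),
                new HashSet<>(Arrays.asList("DC1"))));

        // Contact points spanning DCs: only works if a local DC is set explicitly;
        // with Optional.empty() here this call would throw IllegalArgumentException.
        System.out.println(chooseDc(Optional.of("DC1"),
                new HashSet<>(Arrays.asList("DC1", "DC2"))));
    }
}
```

So the error in the question means the hosts listed as contact points resolved to more than one data center, and no local DC was configured to break the tie.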
Answer 2:
I was facing the same issue while trying to connect to two Cassandra data centers using Apache Spark 2.x.x.
import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public class SparkCassandraTest {

    // Placeholder values for the constants referenced below.
    private static final String APP_NAME = "SparkCassandraTest";
    private static final String CASSANDRA_USERNAME = "cassandra";
    private static final String CASSANDRA_PASSWORD = "cassandra";
    private static final String CASSANDRA_ENDPOINTS =
            "DC1_node1,DC1_node2,DC1_node3,DC2_node1,DC2_node2,DC2_node3";

    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf().setAppName(APP_NAME);
        sparkConf.set("spark.cassandra.connection.host", CASSANDRA_ENDPOINTS);
        sparkConf.set("spark.cassandra.auth.username", CASSANDRA_USERNAME);
        sparkConf.set("spark.cassandra.auth.password", CASSANDRA_PASSWORD);

        SparkSession sparkSession = SparkSession.builder()
                .config(sparkConf)
                .enableHiveSupport()
                .getOrCreate();
        //.....................
    }
}
Caused by: java.lang.IllegalArgumentException: requirement failed: Contact points contain multiple data centers: DC2-XXXXX2, DC1-XXXXX1
I resolved this issue by connecting to only one Cassandra data center at a time: either (DC1_node1,DC1_node2,DC1_node3) or (DC2_node1,DC2_node2,DC2_node3).
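One way to enforce this in code is to filter the endpoint list down to a single data center before passing it to spark.cassandra.connection.host. The node-to-DC map below is hypothetical, for illustration; in practice it would come from your own deployment inventory:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class SingleDcEndpoints {

    // Keep only the nodes belonging to targetDc and join them into the
    // comma-separated form expected by spark.cassandra.connection.host.
    static String endpointsFor(String targetDc, Map<String, String> nodeToDc) {
        return nodeToDc.entrySet().stream()
                .filter(e -> e.getValue().equals(targetDc))
                .map(Map.Entry::getKey)
                .sorted()
                .collect(Collectors.joining(","));
    }

    public static void main(String[] args) {
        // Hypothetical inventory mapping node names to data centers.
        Map<String, String> nodeToDc = new HashMap<>();
        nodeToDc.put("DC1_node1", "DC1");
        nodeToDc.put("DC1_node2", "DC1");
        nodeToDc.put("DC2_node1", "DC2");

        // Only DC1 nodes end up in the contact-point list.
        System.out.println(endpointsFor("DC1", nodeToDc)); // DC1_node1,DC1_node2
    }
}
```

With the contact points confined to one DC, the connector's single-DC check in Answer 1 passes and the session opens normally.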
Source: https://stackoverflow.com/questions/34004959/cannot-connect-to-cassandra-from-spark-contact-points-contain-multiple-data-cen