Spark and Cassandra Java application: Exception in thread “main” java.lang.NoClassDefFoundError: org/apache/spark/sql/Dataset

You are getting the "java.lang.NoClassDefFoundError: org/apache/spark/sql/Dataset" error because the "spark-sql" dependency is missing from your pom.xml file.

If you want to read a Cassandra table with Spark 2.0.0, you need at least the following dependencies.

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.0</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.11</artifactId>
    <version>2.0.0-M3</version>
</dependency>

Spark 2.0.0 provides the SparkSession and Dataset APIs. Below is a sample program that reads a Cassandra table and prints its records.

import java.util.HashMap;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SparkCassandraDatasetApplication {
    public static void main(String[] args) {
        SparkSession spark = SparkSession
                .builder()
                .appName("SparkCassandraDatasetApplication")
                .config("spark.sql.warehouse.dir", "file:///C:/temp")
                .config("spark.cassandra.connection.host", "127.0.0.1")
                .config("spark.cassandra.connection.port", "9042")
                .master("local[2]")
                .getOrCreate();

        // Read the Cassandra table into a Dataset<Row>
        Dataset<Row> dataset = spark.read().format("org.apache.spark.sql.cassandra")
                .options(new HashMap<String, String>() {
                    {
                        put("keyspace", "mykeyspace");
                        put("table", "mytable");
                    }
                }).load();

        // Print the data
        dataset.show();
        spark.stop();
    }
}
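
If you prefer a typed Dataset instead of Dataset<Row>, a bean encoder can map the rows onto the UserData bean shown later in this post. This is a minimal sketch, assuming the table's column names match the bean properties (id, username):

// Continues from the "dataset" variable in the program above.
// Requires: import org.apache.spark.sql.Encoders;
Dataset<UserData> users = dataset.as(Encoders.bean(UserData.class));
users.printSchema(); // verify the column-to-property mapping
users.show();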

If you still want to use the RDD API, use the sample program below.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import com.datastax.spark.connector.japi.CassandraJavaUtil;

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;

public class SparkCassandraRDDApplication {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("SparkCassandraRDDApplication")
                .setMaster("local[2]")
                .set("spark.cassandra.connection.host", "127.0.0.1")
                .set("spark.cassandra.connection.port", "9042");

        JavaSparkContext sc = new JavaSparkContext(conf);

        // Read the table into an RDD of UserData beans
        JavaRDD<UserData> resultsRDD = javaFunctions(sc)
                .cassandraTable("mykeyspace", "mytable", CassandraJavaUtil.mapRowTo(UserData.class));

        // Print each record
        resultsRDD.foreach(data -> {
            System.out.println(data.getId());
            System.out.println(data.getUsername());
        });

        sc.stop();
    }
}
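
The connector's Java API can also write an RDD back to Cassandra. A minimal sketch (it would go before sc.stop() in main; the keyspace and table names are the same illustrative ones as above):

// Write the beans back to Cassandra; mapToRow is the inverse of mapRowTo.
javaFunctions(resultsRDD)
        .writerBuilder("mykeyspace", "mytable", CassandraJavaUtil.mapToRow(UserData.class))
        .saveToCassandra();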

The JavaBean (UserData) used in the above programs is shown below.

import java.io.Serializable;

public class UserData implements Serializable {
    private String id;
    private String username;

    public String getId() {
        return id;
    }
    public void setId(String id) {
        this.id = id;
    }
    public String getUsername() {
        return username;
    }
    public void setUsername(String username) {
        this.username = username;
    }
}

I think you need to ensure that the following resources are present in your classpath:

cassandra-driver-core-2.1.0.jar
metrics-core-3.0.2.jar
slf4j-api-1.7.5.jar
netty-3.9.0-Final.jar
guava-16.0.1.jar
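
If you manage these with Maven rather than copying jars by hand, the driver (the first jar above) can be declared as below; a sketch only, and note that the spark-cassandra-connector artifact normally pulls the driver in transitively:

<dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-core</artifactId>
    <version>2.1.0</version>
</dependency>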

Hope this helps.

Remove this dependency:

<!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector-java_2.10 -->
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.6.0-M1</version>
</dependency>

You are mixing versions on the classpath. In Spark Cassandra Connector 2.0.0 the Java module is included in the core module, so this artifact only pulls in stale Spark 1.6 references.
