Spark + S3 - error - java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.S3AFileSystem not found

Unresolved · 4 answers · 1177 views

抹茶落季 2020-12-20 14:35

I have a Spark EC2 cluster to which I am submitting a PySpark program from a Zeppelin notebook. I have loaded hadoop-aws-2.7.3.jar and aws-java-sdk-1.11.179.jar and placed
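For this class of error, the jars must be visible to both the driver and the executors. A minimal sketch of one common way to wire this up at submit time (the paths and script name here are hypothetical placeholders, and the jar versions must match your Hadoop build):

```shell
# Hypothetical paths: put hadoop-aws and its matching aws-java-sdk jar on the
# classpath of the driver and all executors, and point the s3a scheme at the
# S3AFileSystem implementation they provide.
spark-submit \
  --jars /path/to/hadoop-aws-2.7.3.jar,/path/to/aws-java-sdk-1.11.179.jar \
  --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
  my_job.py
```

A frequent cause of this ClassNotFoundException is a version mismatch: hadoop-aws-2.7.3 is built against a specific aws-java-sdk version, so mixing it with a much newer SDK jar can fail at runtime even when both jars load.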

4 answers
  •  孤城傲影
    2020-12-20 14:59

    If none of the above works, check whether the missing class is actually present in the jar — there is a high possibility that the jar is corrupted. For example, if you get "class AmazonServiceException not found", run a grep in the directory where the jar already sits, as shown below.

    grep "AmazonServiceException" *.jar
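    Since a jar is just a zip archive, the same check can be done programmatically by listing its entries. A minimal Python sketch of the idea — it builds a tiny in-memory "jar" so it is self-contained; the class path inside it is illustrative, not the real SDK layout:

    ```python
    import io
    import zipfile

    # A jar is a zip archive; simulate one in memory with a single class entry.
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as jar:
        jar.writestr("com/amazonaws/AmazonServiceException.class", b"\xca\xfe\xba\xbe")

    # Search the entry list for the missing class, analogous to grepping the jar.
    with zipfile.ZipFile(buf) as jar:
        hits = [name for name in jar.namelist() if "AmazonServiceException" in name]

    print(hits)  # → ['com/amazonaws/AmazonServiceException.class']
    ```

    If the entry list is empty or the archive fails to open at all, the jar is likely corrupted and should be re-downloaded.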
