Spark is inventing its own AWS secretKey
Question

I'm trying to read an S3 bucket from Spark, and up until today Spark has always complained that the request returned a 403:

```python
hadoopConf = spark_context._jsc.hadoopConfiguration()
hadoopConf.set("fs.s3a.access.key", "ACCESSKEY")
hadoopConf.set("fs.s3a.secret.key", "SECRETKEY")
hadoopConf.set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")

logs = spark_context.textFile("s3a://mybucket/logs/*")
```

Spark was saying `.... Invalid Access key [ACCESSKEY]`

However, with the same ACCESSKEY and SECRETKEY this was
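For reference, the same S3A properties can also be supplied up front through SparkConf using the `spark.hadoop.` prefix, rather than mutating the Hadoop configuration through the JVM gateway after the context exists. A minimal sketch of that approach is below; the key values and bucket name are placeholders, not known-good values:

```python
from pyspark import SparkConf, SparkContext

# Placeholder credentials and bucket; substitute real values.
# Properties prefixed with "spark.hadoop." are copied into the Hadoop
# Configuration that the S3A filesystem reads.
conf = (
    SparkConf()
    .setAppName("s3a-read")
    .set("spark.hadoop.fs.s3a.access.key", "ACCESSKEY")
    .set("spark.hadoop.fs.s3a.secret.key", "SECRETKEY")
    .set("spark.hadoop.fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
)

sc = SparkContext(conf=conf)

# Same read as above, against the placeholder bucket.
logs = sc.textFile("s3a://mybucket/logs/*")
```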