Spark is inventing its own AWS secretKey

攒了一身酷 2020-12-19 12:58

I'm trying to read an S3 bucket from Spark, and up until today Spark has always complained that the request returns a 403.

hadoopConf = spark_context._jsc.hadoopConfiguration()

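For reference, a minimal sketch of the kind of setup involved (assuming the hadoop-aws / s3a connector is on the classpath; the key values and bucket path are placeholders, not from the original post):

    from pyspark import SparkContext

    spark_context = SparkContext.getOrCreate()

    # Pass the AWS credentials to the S3A filesystem through the Hadoop configuration.
    hadoopConf = spark_context._jsc.hadoopConfiguration()
    hadoopConf.set("fs.s3a.access.key", "MY_ACCESS_KEY")   # placeholder
    hadoopConf.set("fs.s3a.secret.key", "MY_SECRET_KEY")   # placeholder

    # Read from the bucket; a 403 here means S3 rejected the signed request.
    rdd = spark_context.textFile("s3a://my-bucket/path/to/data.txt")
    print(rdd.count())
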
3 Answers
  •  粉色の甜心
    2020-12-19 13:22

    I ran into a similar issue. Requests that were using valid AWS credentials returned a 403 Forbidden, but only on certain machines. Eventually I found out that the system time on those particular machines was 10 minutes behind. Synchronizing the system clock solved the problem.

    Hope this helps!
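
    As a quick sanity check (a sketch, not part of the original answer: it simply compares the local clock against the Date header returned by S3's public endpoint), you can estimate the drift from Python:

        import urllib.request
        from datetime import datetime, timezone
        from email.utils import parsedate_to_datetime

        # Any HTTPS endpoint that returns a Date header works; S3's is used here for illustration.
        resp = urllib.request.urlopen("https://s3.amazonaws.com", timeout=10)
        server_time = parsedate_to_datetime(resp.headers["Date"])
        drift = abs((datetime.now(timezone.utc) - server_time).total_seconds())

        # AWS rejects signed requests when the clock is too far off (on the order of minutes),
        # and this shows up as a 403 Forbidden rather than a clearer error.
        print(f"Approximate clock drift: {drift:.0f} seconds")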
