Locally reading S3 files through Spark (or better: pyspark)
Question: I want to read an S3 file from my (local) machine through Spark (pyspark, really). I keep getting authentication errors like:

java.lang.IllegalArgumentException: AWS Access Key ID and Secret Access Key must be specified as the username or password (respectively) of a s3n URL, or by setting the fs.s3n.awsAccessKeyId or fs.s3n.awsSecretAccessKey properties (respectively).

I have looked everywhere here and on the web and tried many things, but apparently S3 has been changing over the last year or
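For context, here is a minimal sketch of the kind of thing I'm attempting, setting the two Hadoop properties the exception names and then reading with the s3n:// scheme. The bucket name, file path, and credential strings are placeholders, and this assumes the S3 filesystem jars are already on Spark's classpath:

```python
from pyspark import SparkConf, SparkContext

# Local Spark context (placeholder app name)
conf = SparkConf().setAppName("s3-read-test").setMaster("local[*]")
sc = SparkContext(conf=conf)

# Set the Hadoop properties named in the exception message
hadoop_conf = sc._jsc.hadoopConfiguration()
hadoop_conf.set("fs.s3n.awsAccessKeyId", "YOUR_ACCESS_KEY_ID")        # placeholder
hadoop_conf.set("fs.s3n.awsSecretAccessKey", "YOUR_SECRET_ACCESS_KEY")  # placeholder

# Read a file from S3 -- bucket and key are made up for illustration
rdd = sc.textFile("s3n://my-bucket/path/to/file.txt")
print(rdd.take(5))
```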