Invalid hostname error when connecting to an S3 sink with a secret key containing a forward slash

牧云@^-^@ posted on 2019-12-05 14:21:34

samthebest's solution works; you just have to add quotes ("") around the keys. Here is how to use it:

hadoop distcp -Dfs.s3a.awsAccessKeyId="yourkey" -Dfs.s3a.awsSecretAccessKey="yoursecret" <your_hdfs_path> s3a://<your-bucket>
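For instance, here is a minimal sketch of the same command with a secret that actually contains forward slashes; the key values, HDFS path, and bucket name are placeholders, not real credentials:

# Placeholder credentials (note the "/" characters in the secret)
ACCESS_KEY="AKIAIOSFODNN7EXAMPLE"
SECRET_KEY="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"

# Passing the credentials as -D properties keeps them out of the s3a:// URI,
# where an embedded "/" breaks hostname parsing; the quotes also protect any
# characters the shell might otherwise interpret.
hadoop distcp \
  -Dfs.s3a.awsAccessKeyId="$ACCESS_KEY" \
  -Dfs.s3a.awsSecretAccessKey="$SECRET_KEY" \
  hdfs:///user/me/data \
  s3a://my-bucket/data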

I ended up creating a new secret key without forward slashes. This is a known issue, and generating a new key is the only solution.

Use

-Dfs.s3n.awsAccessKeyId=<your-key> -Dfs.s3n.awsSecretAccessKey=<your-secret-key>

e.g.

hadoop distcp -Dfs.s3n.awsAccessKeyId=<your-key> -Dfs.s3n.awsSecretAccessKey=<your-secret-key> <src> <dst>

or

hadoop fs -Dfs.s3n.awsAccessKeyId=<your-key> -Dfs.s3n.awsSecretAccessKey=<your-secret-key> -<subcommand> <args>
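As a hypothetical sketch of the second form, the following lists a bucket with the credentials passed as -D properties rather than embedded in the s3n:// URI; the key values and bucket name are placeholders:

# Placeholder credentials; the "/" in the secret is harmless here because
# it is never embedded in the s3n:// URI.
hadoop fs \
  -Dfs.s3n.awsAccessKeyId=AKIAIOSFODNN7EXAMPLE \
  -Dfs.s3n.awsSecretAccessKey="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY" \
  -ls s3n://my-bucket/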