Unable to connect to AWS S3 bucket using boto

忘掉有多难 · 2020-12-14 11:15
AWS_ACCESS_KEY_ID = ''
AWS_SECRET_ACCESS_KEY = ''
Bucketname = 'Bucket-name'

import boto
from boto.s3.key import Key
import boto.s3.connection

6 Answers
  • 2020-12-14 11:27

    You can also use the following (boto.s3.connect_to_region):

    import boto
    from boto.s3.key import Key
    import boto.s3.connection
    
    AWS_ACCESS_KEY_ID = '<access key>'
    AWS_SECRET_ACCESS_KEY = '<my secret key>'
    Bucketname = 'Bucket-name' 
    
    
    conn = boto.s3.connect_to_region('ap-southeast-1',
           aws_access_key_id=AWS_ACCESS_KEY_ID,
           aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
           is_secure=True,               # set to False if you are not using SSL
           calling_format=boto.s3.connection.OrdinaryCallingFormat(),
           )
    bucket = conn.get_bucket(Bucketname)
    

    This way you don't have to care about the exact endpoint or the full hostname. And yes, as @garnaat mentioned, use the latest boto API.
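    A quick way to confirm the connection works is to list a few keys from the connected bucket. This is a minimal sketch reusing the conn/bucket objects above; it assumes the credentials have list permission on the bucket:

    # List the keys in the bucket to verify the connection and credentials.
    for key in bucket.list():
        print(key.name)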

  • 2020-12-14 11:37

    There is a typo in the host parameter. The correct host is s3-ap-southeast-1.amazonaws.com (a hyphen after "s3", not a dot).

    Reference: Amazon Regions and Endpoints
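    For illustration, a connection using the corrected endpoint could look like the sketch below. It reuses the variable names from the question and assumes the original code passed an explicit host to boto.connect_s3:

    import boto
    import boto.s3.connection

    conn = boto.connect_s3(
        aws_access_key_id=AWS_ACCESS_KEY_ID,
        aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
        host='s3-ap-southeast-1.amazonaws.com',  # hyphen after s3, not a dot
        calling_format=boto.s3.connection.OrdinaryCallingFormat())
    bucket = conn.get_bucket(Bucketname)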

  • 2020-12-14 11:38

    The question is answered, but I wanted to include some additional info that helped me. Keep in mind the latest boto is boto3, but I was stuck with Python 2.7 in a legacy environment.

    Authentication

    There are at least three ways to authenticate with boto. First, you can include the credentials (access key, secret key) in the connect_to_region() call. Second, you can define the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY and not supply credentials in the connect_to_region() call. Finally, with boto 2.5.1 or later, boto can use the instance's IAM role to create temporary credentials.

    For the first two, you need to use the AWS console to create a user with access to the bucket. For the third, create an IAM role with access to the bucket and assign it to the instance. The third way is often the best, because you then don't have to store credentials in source control or manage them in the environment.
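    A minimal sketch of the three options (the region is only an example; options 2 and 3 look identical in code because boto picks the credentials up on its own):

    import boto.s3

    # 1. Explicit credentials in the call.
    conn = boto.s3.connect_to_region('ap-southeast-1',
           aws_access_key_id='<access key>',
           aws_secret_access_key='<secret key>')

    # 2. / 3. Omit the credential arguments: boto falls back to the
    # AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables,
    # or (boto >= 2.5.1) to temporary credentials from the instance's IAM role.
    conn = boto.s3.connect_to_region('ap-southeast-1')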

    Accessing the Bucket

    Now on to the mistake I made that caused the same message as the OP got. The top-level objects in S3 are buckets, and everything below them is a key. In my case the object I wanted to access was at s3:top-level/next-level/object. I tried to access it like this:

    bucket = conn.get_bucket('top-level/next-level')
    

    The point is that next-level is not a bucket but part of a key, and you'll get the "Name or service not known" message when the bucket doesn't exist.
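    The fix is to open the bucket by its top-level name and treat the rest of the path as the key. A sketch using the hypothetical names from the example above:

    bucket = conn.get_bucket('top-level')
    key = bucket.get_key('next-level/object')   # the "folder" is just part of the key
    data = key.get_contents_as_string()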

  • 2020-12-14 11:40

    Gotcha: capture the traffic on your network link and make sure the CNAMEs in the DNS queries do NOT contain a '\r' character, e.g. in the bucket name.
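    In practice this usually means the bucket name was read from a file or environment variable with a trailing newline or carriage return. A small sketch of the defensive fix, assuming the Bucketname variable from the question:

    # Strip stray whitespace/control characters (e.g. '\r') that would otherwise
    # end up in the DNS query for the bucket's hostname.
    Bucketname = Bucketname.strip()
    bucket = conn.get_bucket(Bucketname)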

  • 2020-12-14 11:41

    The request to the host s3.ap-southeast-1.amazonaws.com is failing. I also cannot resolve it from my end. Check your bucket settings for the correct host.

    There might also be a problem with your internet connection or your DNS server. Try pinging the host from the command line and see whether it resolves. Alternatively, try a different DNS server.

    Edit: quick googling suggests that the host should be s3-ap-southeast-1.amazonaws.com.
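    A quick way to check the resolution directly from Python (a sketch; substitute whichever host you are targeting):

    import socket

    # Raises socket.gaierror ("Name or service not known") if the host cannot
    # be resolved, which is the same error boto surfaces.
    print(socket.gethostbyname('s3-ap-southeast-1.amazonaws.com'))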

  • 2020-12-14 11:47
    from boto3.session import Session

    ACCESS_KEY = 'your_access_key'
    SECRET_KEY = 'your_secret_key'

    # Build a session with explicit credentials, then use the S3 resource.
    session = Session(aws_access_key_id=ACCESS_KEY,
                      aws_secret_access_key=SECRET_KEY)
    s3 = session.resource('s3')
    my_bucket = s3.Bucket('bucket_name')

    # List every object in the bucket.
    for s3_file in my_bucket.objects.all():
        print(s3_file.key)
