import boto
from boto.s3.key import Key

AWS_ACCESS_KEY_ID = ''
AWS_SECRET_ACCESS_KEY = ''
Bucketname = 'Bucket-name'
The question is answered, but I wanted to include some additional info that helped me. Keep in mind that the latest boto is boto3, but I was stuck using Python 2.7 in a legacy environment.
Authentication
There are at least three ways to authenticate with boto:

1. Pass the credentials (access key, secret key) directly in the connect_to_region() call.
2. Define the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, and omit the credentials from the connect_to_region() call.
3. With boto 2.5.1 or later, let boto use the instance's IAM role to create temporary credentials.
For the first two methods, use the AWS console to create a user with access to the bucket. For the third, create an IAM role with access to the bucket and assign it to the instance. The third way is often the best because you don't have to store credentials in source control or manage them in the environment.
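The three options above can be sketched roughly as follows. The region name and the placeholder credential values are assumptions for illustration; the real boto calls are left commented out since they need a live AWS account:

```python
import os

# Option 1: pass credentials explicitly (placeholder values, not real keys):
# conn = boto.s3.connect_to_region(
#     'us-east-1',
#     aws_access_key_id='AKIA...',
#     aws_secret_access_key='...',
# )

# Option 2: set the environment variables, then call connect_to_region()
# with no credential arguments; boto reads them from the environment.
os.environ['AWS_ACCESS_KEY_ID'] = 'placeholder-key-id'
os.environ['AWS_SECRET_ACCESS_KEY'] = 'placeholder-secret'
# conn = boto.s3.connect_to_region('us-east-1')

# Option 3: on an EC2 instance with an IAM role attached (boto >= 2.5.1),
# call connect_to_region() with no credentials at all; boto fetches
# temporary credentials from the instance metadata service.
# conn = boto.s3.connect_to_region('us-east-1')
```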
Accessing the Bucket
Now on to the mistake I made that produced the same error message the OP saw. The top-level objects in S3 are buckets, and everything below them is a key. In my case the object I wanted to access was at s3:top-level/next-level/object, and I tried to access it like this:
bucket = conn.get_bucket('top-level/next-level')
The point is that next-level is not a bucket but part of the key, and since no bucket named top-level/next-level exists, the call fails with the "Name or service not known" message.
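The split is mechanical: the first path component is the bucket, and everything after the first slash is the key. A small sketch, using a hypothetical split_s3_path helper (the get_bucket/get_key calls are the real boto 2 API, commented out since they need a live connection):

```python
def split_s3_path(path):
    """Split 's3:bucket/rest/of/key' into (bucket, key).

    The first component is the bucket name; everything after the first
    slash is the key. Hypothetical helper, for illustration only.
    """
    if path.startswith('s3:'):
        path = path[len('s3:'):]
    bucket, _, key = path.partition('/')
    return bucket, key

bucket_name, key_name = split_s3_path('s3:top-level/next-level/object')
# bucket_name is 'top-level'; key_name is 'next-level/object'

# With boto 2, fetch the bucket by its name only, then fetch the key:
# conn = boto.s3.connect_to_region('us-east-1')
# bucket = conn.get_bucket(bucket_name)   # bucket name only, no slashes
# key = bucket.get_key(key_name)          # the full path below the bucket
```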