I'm working in a Python web environment, and I can simply upload a file from the filesystem to S3 using boto's key.set_contents_from_filename(path/to/file). However, I'd like to upload an image directly from a remote URL to S3 without first saving it to disk. Is there a way to do that?
OK, from @garnaat, it doesn't sound like S3 currently allows uploads by URL. I managed to upload remote images to S3 by reading them into memory only, never touching the local disk. This works:
import StringIO
import urllib2

import boto
from boto.s3.key import Key

from django.conf import settings  # assuming a Django project; AWS credentials live in settings


def upload(url):
    try:
        conn = boto.connect_s3(settings.AWS_ACCESS_KEY_ID, settings.AWS_SECRET_ACCESS_KEY)
        bucket = conn.get_bucket(settings.AWS_STORAGE_BUCKET_NAME)
        k = Key(bucket)
        k.key = url.split('/')[-1]                    # in my situation, the ids at the end are unique
        file_object = urllib2.urlopen(url)            # file-like HTTP response
        fp = StringIO.StringIO(file_object.read())    # read the body into an in-memory buffer
        k.set_contents_from_file(fp)
        return "Success"
    except Exception, e:
        return e
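
For anyone doing this today: boto, urllib2 and StringIO are all Python 2 era. Below is a minimal sketch of the same no-local-disk approach using boto3 and the Python 3 standard library; the upload_from_url name, the bucket_name parameter and the last-path-segment key are my own assumptions, not part of the code above.

# Minimal sketch of the same idea with boto3 (Python 3). Assumptions:
# credentials come from boto3's usual resolution chain, and the last
# path segment of the URL is a safe, unique key.
import urllib.request

import boto3


def upload_from_url(url, bucket_name):
    s3 = boto3.client('s3')
    key = url.split('/')[-1]
    with urllib.request.urlopen(url) as response:
        # upload_fileobj streams from the file-like HTTP response,
        # so the body never touches the local filesystem.
        s3.upload_fileobj(response, bucket_name, key)
    return key

Because upload_fileobj reads directly from the file-like response object, the body is streamed in chunks rather than buffered whole in memory.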
Also, thanks to the related question "How can I create a GzipFile instance from the “file-like object” that urllib.urlopen() returns?" for the in-memory wrapping trick.