amazon-s3

Failing to create S3 buckets in specific regions

喜夏-厌秋 submitted 2020-08-09 09:14:30

Question: I'm trying to create an S3 bucket in every AWS region with boto3 in Python, but the call fails in four regions (af-south-1, eu-south-1, ap-east-1, and me-south-1). My Python code:

    def create_bucket(name, region):
        s3 = boto3.client('s3')
        s3.create_bucket(Bucket=name, CreateBucketConfiguration={'LocationConstraint': region})

and the exception I get:

    botocore.exceptions.ClientError: An error occurred (InvalidLocationConstraint) when calling the CreateBucket operation: The specified…
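The four failing regions are the opt-in regions, and CreateBucket for them has to be sent to that region's own endpoint; a client created without `region_name` may also not recognize newer LocationConstraint values. A minimal sketch of the likely fix (bucket names are placeholders; the account must additionally have these regions enabled):

```python
def bucket_params(name, region):
    """Build CreateBucket kwargs; us-east-1 must NOT send a LocationConstraint."""
    params = {"Bucket": name}
    if region != "us-east-1":
        params["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return params

def create_bucket(name, region):
    # Deferred import so the helper above stays usable without boto3 installed.
    import boto3

    # Creating the client *in* the target region routes the request to that
    # region's endpoint, which opt-in regions such as af-south-1 require.
    s3 = boto3.client("s3", region_name=region)
    return s3.create_bucket(**bucket_params(name, region))
```

Note that af-south-1, eu-south-1, ap-east-1, and me-south-1 are disabled by default and have to be enabled for the account (console: Account settings → AWS Regions) before any API call against them succeeds.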

How to host Nuxt-generated static SPA files from dist on AWS S3

梦想的初衷 submitted 2020-08-09 08:57:09

Question: How do I host a static Nuxt web app on AWS S3? First, I know how to generate the static SPA files into ./dist by running nuxt generate. Second, AWS S3 supports static website hosting, but it seems the site has to be accessed by visiting the index.html inside the bucket. So I ran into this problem: for example, I have a bucket demo2020, and I uploaded the Nuxt/Vue files from ./dist into it. I have also made the bucket and files public. After that, I can visit images in…
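With S3 static website hosting enabled, the site is served from the bucket's website endpoint rather than the REST URL, and the configured index document makes `/` resolve without typing index.html. A sketch with boto3; demo2020 comes from the question, and the 200.html error document assumes Nuxt's optional SPA fallback (`generate.fallback`) is enabled:

```python
def website_configuration(index="index.html", error="200.html"):
    # 200.html is Nuxt's optional SPA fallback file; pointing ErrorDocument
    # at it lets client-side routes load even without a matching S3 object.
    return {
        "IndexDocument": {"Suffix": index},
        "ErrorDocument": {"Key": error},
    }

def website_endpoint(bucket, region):
    # Older regions use the s3-website-<region> form shown here; some newer
    # regions use s3-website.<region> with a dot instead of a dash.
    return f"http://{bucket}.s3-website-{region}.amazonaws.com"

def enable_hosting(bucket, region):
    import boto3  # deferred so the helpers above run without boto3 installed

    s3 = boto3.client("s3", region_name=region)
    s3.put_bucket_website(
        Bucket=bucket, WebsiteConfiguration=website_configuration()
    )
```

After `enable_hosting("demo2020", region)`, the site should load at the URL returned by `website_endpoint("demo2020", region)` without appending index.html.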

How to use a pretrained model from S3 to predict on new data?

送分小仙女□ submitted 2020-08-09 05:41:05

Question: I have trained a semantic segmentation model using SageMaker, and the output has been saved to an S3 bucket. I want to load this model from S3 to run predictions on images in SageMaker. I know how to predict if I leave the notebook instance running after training, since that is just an easy deploy, but that doesn't help if I want to use an older model. I have looked at these sources and come up with something myself, but it doesn't work, hence me being here: https://course.fast.ai…
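With the SageMaker Python SDK (v2), a training artifact already sitting in S3 can be wrapped in a `Model` and deployed without the original training job or notebook session. A sketch under that assumption; `model_data`, `image_uri`, and `role` are placeholders the caller must supply:

```python
def is_model_artifact(uri):
    # SageMaker training jobs write their artifact as s3://.../model.tar.gz.
    return uri.startswith("s3://") and uri.endswith("model.tar.gz")

def deploy_from_s3(model_data, image_uri, role,
                   instance_type="ml.m5.xlarge"):
    # Deferred import so the helper above works without the sagemaker SDK.
    from sagemaker.model import Model

    if not is_model_artifact(model_data):
        raise ValueError(f"not a model artifact: {model_data}")
    # image_uri should be the same semantic-segmentation inference container
    # the training job used; role is an IAM role with SageMaker permissions.
    model = Model(image_uri=image_uri, model_data=model_data, role=role)
    return model.deploy(initial_instance_count=1,
                        instance_type=instance_type)
```

The returned predictor exposes `predict()` much like the one returned by `estimator.deploy()`, and `delete_endpoint()` tears the endpoint down when done.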

How to synchronously upload files to S3 using aws-sdk?

一个人想着一个人 submitted 2020-08-08 18:16:31

Question: I'm attempting to upload files to my S3 bucket and then return from my upload function. The problem is that the function returns before the upload has stored the data. I've tried async/await with s3.upload, but I don't believe s3.upload returns a promise, so it doesn't do anything. Example:

    for (const file of files) {
        const params = { Bucket: BUCKET_NAME, Key: file.name, Body: file.data };
        const stored = await s3.upload(params, (err, data) => {
            if (err) console.log("error"…
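In the Node aws-sdk the usual fix is to drop the callback and `await s3.upload(params).promise()`, since passing a callback switches the call into callback mode. The same sequential pattern expressed in Python (the language of the other boto3 questions on this page) needs no promise at all, because boto3 calls block until the upload completes; the `(key, body)` pair shape below is a hypothetical stand-in for the question's file objects:

```python
def upload_all(s3, bucket, files):
    # Each put_object call blocks until S3 has stored the data, so this
    # function only returns after every file has finished uploading.
    stored = []
    for key, body in files:
        s3.put_object(Bucket=bucket, Key=key, Body=body)
        stored.append(key)
    return stored
```

Any object with a `put_object` method works here, e.g. `boto3.client("s3")`, which also makes the loop easy to exercise with a stub client in tests.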

How to insert from CSV into DynamoDB

守給你的承諾、 submitted 2020-08-08 08:18:34

Question: DynamoDB has one table, employees, with primary key id. The data.csv below was uploaded to the csvdynamo bucket (bucket_name = csvdynamo):

    id,name,co
    20,AB,PC
    21,CD,PC
    22,EF,MC
    23,GH,MC

I need to insert this CSV into DynamoDB. Pseudocode:

    for emp in employees:
        emp_data = emp.split(',')
        print(emp_data)
        try:
            table.put_item(
                Item={
                    "emp_id": int(emp_data[0]),
                    "Name": emp_data[1],
                    "Company": emp_data[2]
                }
            )
        except Exception as e:
            pass

Answer 1: Here is an example of a Lambda function which works, as I…
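The pseudocode above can be made concrete by parsing the CSV with the csv module (so the header row and quoted fields are handled properly) and writing through a `batch_writer`, which batches the PutItem calls. Attribute names follow the question's mapping from the `id,name,co` header:

```python
import csv
import io

def rows_from_csv(text):
    # Map the id,name,co columns onto the item attributes from the question.
    reader = csv.DictReader(io.StringIO(text))
    return [
        {"emp_id": int(row["id"]), "Name": row["name"], "Company": row["co"]}
        for row in reader
    ]

def load_into_table(table, text):
    # table is a boto3 DynamoDB Table resource; batch_writer automatically
    # flushes puts in batches (up to 25 items per request).
    with table.batch_writer() as batch:
        for item in rows_from_csv(text):
            batch.put_item(Item=item)
```

In a Lambda triggered by the S3 upload, `text` would be the object body read from the csvdynamo bucket, decoded to a string before being passed in.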

Rails API ActiveStorage: Get Public URL to display image from AWS S3 Bucket?

我只是一个虾纸丫 submitted 2020-08-08 06:45:26

Question: I have a Rails 5.2 API set up and have followed the documentation on attaching images to a model object; that all works fine. The problem is that I want to return the attachment's public URL in a JSON object, so I can use that URL as the source in an <img src=… in my React front end. Is there a way to return the actual URL from the AWS S3 bucket, where the image would show up if clicked on? So far I've tried rails_blob_path and service_url, and I do get URLs in return,…
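Under the hood, ActiveStorage's `service_url` against an S3 service produces a time-limited presigned URL, while a plain public URL only exists when the object itself is publicly readable. Both variants are sketched here directly against S3 with boto3 (bucket and key names are placeholders), as a point of comparison for what the Rails API should return:

```python
def public_object_url(bucket, key, region):
    # Valid only when a bucket policy or object ACL allows public read;
    # otherwise this URL returns AccessDenied.
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

def presigned_url(s3, bucket, key, expires=3600):
    # Works for private objects; the signature expires after `expires`
    # seconds, which is why service_url-style links eventually stop
    # loading in an <img> tag.
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=expires,
    )
```

So the choice is between returning a short-lived presigned URL per request, or making the attachments public and returning the stable virtual-hosted URL.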

How to deal with large WAR files in AWS?

China☆狼群 submitted 2020-08-07 05:10:45

Question: What have I done so far? I developed a web app using JSP that lets users register, log in, and upload files to AWS S3. I deploy the app to AWS with Elastic Beanstalk by uploading its WAR file. For the login and registration modules I use RDS, and that works fine. The problem: to upload files to S3, I need the AWS SDK JAR and its supporting JARs in the web app. When I finished development and exported the WAR file, it was around 75 MB. So the problem is that if I change…

Copy S3 Bucket including versions

女生的网名这么多〃 submitted 2020-08-06 07:56:10

Question: Is there a way to copy an S3 bucket including the versions of its objects? I read that one way to copy a bucket is with the command-line tool:

    aws s3 sync s3://<source> s3://<dest>

However, in the source bucket I had [screenshot showing version IDs], while in the synced bucket I have [screenshot]. As you can see, the Version ID is "null". Is there a way to make a 100% identical copy, including the version IDs? This would be important for our backups / development server, as our app relies on the version ID. Edit: If I turn on…
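As far as I know, S3 generates version IDs itself and exposes no API to write an object with a chosen version ID, so a copy can reproduce the contents and ordering of a version history but never the IDs themselves (a "null" version ID additionally means versioning was off on the destination when the object landed). A boto3 sketch that at least replays every version oldest-first into a versioning-enabled destination:

```python
def replay_order(versions):
    # list_object_versions returns versions newest-first per key; replaying
    # them oldest-first recreates the same history order in the destination.
    return sorted(versions, key=lambda v: v["LastModified"])

def copy_all_versions(s3, src, dest):
    # Versioning must be on *before* any object is written, or the first
    # copies get the "null" version ID seen in the question.
    s3.put_bucket_versioning(
        Bucket=dest, VersioningConfiguration={"Status": "Enabled"}
    )
    versions = []
    for page in s3.get_paginator("list_object_versions").paginate(Bucket=src):
        versions.extend(page.get("Versions", []))
    for v in replay_order(versions):
        s3.copy_object(
            Bucket=dest,
            Key=v["Key"],
            CopySource={"Bucket": src, "Key": v["Key"],
                        "VersionId": v["VersionId"]},
        )
```

If the app keys off version IDs, storing its own stable identifier (e.g. in object metadata) would survive such a migration, whereas S3-assigned version IDs will not.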
