amazon-s3

How to rename AWS Athena columns with parquet file source?

冷眼眸甩不掉的悲伤 submitted on 2021-01-28 09:28:59
Question: I have data loaded into an S3 bucket folder as multiple parquet files. After loading them into Athena I can query the data successfully. What are the ways to rename the Athena table columns for a parquet file source and still see the data under the renamed columns after querying? Note: I tried the edit-schema option; the column gets renamed, but after querying no data appears under that column.

Answer 1: There is, as far as I know, no way to create a table with different names for the …
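As a sketch of one common workaround (not taken from the truncated answer above; the table, column, database, bucket, and prefix names below are placeholders), the table can be recreated so that Athena maps parquet columns by position rather than by name, which lets columns be renamed without losing the data underneath them:

    import boto3

    athena = boto3.client("athena")

    # Recreate the table with the desired column names, mapping parquet columns
    # by index instead of by name so the renamed columns still line up with
    # the data in the files.
    ddl = """
    CREATE EXTERNAL TABLE my_table_renamed (
      new_name_1 string,
      new_name_2 int
    )
    ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
    WITH SERDEPROPERTIES ('parquet.column.index.access' = 'true')
    STORED AS PARQUET
    LOCATION 's3://my-bucket/my-prefix/'
    """

    athena.start_query_execution(
        QueryString=ddl,
        QueryExecutionContext={"Database": "my_database"},
        ResultConfiguration={"OutputLocation": "s3://my-bucket/athena-results/"},
    )

Index-based mapping only works if the new column list keeps the same order as the columns in the parquet files, so this is a sketch rather than a drop-in fix.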

After deploying, changes take time to reflect

馋奶兔 submitted on 2021-01-28 09:05:53
Question: I have a project built in ReactJS, and I am using S3 and CloudFront for deployment. Whenever I deploy code, it takes a long time for the changes to show up; sometimes I have to manually clear browser cookies to get the latest changes. Do I need to change any S3 or CloudFront settings?

Answer 1: Follow these steps: go to CloudFront and create an invalidation of the objects with the entry /*. Reference: https://aws.amazon.com/premiumsupport/knowledge-center/cloudfront-serving …
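The invalidation can also be created programmatically. A minimal sketch with boto3 (the distribution ID is a placeholder):

    import time
    import boto3

    cloudfront = boto3.client("cloudfront")

    # Invalidate every cached path so the next request is fetched fresh from S3.
    cloudfront.create_invalidation(
        DistributionId="E1234567890ABC",  # placeholder distribution id
        InvalidationBatch={
            "Paths": {"Quantity": 1, "Items": ["/*"]},
            "CallerReference": str(time.time()),  # must be unique per request
        },
    )

A common longer-term alternative is to upload the build with short Cache-Control headers on index.html and long ones on hashed asset files, so only the entry point ever needs invalidating.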

React JS : Upload CSV file To Amazon S3 using AWS credentials

别来无恙 submitted on 2021-01-28 08:30:35
Question: I want to upload a CSV file using react-s3-uploader. I have AWS credentials as well, but I don't know how to use them in React JS. Below is the code I have used:

    import React, { PureComponent } from "react";
    import ReactS3Uploader from "react-s3-uploader";

    saveUploaderReference = uploader => {
      if (uploader) {
        this.uploaderReference = uploader;
      }
    };

    getSignedUrl = (file) => {
      console.log("File : ", file)
    };

    <ReactS3Uploader
      ref={this.saveUploaderReference}
      getSignedUrl={this.getSignedUrl}
      s3path= …
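react-s3-uploader's getSignedUrl callback is meant to fetch a signed URL from your own backend rather than embed AWS credentials in the browser. As a sketch of that server-side piece (the bucket name and key prefix are placeholders, and the surrounding web framework is omitted), boto3 can generate the presigned PUT URL that the component then uploads to:

    import boto3

    s3 = boto3.client("s3")

    def get_upload_url(filename: str, content_type: str = "text/csv") -> str:
        """Return a presigned PUT URL the React client can upload the CSV to."""
        return s3.generate_presigned_url(
            "put_object",
            Params={
                "Bucket": "my-upload-bucket",       # placeholder bucket
                "Key": f"csv-uploads/{filename}",   # placeholder key prefix
                "ContentType": content_type,
            },
            ExpiresIn=300,
        )

The client-side getSignedUrl would call an endpoint backed by this function and hand the returned URL to the uploader's callback.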

How to change s3 bucket policies with cloudformation?

空扰寡人 submitted on 2021-01-28 08:25:11
Question: I would like to change the policies on S3 buckets using CloudFormation. However, when I attempt to do this I get the error:

    2017-12-21 18:49:10 UTC TestBucketpolicyAwsS3Bucketpolicy CREATE_FAILED API: s3:PutBucketPolicy Access Denied

Here is an example of a CloudFormation template that fails with this error:

    {
      "AWSTemplateFormatVersion": "2010-09-09",
      "Description": "",
      "Resources": {
        "TestBucketpolicyAwsS3Bucketpolicy": {
          "Type": "AWS::S3::BucketPolicy",
          "Properties": { …
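The CREATE_FAILED message indicates that the IAM principal running the stack (or the stack's service role) is not allowed to call s3:PutBucketPolicy on the target bucket, so that permission has to be granted to it. As a quick way to confirm the permission outside CloudFormation, a minimal boto3 sketch of the same call (the bucket name and statement are placeholders):

    import json
    import boto3

    s3 = boto3.client("s3")

    # This issues the same s3:PutBucketPolicy call that CloudFormation makes,
    # so for the same principal it fails with the same AccessDenied.
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "AllowPublicRead",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::my-test-bucket/*",  # placeholder bucket
        }],
    }

    s3.put_bucket_policy(Bucket="my-test-bucket", Policy=json.dumps(policy))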

Troubles using Amazon S3 with Europe based buckets in django-storages

爷,独闯天下 submitted on 2021-01-28 07:34:50
Question: This question concerns Amazon S3 buckets in the EU. I have a Django project hosted locally and decided to store media files in an Amazon S3 bucket. For this I use the django-storages app with the following settings in settings.py:

    AWS_ACCESS_KEY_ID = "xxxxxxx"
    AWS_SECRET_ACCESS_KEY = "xxxxxxxxxx"
    AWS_STORAGE_BUCKET_NAME = 'hadcasetest'
    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
    MEDIA_URL = 'http://' + AWS_STORAGE_BUCKET_NAME + '.s3.amazonaws.com/'
    THUMBNAIL_DEFAULT_STORAGE = …
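EU regions opened after 2014, such as Frankfurt (eu-central-1), accept only Signature Version 4 requests, which the old boto-based backend handles poorly. A minimal sketch of the usual settings, assuming the bucket lives in eu-central-1 and the newer boto3-based backend of django-storages is installed:

    # settings.py -- sketch assuming the boto3 backend and an EU bucket
    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
    AWS_STORAGE_BUCKET_NAME = 'hadcasetest'
    AWS_S3_REGION_NAME = 'eu-central-1'    # region of the bucket (assumption)
    AWS_S3_SIGNATURE_VERSION = 's3v4'      # newer EU regions require SigV4
    MEDIA_URL = 'https://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME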

Axios get a file from URL and upload to s3

和自甴很熟 submitted on 2021-01-28 07:19:30
Question: I'm trying to fetch files from a site with axios.get and then upload them directly to S3. However, the files are corrupted or not encoded properly and can't be opened after the upload. File types range from .jpg and .png to .pdf. Here is my code:

    axios.get(URL, {
      responseEncoding: 'binary',
      responseType: 'document',
    }).then((response) => {
      return new Promise((resolve, reject) => {
        const s3Bucket = nconf.get('AWS_S3_BUCKET');
        s3.upload({
          'ACL': 'public-read',
          'Body': response.data,
          'Bucket': …
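The corruption typically comes from the response body being decoded as text or parsed as a document instead of being kept as raw bytes (with axios that usually means requesting an arraybuffer response type). For comparison, a minimal Python sketch of the same flow that keeps the payload binary end to end (the URL, bucket, and key are placeholders):

    import io
    import boto3
    import requests

    s3 = boto3.client("s3")

    # Download the file as raw bytes, then hand those bytes to S3 untouched.
    resp = requests.get("https://example.com/some-file.pdf")
    resp.raise_for_status()

    s3.upload_fileobj(
        io.BytesIO(resp.content),
        "my-bucket",               # placeholder bucket
        "uploads/some-file.pdf",   # placeholder key
        ExtraArgs={
            "ACL": "public-read",
            "ContentType": resp.headers.get("Content-Type", "application/octet-stream"),
        },
    )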

AWS S3 check if file exists based on a conditional path

六月ゝ 毕业季﹏ submitted on 2021-01-28 07:13:56
Question: I would like to check whether a file exists in a separate directory of the bucket if a given file exists. I have the following directory structure: …

    import boto3

    s3 = boto3.resource('s3')

    def file_exists(fileN):
        try:
            s3.Object('my-bucket', 'folder1/folder2/' + fileN).load()
        except:
            return False
        else:
            fileN = fileN.split(".")[0]
            try:
                s3.Object('my-bucket', 'folder1/<randomid folderxxxx>/' + fileN + '_condition.jpg').load()
            except:
                return False
            else:
                return True

    file_exists("test.jpg")

This works but as …
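Because the intermediate folder name is not known in advance, one way to sketch this (not taken from the truncated question above; the bucket and prefix names are placeholders) is to list keys under the common prefix and look for the expected suffix, rather than calling load() on an exact key:

    import boto3

    s3 = boto3.client("s3")

    def conditioned_file_exists(file_name, bucket="my-bucket", prefix="folder1/"):
        """Return True if any key under the prefix ends with <name>_condition.jpg,
        regardless of the random folder id in between (sketch / assumption)."""
        stem = file_name.rsplit(".", 1)[0]
        target_suffix = f"{stem}_condition.jpg"
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
            for obj in page.get("Contents", []):
                if obj["Key"].endswith(target_suffix):
                    return True
        return False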

S3 presigned upload url error

随声附和 submitted on 2021-01-28 07:00:37
Question: I am trying to upload a document using an S3 presigned PUT URL. I generated the URL with the Java AWS SDK (GeneratePresignedUrlRequest.java); the URL-generation code sits in a Lambda function behind AWS API Gateway. However, when I copy the generated URL into Postman and try to perform the upload, I get the following error:

    <Error>
      <Code>AccessDenied</Code>
      <Message>There were headers present in the request which were not signed</Message>
      <HeadersNotSigned>host</HeadersNotSigned>
    …
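A HeadersNotSigned: host response generally means the request that reaches S3 does not match the host covered by the signature, for example because the URL was altered, proxied, or signed for a different endpoint. For comparison, a minimal sketch (using boto3 rather than the Java SDK from the question; the bucket, key, and region are placeholders) that generates a SigV4 presigned PUT URL and uses it unchanged:

    import boto3
    import requests
    from botocore.config import Config

    # SigV4 includes the Host header in the signature, so the upload must go
    # to exactly the URL that was signed, with no rewriting in between.
    s3 = boto3.client(
        "s3",
        region_name="us-east-1",                 # region is an assumption
        config=Config(signature_version="s3v4"),
    )

    url = s3.generate_presigned_url(
        "put_object",
        Params={"Bucket": "my-bucket", "Key": "docs/report.pdf"},  # placeholders
        ExpiresIn=900,
    )

    with open("report.pdf", "rb") as f:
        r = requests.put(url, data=f)
        r.raise_for_status()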

Beam: Failed to serialize and deserialize property 'awsCredentialsProvider'

二次信任 submitted on 2021-01-28 06:37:11
Question: I have been using the Beam pipeline examples as a guide while trying to load files from S3 in my pipeline. As in the examples, I have defined my own PipelineOptions that also extends S3Options, and I am attempting to use the DefaultAWSCredentialsProviderChain. The configuration code is:

    MyPipelineOptions options = PipelineOptionsFactory.fromArgs(args).as(MyPipelineOptions.class);
    options.setAwsCredentialsProvider(new DefaultAWSCredentialsProviderChain());
    options.setAwsRegion("us-east-1") …

Concatenate files on S3

ε祈祈猫儿з submitted on 2021-01-28 05:45:05
Question: We receive a large number of files in one S3 folder (130K files, combined size about 2 GB). Each file contains JSON data, with one or many records per file. I need to merge these files into a single JSON file and store it on S3, without downloading the files to a local machine and combining them there. Is there a way to do this with the AWS SDK for Java?

Answer 1: The simplest way to achieve this would be to use Amazon Athena to read and combine the files. Athena is a managed query service based on Presto that can read …
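As a sketch of the Athena route (using boto3 rather than the Java SDK asked about; the database, table, bucket, and prefix names are placeholders, and a table defined over the source JSON files is assumed to exist already), a CTAS query can read all the small files and write a combined JSON result back to S3:

    import boto3

    athena = boto3.client("athena")

    # Read every record from the table over the small JSON files and write the
    # combined result as JSON under a new S3 prefix.
    query = """
    CREATE TABLE combined_json
    WITH (
      format = 'JSON',
      external_location = 's3://my-bucket/combined-output/'
    ) AS
    SELECT * FROM source_json_table
    """

    athena.start_query_execution(
        QueryString=query,
        QueryExecutionContext={"Database": "my_database"},
        ResultConfiguration={"OutputLocation": "s3://my-bucket/athena-query-results/"},
    )

Note that CTAS may still split its output across a handful of files (one per writer), so a final consolidation step can be needed if a single object is a hard requirement.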