amazon-s3

Unable To Parse .csv in Python

Submitted by 雨燕双飞 on 2020-01-06 04:54:25

Question: I am doing a lab on the website LinuxAcademy.com. The course is Automating AWS with Lambda, Python, and Boto3, and the specific lab I am having trouble with is the lecture "Importing CSV Files into DynamoDB". In this lab we upload a .csv file to S3; an S3 event is generated in a specified bucket, which then kicks off the Lambda function shown below. The function parses the .csv and uploads its contents into DynamoDB. I was originally having issues with line 23: items = read_csv(download…
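The read_csv helper the question refers to is cut off in the excerpt, but a minimal sketch of what such a parser typically looks like is below. The table name in the commented-out usage is hypothetical, not from the lab; `batch_writer` is boto3's real bulk-write helper for DynamoDB tables.

```python
import csv

def read_csv(path):
    """Parse a CSV file with a header row into a list of dicts,
    one dict per row, suitable for writing to DynamoDB."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

# Inside the Lambda handler you would then write the items, e.g.:
# table = boto3.resource("dynamodb").Table("Movies")  # hypothetical table name
# with table.batch_writer() as batch:
#     for item in read_csv(download_path):
#         batch.put_item(Item=item)
```

Note that `csv.DictReader` returns every value as a string; if the table expects numeric attributes, they need converting before the `put_item` call.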

When the underlying data changes, do we need to drop and create the partition in Hive?

Submitted by 余生长醉 on 2020-01-06 04:46:46

Question: Let's say I have a Hive table partitioned by date, with its data stored in S3 as Parquet files. Let's also assume that for a particular partition (date) there were originally 20 records. If I then delete the original files and put new Parquet files with 50 records in the same folder, do I need to drop and recreate that partition for the new data to be reflected? My understanding was that we don't have to recreate partitions, so I tried removing the old data from the respective folder and keeping the…
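For an external table, the metastore only records the partition-to-path mapping; Hive lists the files under that S3 path at query time, so replacing files in place needs no DDL at all (engines that cache metadata, such as Impala or Spark, may need a REFRESH/metadata invalidation). DDL is only needed when a partition's location itself changes. A small sketch of generating those statements, with a hypothetical table name and `dt` partition column:

```python
def partition_ddl(table, date):
    """Build the DDL needed only when a partition's *location* changes;
    swapping files under the existing path requires no DDL."""
    drop = f"ALTER TABLE {table} DROP IF EXISTS PARTITION (dt='{date}')"
    add = f"ALTER TABLE {table} ADD PARTITION (dt='{date}')"
    return drop, add

# 'events' and the 'dt' column are hypothetical names for illustration.
drop_stmt, add_stmt = partition_ddl("events", "2020-01-06")
```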

Accessing a specific key in an S3 bucket using boto3

Submitted by て烟熏妆下的殇ゞ on 2020-01-06 04:41:18

Question: I am trying to access a specific object in my S3 bucket using boto3 in order to delete it. The code below is from the boto3 documentation: https://boto3.readthedocs.io/en/latest/guide/migrations3.html#accessing-a-bucket

    # Boto 3
    for key in bucket.objects.all():
        key.delete()

That works, but I would much rather have a dictionary-style reference than iterate through all objects — that doesn't scale well. Is there a way to grab an object using its key? Edit: I attempted this but it didn't work. Looking through…
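boto3's resource API does allow direct access by key via `Bucket.Object(key)` (no iteration, and no request is made until you act on the object). A sketch, written so the logic can be exercised without AWS credentials — the bucket name in the commented-out real usage is hypothetical:

```python
def delete_object(bucket, key):
    """Delete a single S3 object by key instead of iterating
    bucket.objects.all(). `bucket` is anything exposing boto3's
    Bucket.Object(key) interface."""
    obj = bucket.Object(key)  # lazy: no API call happens here
    obj.delete()              # issues a single DeleteObject request
    return obj.key

# Real usage (requires credentials; 'mybucket' is a hypothetical name):
# import boto3
# bucket = boto3.resource("s3").Bucket("mybucket")
# delete_object(bucket, "path/to/file.csv")
```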

Lambda script to direct to fallback S3 domain subfolder when not found

Submitted by 你说的曾经没有我的故事 on 2020-01-06 04:36:09

Question: As per this question, and this one, the following piece of code allows me to point a subfolder in an S3 bucket to my domain. However, in instances where the subfolder is not found, I get the following error message:

    <Error>
      <Code>AccessDenied</Code>
      <Message>Access Denied</Message>
      <RequestId>2CE9B7837081C817</RequestId>
      <HostId>T3p7mzSYztPhXetUu7GHPiCFN6l6mllZgry+qJWYs+GFOKMjScMmRNUpBQdeqtDcPMN3qSYU/Fk=</HostId>
    </Error>

I would not like it to display this error message; instead, in instances…
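S3 returns 403 AccessDenied (the error shown) rather than 404 when the requester lacks `s3:ListBucket`, so a fallback handler should catch both. One common approach is a Lambda@Edge origin-response function that rewrites the error into a redirect. A minimal sketch, assuming CloudFront sits in front of the bucket; the `/fallback/index.html` path is a hypothetical choice:

```python
def handler(event, context):
    """Lambda@Edge origin-response sketch: turn S3's 403/404 into a
    302 redirect to a fallback page instead of showing the XML error."""
    response = event["Records"][0]["cf"]["response"]
    if response["status"] in ("403", "404"):
        response["status"] = "302"
        response["statusDescription"] = "Found"
        response["headers"]["location"] = [
            {"key": "Location", "value": "/fallback/index.html"}
        ]
        response["body"] = ""  # drop the S3 error XML
    return response
```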

How to convert JSON files stored in S3 to CSV using Glue?

Submitted by 為{幸葍}努か on 2020-01-06 04:31:39

Question: I have some JSON files stored in S3, and I need to convert them, in the folder where they are, to CSV format. Currently I'm using Glue to map them to Athena, but, as I said, now I need to convert them to CSV. Is it possible to use a Glue job to do that? I am trying to understand whether a Glue job can crawl through my S3 folder directories, converting all the JSON files it finds to CSV (as new files). If that's not possible, is there any AWS service that could help me do that? EDIT 1: Here's the current code I'm…
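A Glue Spark job would normally do this by reading a DynamicFrame and writing it back with `format="csv"`, but the per-file conversion itself is plain Python. A sketch of that core step, assuming each file holds a JSON array of flat objects (nested objects would need flattening first):

```python
import csv
import io
import json

def json_records_to_csv(json_text):
    """Convert a JSON array of flat objects into CSV text with a header
    row. A Glue Python-shell job (or a Lambda) could apply this to each
    S3 object and write the result back as a new .csv key."""
    records = json.loads(json_text)
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()
```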

Can't find image in AWS S3 despite successful upload

Submitted by 左心房为你撑大大i on 2020-01-06 03:07:01

Question: I am utilising the ng-file-upload module

    <?xml version="1.0" encoding="UTF-8"?>
    <CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
      <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>GET</AllowedMethod>
        <AllowedMethod>DELETE</AllowedMethod>
        <AllowedMethod>POST</AllowedMethod>
        <AllowedMethod>PUT</AllowedMethod>
        <MaxAgeSeconds>3000</MaxAgeSeconds>
        <AllowedHeader>*</AllowedHeader>
        <AllowedHeader>Authorization</AllowedHeader>
      </CORSRule>
    </CORSConfiguration>

I found similar…

How to upload multipart files to an AWS S3 bucket using the JavaScript SDK in the browser

Submitted by 给你一囗甜甜゛ on 2020-01-06 02:54:09

Question: I have tried using the function

    var bucket = new AWS.S3({params: {Bucket: 'mybucket'}});
    var params = {Key: file.name, ContentType: file.type};
    bucket.createMultipartUpload(params, function (err, data) {
      if (err) console.log(err, err.stack); // an error occurred
      else console.log(data);               // successful response
    });

It gets a successful response containing an UploadId, but I can't find the file in the S3 bucket. So is there any other way to multipart-upload files from the JavaScript SDK from…
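`createMultipartUpload` only *initiates* the upload: the object never appears in the bucket until every part is sent with `uploadPart` and the upload is finished with `completeMultipartUpload` (in boto3 the same three calls exist, though the high-level `upload_fileobj` handles all of this automatically). To stay with the Python used elsewhere on this page, here is a sketch of the part-splitting step that precedes those calls; S3 requires every part except the last to be at least 5 MiB:

```python
def split_into_parts(data, part_size=5 * 1024 * 1024):
    """Split a byte string into numbered multipart-upload parts.
    After uploading each part you must record its ETag and pass the
    full (PartNumber, ETag) list to complete_multipart_upload --
    skipping that final call is why the object never shows up."""
    return [
        (i // part_size + 1, data[i:i + part_size])
        for i in range(0, len(data), part_size)
    ]
```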

Uploading image to S3 using PhoneGap, how to?

Submitted by 拥有回忆 on 2020-01-05 23:29:26

Question: I'm trying to get an image onto S3 but not quite succeeding... Here is my work so far:

    $cordovaCamera.getPicture(options).then(function(imageURI) {
      // imageURI will be something like: file:///some_path
      // get the base64 data from the image
      var img = Utils.encodeImageUri(imageURI);
      // get the base64 from a new image, not sure if this is needed
      var image = new Image();
      image.src = img;
      Utils.uploadToS3(image.src);
    }, function(err) {})
    ...
    // boilerplate function to create a Blob…
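The usual pitfall here is uploading the base64 *string* (or a data URI with its `data:image/...;base64,` prefix still attached) instead of decoded binary, which produces a corrupt object in S3. The decode step the Blob-building boilerplate performs looks like this in Python; the function name is hypothetical:

```python
import base64

def data_uri_to_bytes(data_uri):
    """Strip an optional 'data:image/...;base64,' prefix and decode the
    payload to raw bytes -- the binary body that should actually be
    uploaded to S3 (with the matching Content-Type header)."""
    _, _, payload = data_uri.partition("base64,")
    return base64.b64decode(payload or data_uri)
```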
