amazon-s3

`An error occurred (InvalidToken) when calling the ListBuckets operation: The provided token is malformed or otherwise invalid.` with `aws s3 ls`

梦想的初衷 submitted on 2020-01-04 01:56:37
Question: I successfully authenticate with two-factor, but when using `aws s3 ls` I keep getting `An error occurred (InvalidToken) when calling the ListBuckets operation: The provided token is malformed or otherwise invalid.` And I do have admin rights.

Answer 1: The issue was that I wasn't passing the --region in, e.g. `aws s3 --region us-gov-west-1 ls`. I suppose this could be set with an environment variable too. That error message is a candidate for improvement.

Answer 2: Run `aws configure`: 1. you may leave access key and access
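A minimal sketch of the same fix expressed with boto3 rather than the CLI, assuming the temporary MFA credentials are already configured; the region value is the one from the answer:

```python
# Sketch only: the boto3 equivalent of `aws s3 --region us-gov-west-1 ls`.
# Without an explicit region, GovCloud session tokens sent to the default
# (commercial-partition) endpoint can be rejected as InvalidToken.
import boto3

s3 = boto3.client("s3", region_name="us-gov-west-1")
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])
```

Setting AWS_DEFAULT_REGION in the environment has the same effect, matching the answer's remark about an ENV variable.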

Data Structure Behind Amazon S3's Keys (Filtering Data Structure)

此生再无相见时 submitted on 2020-01-04 01:44:49
Question: I'd like to implement a data structure similar to the lookup functionality of Amazon S3. For context, Amazon S3 stores all files in a flat namespace, but lets you look up groups of files by common prefixes in their names, thereby replicating the power of a directory tree without its complexity. The catch is that both lookup and filter operations are O(1) (or close enough that even on very large buckets - S3's disk equivalents - both operations might as well be O(1)). So in short, I
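S3's internal index isn't public, and true O(1) prefix filtering is a strong requirement; purely as an illustrative starting point, a sorted flat key list already gives directory-like prefix queries in O(log n) plus output size (the class and names below are mine, not from the thread):

```python
# Illustrative sketch: flat namespace with prefix filtering via binary search.
# Lookup of the range start is O(log n); emitting k matches is O(k).
import bisect

class PrefixIndex:
    def __init__(self, keys):
        self.keys = sorted(keys)  # keep the flat namespace in sorted order

    def with_prefix(self, prefix):
        out = []
        i = bisect.bisect_left(self.keys, prefix)  # first key >= prefix
        while i < len(self.keys) and self.keys[i].startswith(prefix):
            out.append(self.keys[i])
            i += 1
        return out

idx = PrefixIndex(["photos/2019/a.jpg", "photos/2020/b.jpg", "logs/x.txt"])
print(idx.with_prefix("photos/"))  # ['photos/2019/a.jpg', 'photos/2020/b.jpg']
```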

PDFs in Amazon S3 don't open for viewing in Chrome

走远了吗. submitted on 2020-01-04 01:26:30
Question: I have a website developed in PHP and hosted on an Amazon server. PDF files uploaded to the server do not open for viewing in Chrome, although they do open for viewing in another browser (Internet Explorer); in Chrome the file is downloaded instead. I want the PDF to open for viewing. The code for the link is `<a href="<?php echo $filename;?>" target="_blank"><?php echo $data['File_Label'];?></a>` URL: Please click here. But the file below does open for viewing in Chrome. Please check this.

Answer 1:
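Answer 1 is cut off above, but a common cause of this exact symptom is the object being stored without a PDF Content-Type, which makes Chrome fall back to downloading. A hedged sketch of uploading with explicit headers, using boto3 purely for illustration (the site itself is PHP, and the bucket, key, and file names are placeholders):

```python
# Sketch: store the object with headers that ask browsers to render it inline.
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    "report.pdf",
    "my-bucket",
    "docs/report.pdf",
    ExtraArgs={
        "ContentType": "application/pdf",       # without this, many uploads
                                                 # default to binary/octet-stream
        "ContentDisposition": "inline",          # render in the tab, don't save
    },
)
```

Objects already stored with the wrong headers can be repaired in place with a self-copy using `MetadataDirective="REPLACE"`.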

amazon aws-s3 access denied error

帅比萌擦擦* submitted on 2020-01-03 20:07:32
Question: I wanted to integrate the Amazon S3 service with my Rails application. I am using the paperclip (2.3.6) gem and the aws-s3 (0.6.2) gem for this, but when a user uploads a file it throws an Access Denied error. I am able to put and get the file from the Rails console using the same credentials.

Answer 1: Did you set up the ACL permissions? http://www.bucketexplorer.com/ is a good tool for that. There are several S3 clients for all OSes.

Source: https://stackoverflow.com/questions/4322187/amazon-aws-s3
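The answer points at ACLs; as a quick diagnostic, an uploaded object's ACL can be inspected with the same credentials the app uses. This sketch uses boto3 purely for illustration (the question itself concerns the Ruby paperclip/aws-s3 gems, and the bucket and key are placeholders):

```python
# Sketch: list who holds which permission on a given object.
import boto3

s3 = boto3.client("s3")
acl = s3.get_object_acl(Bucket="my-bucket", Key="uploads/file.png")
for grant in acl["Grants"]:
    grantee = grant["Grantee"].get("DisplayName", grant["Grantee"]["Type"])
    print(grantee, "->", grant["Permission"])
```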

Error when loading an Angular app with AOT build on Amazon S3

自古美人都是妖i submitted on 2020-01-03 17:04:17
Question: I have hosted my Angular app, compiled with the AOT compiler, on Amazon S3. However, when I try to load the app it gives me the following error. The app runs perfectly well with `ng serve --aot`, but the same error comes up if I serve the distribution locally with a Python http-server. Update: I have tried different workarounds and ways to get it working, but I haven't been able to fix it. I suspect that the error is thrown when doing a production build with the AOT compiler. When

s3fs gzip compression on pandas dataframe

梦想与她 submitted on 2020-01-03 16:48:12
Question: I'm trying to write a dataframe as a CSV file on S3 using the s3fs library and pandas. Despite the documentation, I'm afraid the gzip compression parameter isn't working with s3fs.

    def DfTos3Csv(df, file):
        with fs.open(file, 'wb') as f:
            df.to_csv(f, compression='gzip', index=False)

This code saves the dataframe as a new object in S3, but as a plain CSV, not in gzip format. On the other hand, the read functionality is working OK using this compression parameter:

    def s3CsvToDf(file):
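In the pandas versions current when this was asked, `compression=` was only honored when `to_csv` received a path, not an open file object, which would explain the plain CSV. A sketch of compressing explicitly before writing through s3fs (the function name is mine):

```python
# Sketch: serialize the frame first, gzip it ourselves, then write the bytes,
# so the object on S3 is genuinely gzipped regardless of how to_csv treats
# file handles.
import gzip

import pandas as pd
import s3fs

fs = s3fs.S3FileSystem()

def df_to_s3_csv_gz(df: pd.DataFrame, path: str) -> None:
    data = gzip.compress(df.to_csv(index=False).encode("utf-8"))
    with fs.open(path, "wb") as f:
        f.write(data)
```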

How to get the first 100 lines of a file on S3?

|▌冷眼眸甩不掉的悲伤 submitted on 2020-01-03 16:44:42
Question: I have a huge (~6 GB) file on Amazon S3 and want to get the first 100 lines of it without having to download the whole thing. Is this possible? Here's what I'm doing now: `aws s3 cp s3://foo/bar - | head -n 100`. But this takes a while to execute. I'm confused -- shouldn't head close the pipe once it's read enough lines, causing aws s3 cp to crash with a BrokenPipeError before it has time to download the entire file?

Answer 1: Using the Range HTTP header in a GET request, you can retrieve a specific range
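A sketch of the answer's Range approach with boto3, reusing s3://foo/bar from the question. The 64 KB window is an assumption (enough for 100 lines of ordinary length), and the last line in the window may be cut mid-way:

```python
# Sketch: fetch only the first 64 KB of the object instead of all ~6 GB,
# then split into lines locally.
import boto3

s3 = boto3.client("s3")
resp = s3.get_object(Bucket="foo", Key="bar", Range="bytes=0-65535")
lines = resp["Body"].read().decode("utf-8", errors="replace").splitlines()
print("\n".join(lines[:100]))
```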

Amazon S3 Write Only access

无人久伴 submitted on 2020-01-03 15:58:23
Question: I'm backing up files from several customers directly into an Amazon S3 bucket, each customer to a different folder. I'm using a simple .NET client running under a Windows task once a night. To allow writing to the bucket, my client requires both the AWS access key and the secret key (I created a new pair). My problem is: how do I make sure none of my customers could potentially use the pair to peek into the bucket, and into a folder not his own? Can I create a "write only" access pair? Am I
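The answers are not included above, but one common pattern for this requirement is a per-customer IAM identity whose policy allows only s3:PutObject under that customer's prefix, so the distributed credentials can write but not list or read. A sketch with boto3; the user, policy, bucket, and prefix names are all placeholders:

```python
# Sketch: attach a write-only inline policy to one customer's IAM user.
import json

import boto3

iam = boto3.client("iam")
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "s3:PutObject",                        # write only: no List/Get
        "Resource": "arn:aws:s3:::backup-bucket/customer-a/*",
    }],
}
iam.put_user_policy(
    UserName="customer-a",
    PolicyName="write-only-backups",
    PolicyDocument=json.dumps(policy),
)
```

Each customer then gets the access key pair of their own IAM user, so a leaked pair exposes nothing outside that customer's prefix.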

CreateMultipartUpload operation - AWS policy items needed?

南笙酒味 submitted on 2020-01-03 15:34:13
Question: I'm doing a multipart upload via the AWS CLI but getting this error: `A client error (AccessDenied) occurred when calling the CreateMultipartUpload operation: Access Denied`. Below is my policy; am I missing something in there? Thanks.

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:ListAllMyBuckets"],
                "Resource": "arn:aws:s3:::*"
            },
            {
                "Effect": "Allow",
                "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
                "Resource": "arn:aws:s3:::mybucket"
            },
            {
                "Effect":
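The policy is cut off above, so the missing piece may simply not be shown; still, CreateMultipartUpload is authorized as s3:PutObject on the object ARN, and the visible statements only cover bucket-level resources. A sketch of the object-level statement that is typically required ("mybucket" as in the question, expressed as a Python dict for consistency with the other examples):

```python
# Hypothetical statement to append to the policy's "Statement" list:
# multipart uploads need object-level permissions on arn:aws:s3:::mybucket/*.
object_statement = {
    "Effect": "Allow",
    "Action": [
        "s3:PutObject",                 # covers CreateMultipartUpload/UploadPart
        "s3:AbortMultipartUpload",      # lets the CLI clean up failed uploads
        "s3:ListMultipartUploadParts",  # needed to resume or complete uploads
    ],
    "Resource": "arn:aws:s3:::mybucket/*",
}
```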