amazon-glacier

AWS: Ways of keeping cost down while backing up S3 files to Glacier? [closed]

Posted by ↘锁芯ラ on 2020-01-01 02:29:33

Question: As part of our project, we have created quite a bushy folder/file tree on S3, with all the files taking up about 6 TB of data. We currently have no backup of this data, which is bad, so we want to do periodic backups. Glacier seems like the way to go. The question is: what are the ways to keep the total cost of a
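
A common cost lever for this kind of backup is to keep a second copy in a backup bucket (or rely on versioning) and let S3 itself transition it to the Glacier storage class with a lifecycle rule, so nothing is re-uploaded through a separate Glacier vault. A minimal boto3 sketch, assuming a hypothetical bucket, prefix, and transition age:

    import boto3

    s3 = boto3.client("s3")

    # Hypothetical backup bucket and prefix; the 1-day transition is illustrative.
    s3.put_bucket_lifecycle_configuration(
        Bucket="my-backup-bucket",
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "archive-to-glacier",
                    "Status": "Enabled",
                    "Filter": {"Prefix": "backups/"},
                    # Move objects to the Glacier storage class shortly after they arrive.
                    "Transitions": [{"Days": 1, "StorageClass": "GLACIER"}],
                }
            ]
        },
    )

Transitions are billed per object and Glacier adds per-object metadata overhead, which is why a bushy tree of many small files tends to cost more to archive than a few large bundles.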

Is it possible to move EC2 volumes to Amazon Glacier without having to download and upload it?

Posted by 别说谁变了你拦得住时间么 on 2019-12-29 06:48:29

Question: I am trying to reduce the costs of my AWS system. I thought of moving some volumes I rarely use to Amazon Glacier, but I can't find any way to do that within AWS other than downloading the volume and uploading it to Glacier, which sounds terrible. I am wondering, is it possible to do this automatically? Can I assign some EC2 volumes to Glacier directly from the EC2 console? Thanks

Answer 1: EBS volumes cannot be copied or migrated to Amazon Glacier. EBS snapshots, even though they are stored in Amazon S3, also
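
If the route has to be manual, one option is to image the volume yourself (for example with dd from an instance the volume is attached to) and store the image directly in the Glacier storage class in S3. A hedged boto3 sketch; file, bucket, and key names are hypothetical:

    import boto3

    s3 = boto3.client("s3")

    # Hypothetical paths: a dd image of the rarely used volume and a destination bucket.
    s3.upload_file(
        Filename="/mnt/backup/vol-0123456789abcdef0.img",
        Bucket="my-ebs-archives",
        Key="images/vol-0123456789abcdef0.img",
        # Write the object straight into the Glacier storage class.
        ExtraArgs={"StorageClass": "GLACIER"},
    )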

Asynchronous Amazon Glacier download

Posted by ぃ、小莉子 on 2019-12-25 06:55:20

Question: I want to asynchronously download multiple Glacier files using a thread pool. My current approach uses the high-level API for Glacier downloads, but every thread waits at the download method until the download job is complete. Below is the code where all the threads are waiting:

    ArchiveTransferManager manager = new ArchiveTransferManager(Amazon.RegionEndpoint.USEast1);
    DownloadOptions options = new DownloadOptions();
    manager.Download(vaultName, archiveId, downloadFilePath, options);

Can somebody please
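
Because the high-level Download call blocks until the retrieval job (typically hours) has finished, putting it on a thread pool only moves the waiting into the pool. One way around this is to drop to the lower-level job API: start all retrieval jobs up front and fetch each job's output only once it reports Completed, so no thread sits inside a download call. The question's code is C#; the following is only a rough Python analogue using boto3, with placeholder vault name, archive IDs, and output paths:

    import time
    import boto3

    glacier = boto3.client("glacier")
    vault_name = "my-vault"  # hypothetical vault
    archives = {             # placeholder archive IDs mapped to output paths
        "archive-id-1": "/tmp/file1.bin",
        "archive-id-2": "/tmp/file2.bin",
    }

    # Kick off one retrieval job per archive; initiate_job returns immediately.
    jobs = {}
    for archive_id in archives:
        resp = glacier.initiate_job(
            accountId="-",  # "-" means the account owning the credentials
            vaultName=vault_name,
            jobParameters={"Type": "archive-retrieval", "ArchiveId": archive_id},
        )
        jobs[resp["jobId"]] = archive_id

    # Poll until each job completes, then stream its output to disk.
    while jobs:
        for job_id, archive_id in list(jobs.items()):
            status = glacier.describe_job(accountId="-", vaultName=vault_name, jobId=job_id)
            if status["Completed"]:
                output = glacier.get_job_output(accountId="-", vaultName=vault_name, jobId=job_id)
                with open(archives[archive_id], "wb") as f:
                    f.write(output["body"].read())
                del jobs[job_id]
        if jobs:
            time.sleep(600)  # retrievals take hours; poll sparingly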

How to calculate sha256 for large files in PHP

Posted by 假如想象 on 2019-12-25 05:23:25

Question: I would like to ask for your assistance on how to calculate the sha256 of large files in PHP. I currently use Amazon Glacier to store old files and use its API to upload the archives. At first I only dealt with small files, nowhere near MB-sized images. When I tried to upload more than 1 MB, the API response said that the checksum I gave them is different from what they had calculated. Here is my code to upload the file:

    //get the sha256 using the file path
    $image = //image path;
    $sha256 =
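
A likely cause: Glacier does not verify uploads against a plain SHA-256 of the whole payload but against a SHA-256 tree hash built from 1 MiB chunks, and the two only coincide for files of 1 MiB or less, which matches the behaviour described. The PHP fix follows the same chunk-and-combine shape as this hedged Python sketch:

    import hashlib

    def glacier_tree_hash(path, chunk_size=1024 * 1024):
        # Hash every 1 MiB chunk of the file.
        hashes = []
        with open(path, "rb") as f:
            while True:
                chunk = f.read(chunk_size)
                if not chunk:
                    break
                hashes.append(hashlib.sha256(chunk).digest())
        if not hashes:
            return hashlib.sha256(b"").hexdigest()
        # Combine adjacent pairs level by level until a single root hash remains.
        while len(hashes) > 1:
            paired = []
            for i in range(0, len(hashes), 2):
                if i + 1 < len(hashes):
                    paired.append(hashlib.sha256(hashes[i] + hashes[i + 1]).digest())
                else:
                    paired.append(hashes[i])
            hashes = paired
        return hashes[0].hex()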

Export RDS data to S3/Glacier

Posted by 蓝咒 on 2019-12-24 12:00:46

Question: I want to export data from Oracle RDS to S3 and then move it to Glacier. My end goal is to back up the data stored in RDS to S3 and Glacier to meet compliance requirements. Could anyone please suggest the best approach to achieve this?

Answer 1: RDS snapshots are stored in S3, but you are not able to download them or set policies on them that would back them up to Glacier. So you will have to do this manually. Set up an instance with enough disk space to store a dump of your database.
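
Building on that answer, the dump produced on the instance can then be pushed to S3 with a multipart upload and left for a lifecycle rule (or the GLACIER storage class) to archive. A hedged boto3 sketch; the dump path, bucket, and part sizes are placeholder assumptions:

    import boto3
    from boto3.s3.transfer import TransferConfig

    s3 = boto3.client("s3")

    # Multipart settings for a large database dump; the 64 MiB thresholds are illustrative.
    config = TransferConfig(multipart_threshold=64 * 1024 * 1024,
                            multipart_chunksize=64 * 1024 * 1024)

    s3.upload_file(
        Filename="/backup/expdp_full.dmp",  # dump created on the EC2 instance
        Bucket="my-rds-exports",            # hypothetical bucket with a Glacier lifecycle rule
        Key="oracle/expdp_full.dmp",
        Config=config,
    )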

How to move object from Amazon S3 to Glacier with Vault Locked enabled?

Posted by 大兔子大兔子 on 2019-12-24 10:54:18

Question: I'm looking for a solution for moving Amazon S3 objects to Glacier with Vault Lock enabled (as described at https://aws.amazon.com/blogs/aws/glacier-vault-lock/). I'd like to use Amazon's built-in tools for that (lifecycle management or similar) if possible, but I cannot find any instructions or options for doing so. S3 seems to only allow moving objects to the Glacier storage class, but that neither provides data integrity nor defends against data loss. I know I could do it with a program. It
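
Since the lifecycle transition keeps the data as an S3 object rather than placing it in an actual Glacier vault, Vault Lock never applies to it; getting a locked vault appears to require copying each object into that vault yourself. A hedged boto3 sketch of such a copy, assuming hypothetical bucket, key, and vault names and a vault that already carries a Vault Lock policy:

    import boto3

    s3 = boto3.client("s3")
    glacier = boto3.client("glacier")

    # Read the S3 object and re-upload it as an archive in the locked vault.
    obj = s3.get_object(Bucket="my-bucket", Key="important/report.pdf")

    response = glacier.upload_archive(
        vaultName="locked-vault",
        accountId="-",  # "-" means the account owning the credentials
        archiveDescription="important/report.pdf",
        body=obj["Body"].read(),
    )
    print("archive id:", response["archiveId"])

Reading the whole object into memory, as above, only suits modestly sized objects; larger ones would need Glacier's multipart upload API.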

PHP AWS api raw request to PUT bucket lifecycle

Posted by 可紊 on 2019-12-24 08:57:53

Question: I am building a website with a feature where, if a user deletes an image/video, it is archived instead. I am using AWS S3 for storage, and on delete I want to move the object to Glacier. I don't want to use the AWS SDK, so I am creating raw requests with PHP cURL. Following this link I tried to PUT a bucket lifecycle on an object, http://docs.aws.amazon.com/AmazonS3/latest/API/RESTBucketPUTlifecycle.html , and wrote some code, but it gives me a signature mismatch error: SignatureDoesNotMatch - The request signature
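
SignatureDoesNotMatch generally means the canonical request or the derived signing key differs from what S3 computes on its side (and note that PUT Bucket lifecycle also requires a Content-MD5 header). The question's code is PHP, so the following is only a Python sketch of the Signature Version 4 key-derivation and signing step that a hand-rolled cURL client has to reproduce byte for byte:

    import hashlib
    import hmac

    def _hmac(key, msg):
        return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()

    def sigv4_signature(secret_key, date_stamp, region, string_to_sign):
        # Derive the signing key exactly as Signature Version 4 specifies:
        # kSecret -> kDate -> kRegion -> kService ("s3") -> kSigning,
        # then HMAC the string-to-sign with that key.
        k_date = _hmac(("AWS4" + secret_key).encode("utf-8"), date_stamp)  # e.g. "20191224"
        k_region = _hmac(k_date, region)                                   # e.g. "us-east-1"
        k_service = _hmac(k_region, "s3")
        k_signing = _hmac(k_service, "aws4_request")
        return hmac.new(k_signing, string_to_sign.encode("utf-8"),
                        hashlib.sha256).hexdigest()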

Boto Glacier - Upload file larger than 4 GB using multipart upload

Posted by 不羁岁月 on 2019-12-22 11:13:31

Question: I am periodically uploading a file to AWS Glacier using boto as follows:

    # Import boto's layer2
    import boto.glacier.layer2

    # Create a Layer2 object to connect to Glacier
    l = boto.glacier.layer2.Layer2(aws_access_key_id=awsAccess, aws_secret_access_key=awsSecret)

    # Get a vault based on vault name (assuming you created it already)
    v = l.get_vault(vaultName)

    # Create an archive from a local file on the vault
    archiveID = v.create_archive_from_file(fileName)

However, this fails for files that are
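
The single-call upload_archive path is capped at 4 GB, so larger archives have to go through Glacier's multipart upload API (boto also ships concurrent-upload helpers for this). A hedged sketch using the boto3 Glacier client rather than legacy boto; the vault name, file name, and part size are assumptions:

    import io
    import boto3
    from botocore.utils import calculate_tree_hash

    glacier = boto3.client("glacier")

    vault_name = "my-vault"        # hypothetical vault
    file_name = "backup.tar"       # archive larger than 4 GB
    part_size = 128 * 1024 * 1024  # must be a power-of-two multiple of 1 MiB

    upload = glacier.initiate_multipart_upload(accountId="-", vaultName=vault_name,
                                               partSize=str(part_size))

    # Send the file one part at a time, each with its own tree hash.
    total = 0
    with open(file_name, "rb") as f:
        while True:
            part = f.read(part_size)
            if not part:
                break
            glacier.upload_multipart_part(
                accountId="-",
                vaultName=vault_name,
                uploadId=upload["uploadId"],
                checksum=calculate_tree_hash(io.BytesIO(part)),
                range="bytes {}-{}/*".format(total, total + len(part) - 1),
                body=part,
            )
            total += len(part)

    # Completing the upload requires the tree hash of the whole archive.
    with open(file_name, "rb") as f:
        checksum = calculate_tree_hash(f)

    glacier.complete_multipart_upload(accountId="-", vaultName=vault_name,
                                      uploadId=upload["uploadId"],
                                      archiveSize=str(total), checksum=checksum)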

Amazon AWS Athena S3 and Glacier Mixed Bucket

Posted by 百般思念 on 2019-12-20 09:43:22

Question: Amazon Athena log analysis with S3 Glacier. We have petabytes of data in S3. We are https://www.pubnub.com/ and we store our network's usage data in S3 for billing purposes. We have tab-delimited log files stored in an S3 bucket, and Athena is giving us a HIVE_CURSOR_ERROR failure. Our S3 bucket is set up to automatically push objects to AWS Glacier after 6 months, so the bucket has hot S3 files ready to read in addition to the Glacier backup files. We are getting access errors from Athena
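
Athena cannot query objects that have transitioned to the Glacier storage class, so a table whose LOCATION mixes hot objects and Glacier-archived ones will surface errors like this. One hedged workaround is to restore (or relocate) the archived objects under that prefix before querying; a boto3 sketch with a placeholder bucket and prefix:

    import boto3

    s3 = boto3.client("s3")
    bucket = "my-log-bucket"  # hypothetical bucket backing the Athena table
    prefix = "logs/2017/"     # the table's LOCATION prefix

    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            if obj.get("StorageClass") == "GLACIER":
                # Ask S3 to restore a temporary readable copy of the archived object.
                s3.restore_object(
                    Bucket=bucket,
                    Key=obj["Key"],
                    RestoreRequest={"Days": 7,
                                    "GlacierJobParameters": {"Tier": "Bulk"}},
                )

Keeping Glacier-bound data under a different prefix than the Athena table's LOCATION avoids the problem entirely.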