AWS S3: how do I see how much disk space is in use?

被撕碎了的回忆 2020-12-12 10:33

I have an AWS account. I'm using S3 to store backups from different servers. Is there any information in the AWS console about how much disk space is in use in my S3 buckets?

18 Answers
  • 2020-12-12 10:46

    I use Cloud Turtle to get the size of individual buckets. If a bucket's size exceeds 100 GB, it takes some time to display. Cloud Turtle is freeware.

  • 2020-12-12 10:47

    The AWS CLI now supports the --query parameter, which takes a JMESPath expression.

    This means you can sum the size values returned by list-objects using sum(Contents[].Size) and count the objects with length(Contents[]).

    This can be run using the official AWS CLI as below; --query support was introduced in February 2014:

     aws s3api list-objects --bucket BUCKETNAME --output json --query "[sum(Contents[].Size), length(Contents[])]"
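
    The query returns raw bytes; a minimal follow-up sketch (BUCKETNAME is again a placeholder) converts the sum to GB in the shell:

     aws s3api list-objects --bucket BUCKETNAME --output text \
         --query "sum(Contents[].Size)" | awk '{printf "%.2f GB\n", $1/1024/1024/1024}'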
    
  • 2020-12-12 10:51

    I'm not sure when this was added to the AWS CLI, given that the original question is from three years ago, but the command-line tool gives a nice summary by running:

    aws s3 ls s3://mybucket --recursive --human-readable --summarize
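
    Illustrative output (the object listing and totals below are invented for the example; the last two summary lines are what answer the question):

     2019-04-07 11:38:20    2.1 MiB backups/server1/db.sql.gz
     ...
     Total Objects: 7235
        Total Size: 25.9 GiB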
    
  • 2020-12-12 10:51

    CloudWatch also lets you create metrics for your S3 bucket. It exposes metrics for bucket size and object count. Go to Services > Management Tools > CloudWatch, pick the region where your S3 bucket lives, and the size and object-count metrics will be among those available.
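
    The same metrics can be read from the CLI. A minimal sketch (BUCKETNAME and the time window are placeholders; S3 reports BucketSizeBytes to CloudWatch once a day per storage class):

     aws cloudwatch get-metric-statistics \
         --namespace AWS/S3 \
         --metric-name BucketSizeBytes \
         --dimensions Name=BucketName,Value=BUCKETNAME \
                      Name=StorageType,Value=StandardStorage \
         --start-time 2020-12-10T00:00:00Z --end-time 2020-12-12T00:00:00Z \
         --period 86400 --statistics Average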

  • 2020-12-12 10:51

    Getting the size of large buckets via the API (either aws cli or s4cmd) is quite slow. Here's how to parse the S3 Usage Report with a bash one-liner:

    # assumes column 7 is TimedStorage-ByteHrs: /24 averages over the day, /(1024**3) converts bytes to GB
    cat report.csv | awk -F, '{printf "%.2f GB %s %s\n", $7/(1024**3)/24, $4, $2}' | sort -n
    
  • 2020-12-12 10:54

    Based on @cudds's answer:

    function s3size()
    {
        # Sum object sizes (third column of `aws s3 ls --recursive`) for each bucket/prefix given
        for path in "$@"; do
            size=$(aws s3 ls "s3://$path" --recursive | grep -v -E "(Bucket: |Prefix: |LastWriteTime|^$|--)" | awk 'BEGIN {total=0}{total+=$3}END{printf "%.2fGb\n", (total/1024/1024/1024)}')
            echo "[s3://$path]=[$size]"
        done
    }
    
    ...
    
    $ s3size bucket-a bucket-b/dir
    [s3://bucket-a]=[24.04Gb]
    [s3://bucket-b/dir]=[26.69Gb]
    

    Also, Cyberduck can conveniently calculate the size of a bucket or a folder.
