AWS S3: how do I see how much disk space is in use?

被撕碎了的回忆 2020-12-12 10:33

I have an AWS account. I'm using S3 to store backups from different servers. The question: is there any information in the AWS console about how much disk space is in use in my S3 cloud?

18 Answers
  • 2020-12-12 10:42

    In addition to Christopher's answer.

    If you need to count the total size of a versioned bucket, use:

    aws s3api list-object-versions --bucket BUCKETNAME --output json --query "[sum(Versions[].Size)]"
    

    It counts both the latest and archived (noncurrent) versions.
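
    One caveat worth adding: if the bucket has no object versions at all, JMESPath's sum() is applied to null and the command errors out. A defensive variant (a sketch; BUCKETNAME is a placeholder, and the fallback expression is my own addition, not from the original answer), with an awk step to convert the byte count to MiB:

```shell
# BUCKETNAME is a placeholder. `Versions[].Size || `[0]`` falls back to a
# one-element zero list when the bucket has no versions, so sum() receives
# a valid array instead of null.
bytes=$(aws s3api list-object-versions --bucket BUCKETNAME \
          --query 'sum(Versions[].Size || `[0]`)' --output text)

# Convert the raw byte count to MiB for readability.
awk -v b="$bytes" 'BEGIN { printf "%.2f MiB\n", b / 1024 / 1024 }'
```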

  • 2020-12-12 10:43

    You asked: is there any information in the AWS console about how much disk space is in use on my S3 cloud?

    I go to the Billing Dashboard and check the S3 usage in the current bill.

    They give you the information month to date (MTD), in GB to 6 decimal points, IOW, to the KB level.

    It's broken down by region, but adding the regions up (assuming you use more than one) is easy enough.

    BTW: you may need specific IAM permissions to get to the Billing information.

  • 2020-12-12 10:44

    Yippee! An update to the AWS CLI allows you to recursively ls through buckets:

    aws s3 ls s3://<bucketname> --recursive  | grep -v -E "(Bucket: |Prefix: |LastWriteTime|^$|--)" | awk 'BEGIN {total=0}{total+=$3}END{print total/1024/1024" MB"}'
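
    If you want to sanity-check the awk stage of that pipeline, you can feed it synthetic aws s3 ls-style lines locally (the file names and sizes below are made up for illustration):

```shell
# Two fake objects, 1 MiB and 2 MiB; the third column is the size in bytes,
# which is exactly where `aws s3 ls` puts it.
printf '2020-12-12 10:44:00    1048576 backup-a.tar\n2020-12-12 10:44:01    2097152 backup-b.tar\n' \
  | awk 'BEGIN {total=0}{total+=$3}END{print total/1024/1024" MB"}'
# → 3 MB
```

    Recent CLI versions can also produce a total natively with aws s3 ls s3://<bucketname> --recursive --human-readable --summarize, which prints "Total Objects:" and "Total Size:" lines at the end.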
    
  • 2020-12-12 10:44

    To find out the size of an S3 bucket using the AWS Console:

    1. Click the S3 bucket name
    2. Select the "Management" tab
    3. Click the "Metrics" navigation button
    4. By default you should see the Storage metric of the bucket

    Hope this helps.
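
    The console chart above is backed by the CloudWatch BucketSizeBytes metric, which you can also pull from the CLI. A sketch, assuming a bucket named my-bucket holding Standard-class objects (both are placeholders) and GNU date; the metric is reported once per day, so query a window of a couple of days:

```shell
# BucketSizeBytes is a daily CloudWatch metric in the AWS/S3 namespace.
# my-bucket and StandardStorage are assumptions; adjust the bucket name
# and storage class for your setup.
aws cloudwatch get-metric-statistics \
  --namespace AWS/S3 \
  --metric-name BucketSizeBytes \
  --dimensions Name=BucketName,Value=my-bucket Name=StorageType,Value=StandardStorage \
  --start-time "$(date -u -d '2 days ago' +%Y-%m-%dT%H:%M:%S)" \
  --end-time "$(date -u +%Y-%m-%dT%H:%M:%S)" \
  --period 86400 \
  --statistics Average
```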

  • 2020-12-12 10:44

    On a Linux box that has Python (with the pip installer), grep and awk, install the AWS CLI (command-line tools for EC2, S3 and many other services):

    sudo pip install awscli
    

    then create a .awssecret file in your home folder with content as below (adjust the key, secret and region as needed):

    [default]
    aws_access_key_id=<YOUR_KEY_HERE>
    aws_secret_access_key=<YOUR_SECRET_KEY_HERE>
    region=<AWS_REGION>
    

    Make this file readable and writable by your user only (no sudo needed, since it lives in your home folder):

    chmod 600 ~/.awssecret
    

    and export its path to your environment:

    export AWS_CONFIG_FILE=/home/<your_name>/.awssecret
    

    then run this in the terminal (it is a single command, split across lines with \ for readability):

    aws s3 ls s3://<bucket_name>/foo/bar | \
    grep -v -E "(Bucket: |Prefix: |LastWriteTime|^$|--)" | \
    awk 'BEGIN {total=0}{total+=$3}END{print total/1024/1024" MB"}'
    
    • the aws part lists the bucket (or optionally a 'sub-folder')
    • the grep part removes (using -v) the lines that match the regular expression (using -E). ^$ matches blank lines, and -- matches the separator lines in the output of aws s3 ls
    • the awk part simply adds the 3rd column of the resulting output (the size in bytes) to a running total, then prints it in MB at the end

    NOTE: this command works on the current bucket or 'folder' only, not recursively.
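
    If you do need a recursive total for a prefix, one alternative sketch is to let the s3api layer do the summing with a JMESPath query; <bucket_name> and the foo/bar prefix are placeholders, and the AWS CLI paginates list-objects-v2 automatically, so this also covers buckets with more than 1000 objects:

```shell
# <bucket_name> and foo/bar are placeholders. sum(Contents[].Size)
# totals the byte sizes of every object under the prefix, recursively.
bytes=$(aws s3api list-objects-v2 --bucket <bucket_name> --prefix foo/bar \
          --query "sum(Contents[].Size)" --output text)

# Same MB conversion as the awk pipeline above.
awk -v b="$bytes" 'BEGIN { printf "%.1f MB\n", b / 1024 / 1024 }'
```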

  • 2020-12-12 10:44

    See https://serverfault.com/questions/84815/how-can-i-get-the-size-of-an-amazon-s3-bucket

    Answered by Vic:

    <?php
    if (!class_exists('S3')) require_once 'S3.php';
    
    // Instantiate the class
    $s3 = new S3('accessKeyId', 'secretAccessKey');
    S3::$useSSL = false;
    
    // List your buckets:
    echo "S3::listBuckets(): ";
    echo '<pre>' . print_r($s3->listBuckets(), 1). '</pre>';
    
    $totalSize = 0;
    $objects = $s3->getBucket('name-of-your-bucket');
    foreach ($objects as $name => $val) {
        // If you want to get the size of a particular directory, you can do
        // only that.
        // if (strpos($name, 'directory/sub-directory') !== false)
        $totalSize += $val['size'];
    }
    
    echo ($totalSize / 1024 / 1024 / 1024) . ' GB';
    ?>
    