How to upload all files of a specific type to an S3 bucket?


Question


When I do this:

foreach ($f in (Get-ChildItem -filter "*.flv")){
    Write-S3Object -BucketName bucket.example -File $f.fullName -Key $f.name -CannedACLName PublicRead
}

I get this error:

Write-S3Object :
At line:1 char:51
+  foreach ($f in (Get-ChildItem -filter "*.flv")){ Write-S3Object -BucketName xx. ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (Amazon.PowerShe...eS3ObjectCmdlet:WriteS3ObjectCmdlet) [Write-S3Objec
   t], InvalidOperationException
    + FullyQualifiedErrorId : Amazon.S3.AmazonS3Exception,Amazon.PowerShell.Cmdlets.S3.WriteS3ObjectCmdlet

What am I doing wrong? Is there anything I can do to see more of the error, or is this just a syntax issue?

How else can I upload all files of a certain type to a bucket using PowerShell?

EDIT:

I intentionally set Set-DefaultAWSRegion to a region that the bucket wasn't in, and got

Write-S3Object : The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint. 

as an error message, as expected, so it looks like it can reach the bucket and knows it isn't in that region.
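
For reference, a minimal sketch (reusing the asker's placeholder bucket name) of looking up the bucket's actual region and making it the session default, instead of guessing:

# Ask S3 where the bucket actually lives and set that region as the
# session default. Get-S3BucketLocation returns an empty Value for
# buckets in us-east-1, so fall back to that region explicitly.
$location = Get-S3BucketLocation -BucketName bucket.example
if ($location.Value) { Set-DefaultAWSRegion -Region $location.Value }
else { Set-DefaultAWSRegion -Region us-east-1 }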

Also, if I enter the s3:// prefix before the bucket name, I get a message that the bucket couldn't be found, so it looks like what I'm entering now is correct.

I can do Get-S3Bucket and see all of the buckets on my account, so I know that it's configured correctly.

EDIT2:

If I do:

> $f = Get-ChildItem -filter "*.flv"
> Write-S3Object

cmdlet Write-S3Object at command pipeline position 1
Supply values for the following parameters:
BucketName: bucket.name
Key: $f[0].name
File: $f[0].fullName
Write-S3Object : The file indicated by the FilePath property does not exist!
At line:1 char:1
+ Write-S3Object
+ ~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (Amazon.PowerShe...eS3ObjectCmdlet:WriteS3ObjectCmdlet) [Write-S3Objec
   t], InvalidOperationException
    + FullyQualifiedErrorId : System.ArgumentException,Amazon.PowerShell.Cmdlets.S3.WriteS3ObjectCmdlet

If I do $f[0].fullName separately, I get the full path to the object. However, this has spaces in it. Could this be a problem?


Answer 1:


When you fill in missing parameters like that from the command line, you need to specify their literal string values. When I mimicked your issue locally:

PS C:\> Write-S3Object

cmdlet Write-S3Object at command pipeline position 1
Supply values for the following parameters:
BucketName: MyTestBucketNameHere
Key: $testName
File: C:/test.txt

I ended up with a file on S3 whose key was literally $testName, because variables aren't evaluated at those prompts. Likewise, you're getting the "The file indicated by the FilePath property does not exist!" error because there is no file on your filesystem literally named $f[0].fullName.

An example to write a single file to S3:

PS C:\> Write-S3Object -BucketName "MyTestBucketName" -Key "file.txt" -File "C:/test.txt"

To write all of your files to S3:

PS C:\> (Get-ChildItem -Filter "*.flv") | % { Write-S3Object -BucketName "MyTestBucketName" -File $_ -Key $_.Name }

This first gets all files with the .flv extension in the current directory and pipes them to ForEach-Object (aliased as %), which uploads each file (represented by $_) to MyTestBucketName with a Key equal to the file's Name property.
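
If your session's current directory might differ from where the files live (the situation described in Answer 3 below), a slightly more defensive variant of the same one-liner passes the full path explicitly:

# Upload by absolute path so the cmdlet does not depend on the
# session's current working directory.
Get-ChildItem -Filter "*.flv" | ForEach-Object {
    Write-S3Object -BucketName "MyTestBucketName" -File $_.FullName -Key $_.Name
}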




Answer 2:


The value PublicRead for -CannedACLName is not correct; the correct value is public-read:

foreach ($f in (Get-ChildItem -Filter "*.flv")) {
    Write-S3Object -BucketName bucket.example -File $f.FullName -Key $f.Name -CannedACLName public-read
}

This fixed the issue for me and should fix it for you as well.
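
For reference, the accepted values come from S3's canned-ACL list (private, public-read, public-read-write, authenticated-read, and so on). A sketch of the same loop keeping the uploads private:

# Same loop with the default 'private' canned ACL made explicit;
# substitute any valid canned ACL string as needed.
foreach ($f in (Get-ChildItem -Filter "*.flv")) {
    Write-S3Object -BucketName bucket.example -File $f.FullName -Key $f.Name -CannedACLName private
}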




Answer 3:


None of these solutions worked for me, so here is what I found.

When you're in the PowerShell ISE (4.0), it doesn't matter how you pass the local file name; the cmdlet still finds its location. That means you can simply do the following. In my example I'm uploading all the backups in a folder named E:\Backups to an S3 bucket:

$results = Get-ChildItem -Path E:\Backups

foreach ($f in $results) {
    $filename = [System.IO.Path]::GetFileName($f)
    Write-S3Object -BucketName bi-dbs-daily-backup -File $f -Key $filename
}

If I run this in the PowerShell ISE everything works fine, but if I save it as a .ps1 and run it with the Task Scheduler, I get: The file indicated by the FilePath property does not exist!

It turns out that when Windows runs the .ps1 it does so from a plain console session with a different working directory, and the bare file names no longer resolve to the files you're sending.

So I appended | % { $_.FullName } to Get-ChildItem to get the full path of each file, and now everything works fine:

$results = Get-ChildItem -Path E:\Backups | % { $_.FullName }

foreach ($f in $results) {
    $filename = [System.IO.Path]::GetFileName($f)
    Write-S3Object -BucketName bi-dbs-daily-backup -File $f -Key $filename
}

Hope this helps someone out there!
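
As a minimal sketch under the same assumptions (same bucket and folder as above; Get-ChildItem's -File switch needs PowerShell 3.0 or later), the two steps can also be collapsed into one pipeline that never relies on the working directory:

# -File gets the absolute path and -Key just the file name, so this
# behaves the same in the ISE, the console, and the Task Scheduler.
Get-ChildItem -Path E:\Backups -File | ForEach-Object {
    Write-S3Object -BucketName bi-dbs-daily-backup -File $_.FullName -Key $_.Name
}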




Answer 4:


For other newbies out there looking for a script that does this but not for public access: remove -CannedACLName public-read altogether. It took me two frustrating hours to figure that out. :)
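
That is, a sketch of the original loop with the ACL argument dropped, leaving uploads at the bucket's default (private) access:

# No -CannedACLName: objects inherit the bucket's default access.
foreach ($f in (Get-ChildItem -Filter "*.flv")) {
    Write-S3Object -BucketName bucket.example -File $f.FullName -Key $f.Name
}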



Source: https://stackoverflow.com/questions/23918359/how-to-upload-all-files-of-a-specific-type-to-s3-bucket
