Windows Azure PowerShell: copying a file to a VM

You cannot use PowerShell to copy a file directly to a Virtual Machine's OS disk (or even to one of its attached disks). There's no API for communicating directly with a Virtual Machine's innards (you'd need to create your own custom service for that).

You can use PowerShell to upload a file to a blob with Set-AzureStorageBlobContent. At that point, you could notify an app running on your Virtual Machine (possibly with a Queue message?) that there's a file waiting for it to process. The processing could be as simple as copying the file down to the VM's local disk.
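For that blob-upload approach, here is a minimal sketch, assuming the classic Azure/Azure.Storage cmdlets; the account, container, and queue names are placeholders:

# Upload a local file to blob storage (placeholder names throughout)
$ctx = New-AzureStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey $storageKey
Set-AzureStorageBlobContent -File "C:\dev\payload.zip" -Container "uploads" -Blob "payload.zip" -Context $ctx

# Optional: notify a worker on the VM via a storage queue message
$queue = Get-AzureStorageQueue -Name "upload-notifications" -Context $ctx
$queue.CloudQueue.AddMessage((New-Object Microsoft.WindowsAzure.Storage.Queue.CloudQueueMessage("payload.zip")))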

Here is another approach that I documented here. It involves:

  1. Create and mount an empty local VHD.
  2. Copy your files to the new VHD and dismount it.
  3. Copy the VHD to Azure blob storage.
  4. Attach that VHD to your VM.

Here is an example:

#Create and mount a new local VHD
$volume = New-VHD -Path test.vhd -SizeBytes 50MB | `
  Mount-VHD -PassThru | `
  Initialize-Disk -PartitionStyle MBR -Confirm:$false -PassThru | `
  New-Partition -UseMaximumSize -AssignDriveLetter -MbrType IFS | `
  Format-Volume -NewFileSystemLabel "VHD" -Confirm:$false

#Copy my files
Copy-Item C:\dev\boxstarter "$($volume.DriveLetter):\" -Recurse
Dismount-VHD test.vhd

#Upload the VHD to Azure blob storage
Add-AzureVhd -Destination http://mystorageacct.blob.core.windows.net/vhdstore/test.vhd `
  -LocalFilePath test.vhd

#Attach the VHD to my VM as a data disk
Get-AzureVM MyCloudService MyVMName | `
  Add-AzureDataDisk -ImportFrom `
  -MediaLocation "http://mystorageacct.blob.core.windows.net/vhdstore/test.vhd" `
  -DiskLabel "boxstarter" -LUN 0 | `
  Update-AzureVM

Here is some code that I got from some PowerShell examples and modified. It works over a session created with New-PSSession; a handy wrapper for creating that session is also included below. Lastly, I needed to send a whole folder over, so that's here too.

Some example usage tying them together:

# open remote session
$session = Get-Session -uri $uri -credentials $credential 
# copy installer to VM
Write-Verbose "Checking if file $installerDest needs to be uploaded"
Send-File -Source $installerSrc -Destination $installerDest -Session $session -onlyCopyNew $true
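
The $uri and $credential above come from outside the snippet; here is a sketch for building them, assuming the classic Azure service-management module (the cloud service and VM names are placeholders):

# Build the WinRM URI for the VM and collect admin credentials for the session above
$uri = Get-AzureWinRMUri -ServiceName "MyCloudService" -Name "MyVMName"
$credential = Get-Credential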



<#
.SYNOPSIS
  Returns a session given the URL 
.DESCRIPTION
  http://michaelcollier.wordpress.com/2013/06/23/using-remote-powershell-with-windows-azure-vms/
#>
function Get-Session($uri, $credentials)
{
    for($retry = 0; $retry -le 5; $retry++)
    {
      try
      {
        $session = New-PSSession -ComputerName $uri[0].DnsSafeHost -Credential $credentials -Port $uri[0].Port -UseSSL
        if ($session -ne $null)
        {
            return $session
        }

        Write-Output "Unable to create a PowerShell session . . . sleeping and trying again in 30 seconds."
        Start-Sleep -Seconds 30
      }
      catch
      {
        Write-Output "Unable to create a PowerShell session . . . sleeping and trying again in 30 seconds."
        Start-Sleep -Seconds 30
      }
    }
}

<#
.SYNOPSIS
  Sends a file to a remote session.
  NOTE: will delete the destination before uploading
.EXAMPLE
  $remoteSession = New-PSSession -ConnectionUri $remoteWinRmUri.AbsoluteUri -Credential $credential
  Send-File -Source "c:\temp\myappdata.xml" -Destination "c:\temp\myappdata.xml" $remoteSession

  Copy the required files to the remote server 

    $remoteSession = New-PSSession -ConnectionUri $frontEndwinRmUri.AbsoluteUri -Credential $credential
    $sourcePath = "$PSScriptRoot\$remoteScriptFileName"
    $remoteScriptFilePath = "$remoteScriptsDirectory\$remoteScriptFileName"
    Send-File $sourcePath $remoteScriptFilePath $remoteSession

    $answerFileName = Split-Path -Leaf $WebPIApplicationAnswerFile
    $answerFilePath = "$remoteScriptsDirectory\$answerFileName"
    Send-File $WebPIApplicationAnswerFile $answerFilePath $remoteSession
    Remove-PSSession -InstanceId $remoteSession.InstanceId
#>
function Send-File
{
    param (

        ## The path on the local computer
        [Parameter(Mandatory = $true)]
        [string]
        $Source,

        ## The target path on the remote computer
        [Parameter(Mandatory = $true)]
        [string]
        $Destination,

        ## The session that represents the remote computer
        [Parameter(Mandatory = $true)]
        [System.Management.Automation.Runspaces.PSSession] 
        $Session,

        ## should we quit if file already exists?
        [bool]
        $onlyCopyNew = $false

        )

    $remoteScript =
    {
        param ($destination, $bytes)

        # Convert the destination path to a full filesystem path (to support relative paths)
        $Destination = $ExecutionContext.SessionState.`
        Path.GetUnresolvedProviderPathFromPSPath($Destination)

        # Write the content to the new file
        $file = [IO.File]::Open($Destination, "OpenOrCreate")
        $null = $file.Seek(0, "End")
        $null = $file.Write($bytes, 0, $bytes.Length)
        $file.Close()
    }

    # Get the source file, and then start reading its content
    $sourceFile = Get-Item $Source

    # Delete the previously-existing file if it exists
    $abort = Invoke-Command -Session $Session {
        param ([String] $dest, [bool]$onlyCopyNew)

        if (Test-Path $dest) 
        { 
            if ($onlyCopyNew -eq $true)
            {
                return $true
            }

            Remove-Item $dest
        }

        $destinationDirectory = Split-Path -Path $dest -Parent
        if (!(Test-Path $destinationDirectory))
        {
            New-Item -ItemType Directory -Force -Path $destinationDirectory
        }

        return $false
    } -ArgumentList $Destination, $onlyCopyNew

    if ($abort -eq $true)
    {
        Write-Host 'Ignored file transfer - already exists'
        return
    }

    # Now break it into chunks to stream
    Write-Progress -Activity "Sending $Source" -Status "Preparing file"
    $streamSize = 1MB
    $position = 0
    $rawBytes = New-Object byte[] $streamSize
    $file = [IO.File]::OpenRead($sourceFile.FullName)
    while (($read = $file.Read($rawBytes, 0, $streamSize)) -gt 0)
    {
        Write-Progress -Activity "Writing $Destination" -Status "Sending file" `
            -PercentComplete ($position / $sourceFile.Length * 100)

        # Ensure that our array is the same size as what we read from disk
        if ($read -ne $rawBytes.Length)
        {
            [Array]::Resize( [ref] $rawBytes, $read)
        }

        # And send that array to the remote system
        Invoke-Command -Session $session $remoteScript -ArgumentList $destination, $rawBytes

        # Resize the array back to the full chunk size for the next read
        if ($rawBytes.Length -ne $streamSize)
        {
            [Array]::Resize( [ref] $rawBytes, $streamSize)
        }
        [GC]::Collect()
        $position += $read
    }

    $file.Close()

    # Show the result
    Invoke-Command -Session $session { Get-Item $args[0] } -ArgumentList $Destination
}

<#
.SYNOPSIS
  Sends all files in a folder to a remote session.
  NOTE: will delete any destination files before uploading
.EXAMPLE
  $remoteSession = New-PSSession -ConnectionUri $remoteWinRmUri.AbsoluteUri -Credential $credential
  Send-Folder -Source 'c:\temp\' -Destination 'c:\temp\' $remoteSession
#>
function Send-Folder 
{
    param (
        ## The path on the local computer
        [Parameter(Mandatory = $true)]
        [string]
        $Source,

        ## The target path on the remote computer
        [Parameter(Mandatory = $true)]
        [string]
        $Destination,

        ## The session that represents the remote computer
     #   [Parameter(Mandatory = $true)]
        [System.Management.Automation.Runspaces.PSSession] 
        $Session,

        ## should we quit if files already exist?
        [bool]
        $onlyCopyNew = $false
    )

    foreach ($item in Get-ChildItem $Source)
    {
        if (Test-Path $item.FullName -PathType Container) {
            Send-Folder $item.FullName "$Destination\$item" $Session $onlyCopyNew
        } else {
            Send-File -Source $item.FullName -Destination "$destination\$item" -Session $Session -onlyCopyNew $onlyCopyNew
        }
    }
}
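
And an end-to-end sketch using the wrapper and Send-Folder together (the paths are placeholders):

# Copy an entire local folder to the VM over the remote session, then clean up
$session = Get-Session -uri $uri -credentials $credential
Send-Folder -Source 'C:\dev\myapp' -Destination 'C:\apps\myapp' -Session $session -onlyCopyNew $true
Remove-PSSession -Session $session
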
Another option is AzCopy:

  1. Install AzCopy from http://aka.ms/downloadazcopy
  2. Read the docs at https://docs.microsoft.com/en-us/azure/storage/storage-use-azcopy
  3. cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
  4. Get the blob storage account key (primary or secondary); a PowerShell sketch for this follows the list.
  5. PowerShell: blob upload of a single file

    .\AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer/myfolder/ /DestKey:key /Pattern:abc.txt

  6. Log on to the remote VM
  7. PowerShell: blob download of a single file

    .\AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer/myfolder/ /Dest:C:\myfolder /SourceKey:key /Pattern:abc.txt
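
For step 4, a sketch for retrieving the account keys with the AzureRM cmdlets (the resource group and account names are placeholders):

# Retrieve the storage account keys; index 0 is the primary key, index 1 the secondary
$keys = Get-AzureRmStorageAccountKey -ResourceGroupName "myresourcegroup" -Name "myaccount"
$secondaryKey = $keys[1].Value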

Another solution is to use a Custom Script Extension.
A custom script extension lets you copy files to the VM even if the VM does not have a public IP (private network), so you don't need to configure WinRM or anything else.

I've used custom script extensions in the past for post-deployment tasks like installing an app on a VM or a scale set. Basically, you upload files to blob storage and the custom script extension downloads those files onto the VM.

I've created a test-container on my blob storage account and uploaded two files:

  • deploy.ps1: the script executed on the VM.
  • test.txt: a text file with "Hello world from VM"

Here is the code of the deploy.ps1 file:

Param(
    [string] [Parameter(Mandatory=$true)] $filename,
    [string] [Parameter(Mandatory=$true)] $destinationPath
)

# Getting the full path of the downloaded file
$filePath = $PSScriptRoot + "\" + $filename

Write-Host "Checking the destination folder..." -Verbose
if(!(Test-Path $destinationPath -Verbose)){
    Write-Host "Creating the destination folder..." -Verbose
    New-Item -ItemType directory -Path $destinationPath -Force -Verbose
}

Copy-Item $filePath -Destination $destinationPath -Force -Verbose

Here is the code to add a custom script extension to a virtual machine.

Login-AzureRMAccount

$resourceGroupName = "resourcegroupname"
$storageAccountName = "storageaccountname"
$containerName = "test-container"
$location = "Australia East"
$vmName = "TestVM"
$extensionName = "copy-file-to-vm"
$filename = "test.txt"
$deploymentScript = "deploy.ps1"
$destinationPath = "C:\MyTempFolder\"

$storageAccountKeys = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName).Value
$storageAccountKey = $storageAccountKeys[0]

Set-AzureRmVMCustomScriptExtension -ResourceGroupName $resourceGroupName -VMName $vmName -Name $extensionName `
    -Location $location -TypeHandlerVersion "1.9" -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey `
    -ContainerName $containerName -FileName $deploymentScript, $filename -Run $deploymentScript `
    -Argument "$filename $destinationPath" -ForceRerun "1"

You can remove the extension after the file has been copied:

Remove-AzureRmVMCustomScriptExtension -ResourceGroupName $resourceGroupName -VMName $vmName -Name $extensionName -Force

In my scenario, I have a logic app that is triggered every time a new file is added to a container. The logic app calls a runbook (which requires an Azure Automation account) that adds the custom script extension and then deletes it.
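
A rough sketch of what such a runbook could look like, reusing the cmdlets above (the parameter name, the authentication step, and where the variables come from are assumptions, not part of the original setup):

param(
    [Parameter(Mandatory=$true)] [string] $filename   # blob name passed in by the logic app
)

# Authentication against the subscription (e.g. via the Automation Run As account) is omitted here.
# $resourceGroupName, $vmName, $extensionName, etc. would come from runbook parameters or Automation variables.

# Push the file to the VM via the custom script extension, then remove the extension again
Set-AzureRmVMCustomScriptExtension -ResourceGroupName $resourceGroupName -VMName $vmName -Name $extensionName `
    -Location $location -TypeHandlerVersion "1.9" -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey `
    -ContainerName $containerName -FileName $deploymentScript, $filename -Run $deploymentScript `
    -Argument "$filename $destinationPath" -ForceRerun ([guid]::NewGuid().ToString())

Remove-AzureRmVMCustomScriptExtension -ResourceGroupName $resourceGroupName -VMName $vmName -Name $extensionName -Force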
