Question
I have a number of pipeline/linked service/dataset JSON files and I need to upload them to my Data Factory, as opposed to creating new versions and copying the text over. What's the simplest way to do this?
Answer 1:
If you are using version 1, you can use Visual Studio to do so, as shown here: https://azure.microsoft.com/en-us/blog/azure-data-factory-visual-studio-extension-for-authoring-pipelines/
If you are using version 2, you can do this with PowerShell. First, download and install Azure PowerShell from here: https://azure.microsoft.com/en-us/downloads/ Then, from PowerShell, log in and select your subscription:
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionName "your subs name here"
Then with the following command you can upload the json files:
Set-AzureRmDataFactoryV2Pipeline -DataFactoryName "your df name" -ResourceGroupName "your RG name" -Name "pipelineName" -DefinitionFile "path to json file"
Replace the values with your own Data Factory name, resource group name, pipeline name, and the path to your JSON file.
The same arguments are used to upload linked services and datasets with the following commands (example usage below):
Set-AzureRmDataFactoryV2LinkedService
Set-AzureRmDataFactoryV2Dataset
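For example, a minimal sketch assuming the same Data Factory and resource group as above; the linked service/dataset names and file paths are placeholders you would replace with your own:

# Upload a linked service definition from a JSON file
Set-AzureRmDataFactoryV2LinkedService -DataFactoryName "your df name" -ResourceGroupName "your RG name" -Name "linkedServiceName" -DefinitionFile "path to json file"
# Upload a dataset definition from a JSON file
Set-AzureRmDataFactoryV2Dataset -DataFactoryName "your df name" -ResourceGroupName "your RG name" -Name "datasetName" -DefinitionFile "path to json file"

Since you have a number of files, you could also loop over a folder. This is a sketch that assumes each file name matches the pipeline name you want in the Data Factory:

# Upload every pipeline JSON in a folder, using the file name (without extension) as the pipeline name
Get-ChildItem "C:\adf\pipelines\*.json" | ForEach-Object {
    Set-AzureRmDataFactoryV2Pipeline -DataFactoryName "your df name" -ResourceGroupName "your RG name" -Name $_.BaseName -DefinitionFile $_.FullName
}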
Hope this helped!
Source: https://stackoverflow.com/questions/48931495/upload-adf-json-files-to-my-data-factory