Question
I am building an entire CI/CD pipeline for my Data Factory V1 project, using VSTS for the implementation.
I am able to carry out most of the tasks required for deployment in VSTS. However, I cannot determine whether it is possible to fully implement continuous deployment for my project.
I have one common solution file holding 4 different Data Factory projects, and each project holds 4 data flow pipelines.
The issue is that the entire solution is not deployed to a higher environment every time; each deployment targets specific pipelines.
Is it possible to deploy specific pipelines, i.e., pick one from each project and deploy them using the release pipeline?
If yes, how would we implement this in the VSTS release pipeline, and will the deployment be incremental?
Answer 1:
Please take a look at this blog, which may be helpful. The author uses a comparable method for deployment: before deploying the JSON files with a PowerShell command, he edits them to insert environment-specific values into the Data Factory definitions. You can pass these values in as parameters from the VSTS release pipeline.
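As a rough illustration of that approach, the sketch below (assuming the AzureRM.DataFactories module and placeholder names such as `my-rg`, `my-adf-v1`, and a `__ENV__` token in the JSON, none of which come from the question) substitutes an environment value into a single pipeline definition and deploys only that pipeline:

```powershell
# Hedged sketch: token-replace a pipeline JSON and deploy just that pipeline
# with the AzureRM Data Factory (V1) cmdlets. All names are placeholders.
param(
    [string]$ResourceGroupName = "my-rg",        # assumption: target resource group
    [string]$DataFactoryName   = "my-adf-v1",    # assumption: target V1 factory
    [string]$PipelineFile      = ".\Pipeline1.json",
    [string]$Environment       = "uat"           # passed in from the release pipeline
)

# Insert environment-specific values before deployment
# (assumes the JSON contains a __ENV__ placeholder token).
$json = Get-Content -Path $PipelineFile -Raw
$json = $json -replace '__ENV__', $Environment
$resolved = ".\Pipeline1.$Environment.json"
Set-Content -Path $resolved -Value $json

# Create or update only this one pipeline; the cmdlet touches the named
# pipeline, not the whole factory, which keeps the deployment incremental.
New-AzureRmDataFactoryPipeline -ResourceGroupName $ResourceGroupName `
    -DataFactoryName $DataFactoryName -File $resolved -Force
```

Running one such step per selected pipeline in the release definition lets you pick pipelines individually across the four projects.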
Also take a look at this blog: Deploy Azure Data Factory using PowerShell.
If you want to control specific pipelines, you could use a third-party extension such as this one: Azure Data Factory
Azure Data Factory Pipelines Management. This release task can be added to a release pipeline to either suspend or resume all pipelines of an Azure Data Factory.
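If you need finer control than the extension's all-pipelines behavior, the same effect can be sketched per pipeline with the AzureRM V1 cmdlets (the resource names and pipeline list below are assumptions, not from the question):

```powershell
# Hedged sketch: suspend, redeploy, and resume only the selected V1 pipelines,
# mirroring per-pipeline what the marketplace task does for all pipelines.
$rg  = "my-rg"        # assumption: resource group name
$adf = "my-adf-v1"    # assumption: data factory name

# Only the pipelines chosen for this deployment.
$toRedeploy = @("Pipeline1", "Pipeline3")

foreach ($name in $toRedeploy) {
    Suspend-AzureRmDataFactoryPipeline -ResourceGroupName $rg `
        -DataFactoryName $adf -Name $name
}

# ... redeploy the JSON definitions for these pipelines here ...

foreach ($name in $toRedeploy) {
    Resume-AzureRmDataFactoryPipeline -ResourceGroupName $rg `
        -DataFactoryName $adf -Name $name
}
```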
Answer 2:
This is a doc for ADF V2, Continuous integration and deployment with the ADF V2 UI; it allows you to bind a VSTS repository to your Azure Data Factory.
Source: https://stackoverflow.com/questions/51084717/ci-cd-pipeline-for-data-factory-v1-using-vsts