azure-data-factory-2

Use dynamic value as table name of a table storage in Azure Data Factory

我的未来我决定 submitted on 2020-06-17 14:19:05
Question: I have an ADF pipeline that uses a Copy Data activity to copy data from Blob storage to Table storage, running on a trigger once a day. In the Table storage dataset I have provided the table name 'Table1'. Instead of this hard-coded value, is it possible to provide a dynamic table name, so that the Run ID of the pipeline run is used as the table name in Table storage and the blob data is copied into that table?
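A minimal sketch of how the table name can be parameterized (the dataset and linked-service names below are assumptions, not from the question). Note that @pipeline().RunId is a GUID containing hyphens, and Table storage table names must be alphanumeric and start with a letter, so the value needs sanitizing:

    {
        "name": "DynamicTableDataset",
        "properties": {
            "type": "AzureTable",
            "linkedServiceName": {
                "referenceName": "AzureTableStorageLS",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "tableName": { "type": "string" }
            },
            "typeProperties": {
                "tableName": {
                    "value": "@dataset().tableName",
                    "type": "Expression"
                }
            }
        }
    }

The Copy activity would then pass the run ID into the parameter, for example with the expression "@concat('run', replace(pipeline().RunId, '-', ''))".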

How to target a devops branch when script creating objects in ADFv2?

不羁的心 submitted on 2020-06-12 17:51:02
Question: Using Azure Data Factory v2 with Git / Azure DevOps integration: if you create a trigger using Set-AzDataFactoryV2Trigger via PowerShell, as the documentation describes, the trigger is created directly in the adf_publish branch. This is an issue, as it results in a mismatch between the master branch and adf_publish, meaning you will not be able to publish going forward, since this of course raises an error. How do I get the cmdlet to create the trigger in a new or specific branch, …
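The Az cmdlets write straight to the live service rather than to a Git branch, so a common workaround is to skip the cmdlet and commit the trigger's JSON definition to the collaboration branch yourself (under the repository's trigger/ folder), then publish from the ADF UI as usual. A minimal trigger file as a sketch; the trigger name, pipeline name, and schedule are assumptions:

    {
        "name": "DailyTrigger",
        "properties": {
            "type": "ScheduleTrigger",
            "typeProperties": {
                "recurrence": {
                    "frequency": "Day",
                    "interval": 1,
                    "startTime": "2020-06-15T00:00:00Z",
                    "timeZone": "UTC"
                }
            },
            "pipelines": [
                {
                    "pipelineReference": {
                        "referenceName": "MyPipeline",
                        "type": "PipelineReference"
                    }
                }
            ]
        }
    }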

Azure Data Factory Pipeline 'On Failure'

浪尽此生 submitted on 2020-06-01 08:00:44
Question: I am setting up an ADF pipeline to copy blobs into an Azure SQL DB. I have an iteration activity in the pipeline, where I have set up a counter to loop and copy only if the blob exists. This works great except for some random primary key violations, which I have to check manually. So I edited the pipeline to log the error and continue, and set it up as follows: if the Copy activity fails due to a primary key violation, (for now) ignore it, but log the details using a stored procedure and continue …
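In pipeline JSON, an 'on failure' path is expressed as an activity dependency with the Failed condition. A sketch of a stored-procedure logging activity wired to the failure output of the copy; the activity, procedure, and linked-service names are assumptions:

    {
        "name": "LogCopyError",
        "type": "SqlServerStoredProcedure",
        "dependsOn": [
            {
                "activity": "CopyBlobToSql",
                "dependencyConditions": [ "Failed" ]
            }
        ],
        "typeProperties": {
            "storedProcedureName": "dbo.usp_LogPipelineError",
            "storedProcedureParameters": {
                "ErrorMessage": {
                    "value": "@activity('CopyBlobToSql').Error.Message",
                    "type": "String"
                }
            }
        },
        "linkedServiceName": {
            "referenceName": "AzureSqlDbLS",
            "type": "LinkedServiceReference"
        }
    }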

How to run SQL Script in Azure Data Factory v2?

a 夏天 submitted on 2020-05-29 04:36:08
Question: There is no SQL Script activity in Azure Data Factory v2. So how can I create a stored procedure or a schema in a database? What are my options?

Answer 1: There is a preCopyScript property; you could put your script there, and it will be executed before each run. You could use the Stored Procedure activity, as Summit mentioned. You could also create a custom activity.

Answer 2: I agree that the absence of something like SSIS's 'Execute SQL Task' is a pain. I normally use a Lookup activity, as I don't like to create …
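A sketch of the Lookup trick the second answer alludes to: the Lookup's query can run arbitrary T-SQL as long as it ends with a SELECT, because a Lookup must return a result set (the object and dataset names below are assumptions):

    {
        "name": "RunSqlScript",
        "type": "Lookup",
        "typeProperties": {
            "source": {
                "type": "AzureSqlSource",
                "sqlReaderQuery": "IF SCHEMA_ID('staging') IS NULL EXEC('CREATE SCHEMA staging'); SELECT 1 AS Done;"
            },
            "dataset": {
                "referenceName": "AzureSqlDummyDataset",
                "type": "DatasetReference"
            }
        }
    }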

Converting XML files to JSON or CSV?

纵然是瞬间 submitted on 2020-05-28 04:28:32
Question: I have complex XML files with nested elements. I built a process to handle them using SSIS and T-SQL. We use Azure Data Factory, and I'd like to explore converting the XML files to JSON or CSV, since those formats are supported by ADF and XML is not. It appears Logic Apps is one option. Has anyone had luck converting XML within a pipeline? Current workflow: pick up XML files from a folder, drop them onto network drives, bulk insert the XML into a staging row, parse the XML into various SQL tables for …
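For the Logic Apps route, the workflow definition language has built-in xml() and json() functions, so the conversion itself can be a single Compose action. A sketch of just that action (the blob-reading action it runs after is an assumed name; the trigger and output wiring are omitted):

    "Convert_XML_to_JSON": {
        "type": "Compose",
        "inputs": "@json(xml(body('Get_blob_content')))",
        "runAfter": {
            "Get_blob_content": [ "Succeeded" ]
        }
    }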

How to create a table in a SQL Database from a CSV file in Blob storage that contains all the column names with their data types, through a Data Flow or an ADF pipeline?

只愿长相守 submitted on 2020-04-30 07:13:19
Question: I have a CSV file in my Azure Blob Storage which contains all the column names, along with the data type of each column of the respective tables. I want to create a table in a SQL Database from this blob file with the same column names and their corresponding data types, without doing the mapping. I have created a table through a Data Flow, but I had to set the data type of each column manually, and I don't want to do that. When I create a table it should accept the same data types in the source as well as the sink, which …
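One setting worth checking (a sketch, and it may not pick up the exact types declared inside the CSV file itself): the Copy activity's Azure SQL sink supports a tableOption of autoCreate, which creates the destination table from the source schema when it does not already exist:

    "sink": {
        "type": "AzureSqlSink",
        "tableOption": "autoCreate"
    }

With a delimited-text source the auto-created columns typically default to string-like types unless an explicit typed mapping is supplied, so this removes the manual table creation but not necessarily the type mapping.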

Cast values to string in Json Path Expression in Azure Data Factory copy activity

半世苍凉 submitted on 2020-04-30 07:06:04
Question: I have an input JSON file where the actual value of a property can be either numeric or a string. I extract the value by specifying a JSON path expression such as "fieldValue": "values[*].value" on the source connection tab of the Azure Data Factory Copy activity. Since the actual field value in the JSON could be either something like "X" or 2.34, it is not able to parse all of the values into strings, even though in the schema I specify fieldValue as string. So is there a way I could cast it so that …
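One thing that may help (a sketch, not a confirmed fix): instead of the connection-tab JSON path alone, declare the target type explicitly in the Copy activity's translator mapping, so each extracted value is coerced to String on the sink side; the sink column name below is an assumption:

    "translator": {
        "type": "TabularTranslator",
        "collectionReference": "$['values']",
        "mappings": [
            {
                "source": { "path": "['value']" },
                "sink": { "name": "fieldValue", "type": "String" }
            }
        ]
    }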