azure-data-factory-2

How can I get the last day of a month in dynamic content in ADF2?

拥有回忆 submitted on 2020-01-04 06:39:08
Question: I want to get the last day of the month based on the utcnow() timestamp. Instead of "dd" in the expression below, the last day of the month (28, 29, 30, or 31) should be filled in automatically: @{formatDateTime(adddays(utcnow(),-2), 'yyyy-MM-ddT23:59:59.999')} Since it is currently August, I expect the following result from the expression: "2019-08-31T23:59:59.999"

Answer 1: I would recommend the simplest way to do this: store the dates and their respective end-of-month dates in a table or file (e.g. …
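A lookup table isn't strictly necessary: the ADF expression language also provides startOfMonth() and addToTime(), so the last day of the current month can be computed as the start of the next month minus one day. A minimal sketch of that alternative (not from the original answer):

    @{formatDateTime(adddays(addToTime(startOfMonth(utcnow()), 1, 'Month'), -1), 'yyyy-MM-ddT23:59:59.999')}

For an August 2019 run, startOfMonth(utcnow()) yields 2019-08-01, adding one month gives 2019-09-01, and subtracting one day lands on 2019-08-31, matching the expected output.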

Change connection string Linked Service in Azure Data Factory v2

|▌冷眼眸甩不掉的悲伤 submitted on 2020-01-03 19:17:24
Question: I am using Azure Data Factory V2 to integrate data from multiple on-premises MySQL databases. Is it possible to define just one MySQL linked service and then modify the connection string (server name, credentials, integration runtime) at runtime? My plan is to use a Lookup activity to read a list of connection strings and then a ForEach activity to iterate over that list, pulling data from each database with a Copy activity. Is it possible to do such things, preferably using the Azure Data …
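Parameterizing the linked service is the usual way to approach this. Below is a sketch of a MySQL linked service whose server and database are supplied per iteration; the parameter names and the connection-string details (port, user, SSL options) are illustrative, and credentials would normally be resolved from Key Vault rather than inlined:

    {
        "name": "DynamicMySqlLinkedService",
        "properties": {
            "type": "MySql",
            "parameters": {
                "serverName": { "type": "String" },
                "databaseName": { "type": "String" }
            },
            "typeProperties": {
                "connectionString": "Server=@{linkedService().serverName};Port=3306;Database=@{linkedService().databaseName};UID=loader;SSLMode=1;UseSystemTrustStore=0"
            }
        }
    }

Note that while the server and database can be switched this way, the integration runtime reference (connectVia) is, to my knowledge, not parameterizable, so databases reachable only through different self-hosted runtimes would still need separate linked services.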

Azure Data Flow taking minutes to trigger the next pipeline

人盡茶涼 submitted on 2020-01-03 03:09:07
Question: Azure Data Factory transfers the data itself in about 10 milliseconds, but it waits a few minutes before triggering the next pipeline, and that adds up to 40 minutes overall even though every pipeline moves its data in under 20 ms. I used debug mode and also triggered the ADF pipeline from a Logic App without debug mode; the delay is the same. Is there any way to optimize this? We want to move from SSIS to Data Flow, but we have a timing problem: 40 minutes is so much in …

How to provide a connection string dynamically for Azure Table Storage/Blob Storage in an Azure Data Factory linked service

岁酱吖の submitted on 2019-12-31 05:28:42
Question: How can the connection string for Table Storage or Blob Storage be changed dynamically in Azure Data Factory? Currently I only see such an option for database-related datasets. How can the same be achieved for Table or Blob Storage?

Answer 1: In the New Linked Service dialog, pick Azure Table Storage, click Advanced, and check "Specify dynamic contents in JSON format". Copy the JSON below to parameterize the Table Storage linked service: { "name": "Table", "type": "Microsoft.DataFactory/factories/linkedservices", "properties": { …
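The answer's JSON is cut off above; a plausible completion, based on the standard shape of a parameterized AzureTableStorage linked service (the single connectionString parameter is an assumption carried over from the fragment):

    {
        "name": "Table",
        "type": "Microsoft.DataFactory/factories/linkedservices",
        "properties": {
            "type": "AzureTableStorage",
            "parameters": {
                "connectionString": { "type": "String" }
            },
            "typeProperties": {
                "connectionString": "@{linkedService().connectionString}"
            },
            "annotations": []
        }
    }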

Azure Data Factory incremental blob copy

拈花ヽ惹草 submitted on 2019-12-25 04:00:11
Question: I've made a pipeline to copy data from one blob storage to another. I want incremental copy if possible, but I haven't found a way to specify it. The reason is that I want to run this on a schedule and copy only the data that is new since the last run.

Answer 1: If your blob names are well named with a timestamp, you can follow this doc to copy partitioned data. You can use the Copy Data tool to set up the pipeline: select a tumbling window trigger, and then in the file path field enter {year}/{month}/{day} …
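When building the pipeline by hand instead of with the tool, the same effect can be sketched by deriving the dataset's folder path from the tumbling window's start time. This assumes a pipeline parameter windowStart bound to @trigger().outputs.windowStartTime; the 'incoming/' prefix is illustrative:

    "folderPath": {
        "value": "@concat('incoming/', formatDateTime(pipeline().parameters.windowStart, 'yyyy/MM/dd'))",
        "type": "Expression"
    }

Each window then reads exactly one day's partition, which is what makes the copy incremental.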

Shaping JSON data in the sink

自作多情 submitted on 2019-12-25 01:38:56
Question: How do I union these two streams into a single JSON output using Data Factory / Data Flow? I have two streams of data.

Stream 1 (csv): 123,alex,03/18/1985

Stream 2 (csv):
123,blue,new
123,purple,old

Desired output:

    {
        "Stream1": { "id": 123, "name": "alex", "dob": "03/18/1985" },
        "Stream2": [
            { "id": 123, "color": "blue", "status": "new" },
            { "id": 123, "color": "purple", "status": "old" }
        ]
    }

Source: https://stackoverflow.com/questions…

Azure Data Factory - Dynamic Account information - Parameterization of Connection

て烟熏妆下的殇ゞ submitted on 2019-12-24 20:27:12
Question: The documentation demonstrates how to create a parameter for a linked service, but not how to actually pass that parameter in from a dataset or activity. Basically, the connection string is coming from a Lookup inside a ForEach loop, and I want to connect to a storage table. The connection looks like this, and the test works when passing in a correct parameter: { "name": "StatsStorage", "properties": { "type": "AzureTableStorage", "parameters": { "connectionString": { "type": "String" } }, "annotations": [] …
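The missing piece is the dataset side: a dataset that references this linked service declares a parameter of its own and forwards it into the linked service's connectionString. A sketch (the dataset name and tableName are illustrative):

    {
        "name": "StatsTable",
        "properties": {
            "type": "AzureTable",
            "linkedServiceName": {
                "referenceName": "StatsStorage",
                "type": "LinkedServiceReference",
                "parameters": {
                    "connectionString": {
                        "value": "@dataset().connectionString",
                        "type": "Expression"
                    }
                }
            },
            "parameters": {
                "connectionString": { "type": "String" }
            },
            "typeProperties": {
                "tableName": "stats"
            }
        }
    }

The Copy activity inside the ForEach then sets the dataset's connectionString parameter from @item(), so each iteration binds to a different storage account.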

Error when trying to add a dynamic file name (linked service) in Azure Data Factory V2

依然范特西╮ submitted on 2019-12-24 15:53:05
Question: I am new to Azure Data Factory V2 and blob storage. When I try to add the file connection (linked service) dynamically in a Copy Data activity from blob storage, the following error is encountered while trying to map the columns by importing the schema from the file: "Failed to convert the value in 'container' property to 'System.String' type. Please make sure the payload structure and value are correct." I tried using static parameters and assigning the static parameters to the linked connection.

Answer 1: Please …
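One common cause of that message is the container property receiving a JSON object where a plain string or string expression was expected. As a point of comparison (not from the truncated answer), a blob dataset whose location is fully parameterized would look roughly like this, with every dynamic field wrapped as an Expression; the parameter names are illustrative:

    "typeProperties": {
        "location": {
            "type": "AzureBlobStorageLocation",
            "container": { "value": "@dataset().container", "type": "Expression" },
            "folderPath": { "value": "@dataset().folderPath", "type": "Expression" },
            "fileName": { "value": "@dataset().fileName", "type": "Expression" }
        }
    }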

Set Variable activity Azure Data Factory v2

依然范特西╮ submitted on 2019-12-24 11:35:55
Question: I am trying this new functionality, and when I use a Set Variable activity inside a ForEach loop, I cannot select a variable that I declared on the pipeline. The same happens inside an If Condition activity. Is it supposed to behave like this, i.e. that you can't set a variable inside nested activities, only at the root level of the pipeline?

Answer 1: This is a known bug: the Set Variable and Append Variable activities do not correctly detect pipeline variables when they are nested in another activity. A fix is actively being worked on …