azure-data-factory-2

In DataFactory, what is a good strategy to migrate data into Dynamics365 using Dynamics Web API?

人走茶凉 submitted on 2019-12-02 08:34:19
I need to migrate data to Dynamics 365 using Data Factory. The built-in Dynamics 365 connector is not enough for me, because one of the requirements is to update only those attributes that have been modified since the last migration, not the whole record. The other requirement is that sometimes we have to set values to null in the destination. I believe I can do that by generating a different JSON payload for each record and migrating them through the Web API. I thought about putting these calls in an Azure Function, but I believe Functions are not meant to be used like this, even though with the right pricing plan they can run for longer.
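The delta-update idea described above can be sketched in plain Python: compare the previously migrated snapshot of a record with its current state, emit only the changed attributes, and send explicit nulls for attributes that were cleared. The record shape and field names below are invented examples; in practice the resulting body would be sent as an HTTP PATCH to the Dynamics 365 Web API.

```python
import json

def build_delta_payload(previous: dict, current: dict) -> dict:
    """Build a PATCH body containing only the attributes that changed
    since the last migration. Attributes present before but missing
    now are sent as explicit nulls so they are cleared in the sink."""
    payload = {}
    for key, value in current.items():
        if previous.get(key) != value:
            payload[key] = value
    for key in previous:
        if key not in current:
            payload[key] = None  # null out the value in the destination
    return payload

# Hypothetical record before and after the source system changed it.
before = {"firstname": "Ana", "lastname": "Diaz", "jobtitle": "Dev"}
after = {"firstname": "Ana", "lastname": "Diaz-Lopez"}

print(json.dumps(build_delta_payload(before, after)))
# {"lastname": "Diaz-Lopez", "jobtitle": null}
```

Only the modified attribute and the cleared one appear in the payload, which keeps each Web API call minimal.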

use adf pipeline parameters as source to sink columns while mapping

一笑奈何 submitted on 2019-12-02 04:40:20
I have an ADF pipeline with a copy activity that copies data from a CSV file in Blob Storage to a SQL database, and this works as expected. I also need to record the name of the CSV file (which comes in through pipeline parameters) in the destination table. I'm wondering if there is a way to map parameters to destination columns.

A column name can't use parameters directly. But you can parameterize the whole structure property on the dataset and the columnMappings property on the copy activity. This can be a little tedious, as you will need to write the whole structure array and the columnMappings on your own and pass them in as parameters.
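As a rough sketch of what that parameterization involves, the structure array and columnMappings value could be generated ahead of time and handed to the pipeline as parameters. The column names, types, and the extra sink column for the file name below are made-up examples, and the exact columnMappings syntax expected by your Data Factory version should be verified against the documentation.

```python
import json

def build_dataset_params(columns, filename_column):
    """Generate the dataset `structure` array (as JSON) and a
    copy-activity `columnMappings` string, including one extra
    column that will hold the source file name, so both can be
    passed in as pipeline parameters instead of hard-coded."""
    all_columns = columns + [filename_column]
    structure = [{"name": c, "type": "String"} for c in all_columns]
    column_mappings = ", ".join(f"{c}: {c}" for c in all_columns)
    return json.dumps(structure), column_mappings

structure_json, mappings = build_dataset_params(["Id", "Name"], "SourceFile")
print(mappings)  # Id: Id, Name: Name, SourceFile: SourceFile
```

The file name itself would then be injected into the SourceFile column at runtime, for example from `@pipeline().parameters`.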

Azure data factory | incremental data load from SFTP to Blob

冷暖自知 submitted on 2019-11-30 16:38:18
I created a one-off Data Factory (V2) pipeline to load files (.lta.gz) from an SFTP server into an Azure blob to capture the historical data, and it worked beautifully. Every day there will be several new files on the SFTP server (which cannot be manipulated or deleted), so I want to create an incremental load pipeline that checks daily for new files and, if there are any, copies them. Does anyone have any tips on how to achieve this?

Thanks for using Data Factory! To incrementally load newly generated files on the SFTP server, you can leverage the GetMetadata activity to retrieve the LastModifiedDate property: https:/
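The watermark pattern the answer points toward can be modeled in a few lines of Python. The file dicts and the `lastModified` key below are stand-ins for what GetMetadata returns; in the real pipeline the comparison would happen in ADF expressions or a Filter activity, with the watermark persisted between runs.

```python
from datetime import datetime

def new_files_since(files, watermark):
    """Keep only files modified after the stored watermark and
    advance the watermark, mirroring one incremental-load run."""
    fresh = [f for f in files if f["lastModified"] > watermark]
    new_watermark = max((f["lastModified"] for f in fresh), default=watermark)
    return fresh, new_watermark

files = [
    {"name": "a.lta.gz", "lastModified": datetime(2019, 11, 29)},
    {"name": "b.lta.gz", "lastModified": datetime(2019, 11, 30)},
]
fresh, wm = new_files_since(files, datetime(2019, 11, 29))
print([f["name"] for f in fresh])  # ['b.lta.gz']
```

Files at or before the watermark are skipped, so only genuinely new files are handed to the copy activity.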

Azure Data Factory mapping 2 columns in one column

戏子无情 submitted on 2019-11-27 09:38:45
Can somebody help me solve the error I am getting while concatenating two columns, first name and last name, from my text file and merging them into one Name column in my Azure SQL database sink in Azure Data Factory? Another question: I want to take the first letter of the gender column from the source text file, which holds "male" or "female", and store it as a single letter, M or F respectively, in the gender column via the Azure Data Factory pipeline. Update 1: My table name is [dbo].[Contact], and after applying this procedure I am
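The two transformations being asked about (concatenating the name columns and reducing the gender value to its first letter) can be sketched row by row. The sample values below are invented; in ADF itself this would typically be a Data Flow derived column or a stored procedure on a staging table.

```python
def transform_row(first_name, last_name, gender):
    """Derive the sink values: a single Name column and a
    one-letter gender code (M or F)."""
    name = f"{first_name.strip()} {last_name.strip()}"
    # Take the first letter of "male"/"female", uppercased;
    # fall back to None for empty or missing gender values.
    code = gender.strip()[0].upper() if gender and gender.strip() else None
    return name, code

print(transform_row("John", "Smith", "male"))  # ('John Smith', 'M')
```

The same logic in T-SQL would be a `CONCAT` for the name and `UPPER(LEFT(gender, 1))` for the code.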