azure-data-factory

Azure Data Factory Copy Identity Column With Gaps

Submitted by 删除回忆录丶 on 2019-12-01 22:11:07
I created a pipeline and two linked services to move data from an on-prem instance of SQL Server to an Azure SQL instance. The issue I'm running into is that we have a table "Table-1" on-prem with an IDENTITY(1,1) column that is missing a sequential ID (e.g. the values are 1, 2, 3, 4, 6). When the pipeline runs, the destination's identity column assigns new values and the rows are inserted with the IDs 1, 2, 3, 4, 5, which is a big problem because ID 6 is a foreign key on another table "Table-2" and now it doesn't exist, so the movement of data to Table-2 fails with SQL error 547 (the INSERT statement conflicted with the FOREIGN KEY constraint).
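One commonly suggested workaround (not from the original post) is to push the copy through a stored-procedure sink, so the procedure can switch IDENTITY_INSERT on and insert the source IDs exactly as they are, gaps included. A minimal sketch of the sink section, assuming a table type Table1Type and a procedure dbo.spInsertTable1 already exist on the Azure SQL side:

    "sink": {
        "type": "SqlSink",
        "sqlWriterStoredProcedureName": "dbo.spInsertTable1",
        "sqlWriterTableType": "Table1Type"
    }

Inside dbo.spInsertTable1 the INSERT from the table-valued parameter would be wrapped in SET IDENTITY_INSERT dbo.Table1 ON/OFF, so 1, 2, 3, 4, 6 land in Azure exactly as they were on-prem; copy Table-1 before Table-2 and the 547 error goes away.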

Azure Data Factory V2 Dataset Dynamic Folder

Submitted by 生来就可爱ヽ(ⅴ<●) on 2019-12-01 13:58:35
In Azure Data Factory (V1) I was able to create a slice and store the output to a specific folder (i.e. {Year}/{Month}/{Day}). See the code below. How do you create the same type of slice in Azure Data Factory V2? I did find that you have to create a parameter, but I was unable to figure out how to pass the parameter. "folderPath": "@{dataset().path}", "parameters": { "path": { "type": "String" Here is the original ADF V1 code. { "name": "EMS_EMSActivations_L1_Snapshot", "properties": { "published": false, "type": "AzureDataLakeStore", "linkedServiceName": "SalesIntelligence_ADLS_LS", "typeProperties"
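For what it's worth, a minimal sketch of how the V2 parameter can be wired up (dataset and property names below are placeholders, not from the original post): the dataset declares the parameter and uses it in folderPath, and the activity that references the dataset passes a date expression for it.

    {
        "name": "AdlsOutput_DS",
        "properties": {
            "type": "AzureDataLakeStoreFile",
            "linkedServiceName": { "referenceName": "SalesIntelligence_ADLS_LS", "type": "LinkedServiceReference" },
            "parameters": { "path": { "type": "String" } },
            "typeProperties": { "folderPath": "@{dataset().path}" }
        }
    }

and in the pipeline, on the copy activity's outputs:

    "outputs": [
        {
            "referenceName": "AdlsOutput_DS",
            "type": "DatasetReference",
            "parameters": { "path": "@formatDateTime(pipeline().TriggerTime, 'yyyy/MM/dd')" }
        }
    ]

This reproduces the {Year}/{Month}/{Day} layout of the V1 slice without any slice machinery.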

Execute stored procedure from Azure datafactory

Submitted by 我们两清 on 2019-12-01 11:41:49
I am trying to execute a stored procedure in an Azure SQL database from Azure Data Factory V2. The procedure will do some upserts into different tables with data from a flat table. According to the MS specifications you need a table-valued parameter to do such a thing, but that couples the pipeline activity to the procedure and to all the models. Is there any way to define the dataset and copy activity so it just executes the stored procedure? The JSON below is from the ARM template:
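If the requirement is only to run the procedure (no copy, no table-valued parameter), the V2 Stored Procedure activity avoids the coupling entirely. A rough sketch, with the linked-service and procedure names invented for illustration:

    {
        "name": "RunUpsertProc",
        "type": "SqlServerStoredProcedure",
        "linkedServiceName": { "referenceName": "AzureSqlDatabase_LS", "type": "LinkedServiceReference" },
        "typeProperties": {
            "storedProcedureName": "dbo.UpsertFromFlatTable",
            "storedProcedureParameters": {
                "LoadDate": { "value": "@pipeline().parameters.LoadDate", "type": "String" }
            }
        }
    }

The procedure reads the flat table itself, so nothing about the table models leaks into the pipeline definition.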

Transfer file from Azure Blob Storage to Google Cloud Storage programmatically

Submitted by 安稳与你 on 2019-12-01 08:32:56
I have a number of files that I transferred into Azure Blob Storage via the Azure Data Factory. Unfortunately, this tool doesn't appear to set the Content-MD5 value for any of the blobs, so when I pull that value from the Blob Storage API, it's empty. I'm aiming to transfer these files out of Azure Blob Storage and into Google Cloud Storage. The documentation I'm seeing for Google's Storage Transfer service at https://cloud.google.com/storage/transfer/reference/rest/v1/TransferSpec#HttpData indicates that I can easily initiate such a transfer if I supply a list of the files with their URL, length in bytes, and an MD5 hash of each.
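For reference, the Storage Transfer Service side of this would be a transfer job whose transferSpec points an httpDataSource at a publicly reachable URL list (field names here are from memory and worth double-checking against the TransferSpec reference above):

    "transferSpec": {
        "httpDataSource": { "listUrl": "https://<public-host>/azure-blob-urls.tsv" },
        "gcsDataSink": { "bucketName": "<destination-bucket>" }
    }

The URL list itself is what needs the length and MD5 per file, which is exactly the value Data Factory didn't populate, so the MD5s have to be computed (or the blobs' Content-MD5 back-filled) before the list can be generated.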

How to integrate a WebJob within an Azure Data Factory Pipeline

Submitted by 微笑、不失礼 on 2019-12-01 01:51:18
I'm trying to integrate a WebJob inside an ADF pipeline. The WebJob is a very simple console application:

namespace WebJob4
{
    class ReturnTest
    {
        static double CalculateArea(int r)
        {
            double area = r * r * Math.PI;
            return area;
        }

        static void Main()
        {
            int radius = 5;
            double result = CalculateArea(radius);
            Console.WriteLine("The area is {0:0.00}", result);
        }
    }
}

How do we call this WebJob through an ADF pipeline and store the response code (HTTP 200 in case of success) in Azure Blob storage? Dec 2018 update: if you are thinking of doing this using an Azure Function, Azure Data Factory now provides an Azure Function activity natively.
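One way to do it without an Azure Function is the V2 Web activity pointed at the Kudu endpoint for a triggered WebJob; the activity fails on a non-success status, which downstream activities can react to. A hedged sketch (app name, WebJob name, and credentials are placeholders):

    {
        "name": "InvokeWebJob4",
        "type": "WebActivity",
        "typeProperties": {
            "url": "https://<app-name>.scm.azurewebsites.net/api/triggeredwebjobs/WebJob4/run",
            "method": "POST",
            "body": "{}",
            "authentication": {
                "type": "Basic",
                "username": "<deployment-username>",
                "password": "<deployment-password>"
            }
        }
    }

Persisting the returned status to blob storage would still be a separate step, e.g. a follow-on activity that writes @activity('InvokeWebJob4').output somewhere.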

The subscription is not registered to use namespace 'Microsoft.DataFactory' error

Submitted by 冷暖自知 on 2019-11-30 18:06:29
Going through the tutorial "Create a pipeline with Copy Activity using Visual Studio" and receiving this error when I hit publish: Creating datafactory-Name:VSTutorialFactory,Tags:,Subscription:Pay-As-You-Go,ResourceGroup:MyAppGroup,Location:North Europe, 24/03/2016 11:30:34- Error creating data factory: Microsoft.WindowsAzure.CloudException: MissingSubscriptionRegistration: The subscription is not registered to use namespace 'Microsoft.DataFactory'. The error isn't mentioned anywhere on the net, and there is very little help or knowledge about Azure generally on the web. In Azure, for each functionality there's a resource provider, and it has to be registered with your subscription before you can create resources of that type (registration can be done from the subscription's Resource providers blade in the portal).

Azure data factory | incremental data load from SFTP to Blob

Submitted by 冷暖自知 on 2019-11-30 16:38:18
I created a run-once Data Factory (V2) pipeline to load files (.lta.gz) from an SFTP server into an Azure blob to get historical data. Worked beautifully. Every day there will be several new files on the SFTP server (which cannot be manipulated or deleted), so I want to create an incremental-load pipeline that checks daily for new files and, if there are any, copies them. Does anyone have any tips on how to achieve this? Thanks for using Data Factory! To incrementally load newly generated files on the SFTP server, you can leverage the GetMetadata activity to retrieve the LastModifiedDate property: https:/
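As a rough illustration of that suggestion (activity and dataset names below are made up), a GetMetadata activity over the SFTP folder dataset lists the files, and a per-file check on lastModified decides what to copy:

    {
        "name": "ListSftpFiles",
        "type": "GetMetadata",
        "typeProperties": {
            "dataset": { "referenceName": "SftpFolder_DS", "type": "DatasetReference" },
            "fieldList": [ "childItems" ]
        }
    }

A ForEach over @activity('ListSftpFiles').output.childItems can then run a second GetMetadata with "fieldList": [ "lastModified" ] on each file and trigger the Copy activity only when that date is newer than the watermark stored from the previous run.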