azure-data-factory

Azure Data Factory copy activity failed mapping strings (from CSV) to Azure SQL table sink uniqueidentifier field

倾然丶 夕夏残阳落幕 submitted on 2019-12-20 04:49:55
Question: I have an Azure Data Factory (DF) pipeline that contains a Copy activity. The Copy activity uses the HTTP connector as its source to invoke a REST endpoint, which returns a CSV stream that is sunk into an Azure SQL Database table. The copy fails when the CSV contains strings (such as 40f52caf-e616-4321-8ea3-12ea3cbc54e9) that are mapped to a uniqueidentifier field in the target table, with the error message: The given value of type String from the data source cannot be converted to type uniqueidentifier of the specified…
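
A hedged sketch of one possible workaround, not taken from this truncated thread: Data Factory lists Guid among its interim data types, so declaring the column's type in the source dataset's structure can make the copy activity parse the string before it reaches the uniqueidentifier sink column. The dataset and column names below are hypothetical, and the location/linked-service details are omitted for brevity.

    {
        "name": "CsvFromHttpDataset",
        "properties": {
            "type": "DelimitedText",
            "structure": [
                { "name": "Id", "type": "Guid" },
                { "name": "Name", "type": "String" }
            ]
        }
    }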

Use ADF pipeline parameters as source-to-sink columns while mapping

与世无争的帅哥 submitted on 2019-12-20 03:52:13
Question: I have an ADF pipeline with a copy activity; I'm copying data from a blob storage CSV file to a SQL database, and this works as expected. I need to map the name of the CSV file (which comes from pipeline parameters) and save it in the destination table. I'm wondering if there is a way to map parameters to destination columns. Answer 1: Column names can't use parameters directly, but you can parameterize the whole structure property in the dataset and the columnMappings property in the copy activity (see the sketch below). This might be…
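
A hedged sketch of what the answer describes, assuming ADF v2 dynamic-content syntax; the activity and parameter names are hypothetical. The whole translator mapping is supplied at runtime, since individual column names cannot reference parameters.

    {
        "name": "CopyCsvToSql",
        "type": "Copy",
        "typeProperties": {
            "source": { "type": "BlobSource" },
            "sink": { "type": "SqlSink" },
            "translator": {
                "type": "TabularTranslator",
                "columnMappings": {
                    "value": "@pipeline().parameters.columnMapping",
                    "type": "Expression"
                }
            }
        }
    }

At trigger time the columnMapping parameter would carry the full source-to-sink mapping (e.g. a "SourceCol: SinkCol, ..." string), which is how a per-run value such as a file name can influence the mapping.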

No Pipeline Diagrams in Azure Data Factory

荒凉一梦 submitted on 2019-12-20 03:19:48
Question: I have created 2 pipelines using the Copy Data wizard, but neither shows up when I click on the Diagram action. I get the message "This factory contains no pipelines or datasets". The pipelines both run successfully, and do show up when clicking on the Pipelines blade. Both pipelines were configured as one-time. Any ideas on why this is happening and how to fix it? Answer 1: The Monitor & Manage view for Azure Data Factory does not currently show the diagram for "run once" (aka oneTime or once-only)…

Azure Data Factory Copy Identity Column With Gaps

可紊 submitted on 2019-12-20 02:48:11
Question: I created a pipeline and two linked services to move data from an on-prem instance of SQL Server to an Azure SQL instance. The issue I'm running into is that we have a table "Table-1" on-prem with an Identity (1,1) column that is missing a sequential ID (e.g. the values are 1, 2, 3, 4, 6). When the pipeline runs, it tries to insert the rows with the IDs 1, 2, 3, 4, 5, which is a big problem because ID 6 is a foreign key in another table "Table-2" and now it doesn't exist, so the…
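
A hedged sketch of one common workaround, not confirmed by this truncated thread: point the copy sink at a stored procedure (created separately on the Azure SQL side) whose body wraps its INSERT in SET IDENTITY_INSERT ON/OFF and copies the ID values from the table-type parameter verbatim, gaps included. The procedure, table-type, and parameter names below are hypothetical.

    "typeProperties": {
        "source": { "type": "SqlSource" },
        "sink": {
            "type": "SqlSink",
            "sqlWriterStoredProcedureName": "spInsertTable1",
            "sqlWriterTableType": "Table1Type",
            "storedProcedureTableTypeParameterName": "Table1"
        }
    }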

Azure Data Factory | Incremental data load from SFTP to Blob

帅比萌擦擦* submitted on 2019-12-18 18:30:33
Question: I created a run-once DF (V2) pipeline to load files (.lta.gz) from an SFTP server into an Azure blob to get the historical data. It worked beautifully. Every day there will be several new files on the SFTP server (which cannot be manipulated or deleted), so I want to create an incremental-load pipeline that checks daily for new files and, if there are any, copies them. Does anyone have any tips on how to achieve this? Answer 1: Thanks for using Data Factory! To incrementally load newly generated files on…
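
A hedged sketch of one way to do this in DF v2, assuming a daily schedule or tumbling-window trigger passes its window boundaries in as pipeline parameters (windowStart/windowEnd are hypothetical names); only files whose last-modified time falls inside the window get copied. The dataset and sink details are abbreviated.

    {
        "name": "CopyNewSftpFiles",
        "type": "Copy",
        "typeProperties": {
            "source": {
                "type": "BinarySource",
                "storeSettings": {
                    "type": "SftpReadSettings",
                    "recursive": true,
                    "modifiedDatetimeStart": {
                        "value": "@pipeline().parameters.windowStart",
                        "type": "Expression"
                    },
                    "modifiedDatetimeEnd": {
                        "value": "@pipeline().parameters.windowEnd",
                        "type": "Expression"
                    }
                }
            },
            "sink": { "type": "BinarySink" }
        }
    }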

Error while running U-SQL Activity in Pipeline in Azure Data Factory

*爱你&永不变心* submitted on 2019-12-18 05:12:30
Question: I am getting the following error while running a U-SQL activity in a pipeline in ADF:

Error in Activity (errorId: E_CSC_USER_SYNTAXERROR, severity: Error, component: CSC, source: USER):
    message: syntax error. Final statement did not end with a semicolon
    details: at token 'txt', line 3, near the ###:
        DECLARE @in string = "/demo/SearchLog.txt";
        DECLARE @out string = "/scripts/Result.txt";
        SearchLogProcessing.txt ###
    description: Invalid syntax…
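
Judging by the details, the script file name (SearchLogProcessing.txt) ended up inside the script body as a third, semicolon-less statement. For reference, a hedged sketch of a DF v2 U-SQL activity that references the script by path instead of embedding it; the linked-service names and script path are hypothetical.

    {
        "name": "RunSearchLogProcessing",
        "type": "DataLakeAnalyticsU-SQL",
        "linkedServiceName": {
            "referenceName": "AdlaLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "scriptPath": "scripts/SearchLogProcessing.txt",
            "scriptLinkedService": {
                "referenceName": "StorageLinkedService",
                "type": "LinkedServiceReference"
            }
        }
    }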

Using Azure Data Factory to get data from a REST API

微笑、不失礼 submitted on 2019-12-18 03:41:29
Question: Is it possible to use Azure Data Factory to get data from a REST API and insert it into an Azure database table? Answer 1: Data Factory offers a generic HTTP connector and a specific REST connector, allowing you to retrieve data from HTTP endpoints using GET or POST methods. Example HTTP linked service:

    {
        "name": "HttpLinkedService",
        "properties": {
            "type": "Http",
            "typeProperties": {
                "authenticationType": "Anonymous",
                "url": "https://en.wikipedia.org/wiki/"
            }
        }
    }

Answer 2: I have done this using…
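
Not from the thread, but for symmetry with the HTTP example above, a hedged sketch of what the REST-connector counterpart might look like; the URL is a placeholder.

    {
        "name": "RestLinkedService",
        "properties": {
            "type": "RestService",
            "typeProperties": {
                "url": "https://api.example.com/",
                "enableServerCertificateValidation": true,
                "authenticationType": "Anonymous"
            }
        }
    }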

Add file name as column in Data Factory pipeline destination

我与影子孤独终老i submitted on 2019-12-17 20:40:32
Question: I am new to DF. I am loading a bunch of CSV files into a table, and I would like to capture the name of each CSV file as a new column in the destination table. Can someone please help with how I can achieve this? Thanks in advance. Answer 1: If your destination is Azure Table storage, you could put your file name into the partition key column. Otherwise, I think there is no native way to do this with ADF; you may need a custom activity or a stored procedure. Answer 2: A post said you could use Databricks to handle this…
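
As a hedged footnote beyond the answers above: later versions of the copy activity source expose an additionalColumns list with a reserved $$FILEPATH variable that stamps each row with its source file path. A sketch, with a hypothetical column name and an abbreviated sink:

    "typeProperties": {
        "source": {
            "type": "DelimitedTextSource",
            "additionalColumns": [
                { "name": "SourceFileName", "value": "$$FILEPATH" }
            ]
        },
        "sink": { "type": "SqlSink" }
    }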

How to copy files from SharePoint into Blob storage with Azure Data Factory v2 using an OData linked service

邮差的信 submitted on 2019-12-13 23:42:25
Question: Can anyone help me understand the procedure for copying an Excel file from SharePoint to Azure Blob storage through Azure Data Factory pipelines? I am struggling while creating the OData linked service. What is the service URL in the OData linked service? I am using the REST APIs provided in the link here as the service URL: https://xxxxx.sharepoint.com/sites/xxx/_api/web/ with authentication type: basic. When I test the connection, it outputs a weird error. I have tried the following articles so far: https://docs.microsoft…
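
A hedged sketch of an OData linked service for SharePoint Online, following the pattern in Microsoft's SharePoint Online List documentation; the angle-bracket values are placeholders, and the service URL is the site's _api root rather than a deeper REST path. Note also that the OData connector surfaces SharePoint list data rather than file contents, so copying the Excel file itself likely needs a different approach (e.g. an HTTP-based download).

    {
        "name": "SharePointODataLinkedService",
        "properties": {
            "type": "OData",
            "typeProperties": {
                "url": "https://xxxxx.sharepoint.com/sites/xxx/_api",
                "authenticationType": "AadServicePrincipal",
                "servicePrincipalId": "<application (client) id>",
                "servicePrincipalKey": {
                    "type": "SecureString",
                    "value": "<application key>"
                },
                "tenant": "<tenant>.onmicrosoft.com",
                "aadResourceId": "https://xxxxx.sharepoint.com"
            }
        }
    }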