azure-data-factory

Azure Data Factory: get data for “For Each” component from query

Submitted by 人走茶凉 on 2019-12-24 17:18:15
Question: The situation is as follows: I have a table in my database that receives about 3 million rows each day. We want to archive this table on a regular basis, so that only the 8 most recent weeks remain in the table. The rest of the data can be archived to Azure Data Lake. I already found out how to do this one day at a time. But now I want to run this pipeline each week for the first seven days in the table. I assume I should do this with the "For Each" component. It should iterate along the
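A minimal sketch of the iteration the question describes, in plain Python rather than ADF expression syntax (the start date is an assumption for illustration): given the oldest day still in the table, build the list of seven dates that the ForEach items would be generated from.

```python
from datetime import date, timedelta

def oldest_week(first_day: date) -> list:
    """Return the 7 consecutive days (ISO strings) starting at the oldest day in the table."""
    return [(first_day + timedelta(days=i)).isoformat() for i in range(7)]

# If the oldest row were dated 2019-10-01, these would be the ForEach items:
print(oldest_week(date(2019, 10, 1)))
```

In ADF itself the equivalent would typically be built with `@range(0, 7)` and a date-formatting expression on each item.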

How to schedule a pipeline in Azure Data Factory

Submitted by 会有一股神秘感。 on 2019-12-24 16:22:05
Question: I followed the Microsoft tutorials and created a pipeline to move data from an on-premises SQL table to Azure Blob, and then another pipeline to move the blob data to an Azure SQL table. According to the documentation, I need to specify the active period (start time and end time) for the pipeline, and everything runs well. The question is: what can I do if I want the pipeline to be activated every 3 hours, until I manually stop the operation? Currently I need to change the start time and end time
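In ADF v1 (which this question's start/end "active period" implies), the recurrence comes from the dataset availability frequency and interval rather than from the pipeline itself; the slot arithmetic is simple. A plain-Python sketch of how 3-hour slots divide an active period (dates are illustrative, not from the question):

```python
from datetime import datetime, timedelta

def schedule_slots(start: datetime, end: datetime, every_hours: int = 3):
    """Yield slot start times from `start` up to (but excluding) `end`."""
    t = start
    while t < end:
        yield t
        t += timedelta(hours=every_hours)

slots = list(schedule_slots(datetime(2019, 12, 24, 0), datetime(2019, 12, 24, 12)))
print([s.strftime("%H:%M") for s in slots])  # ['00:00', '03:00', '06:00', '09:00']
```

Setting the end time far in the future effectively keeps the pipeline running until it is manually stopped.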

Azure Data Factory Copy Data dynamically get last blob

Submitted by 岁酱吖の on 2019-12-24 11:02:02
Question: I have an Azure Data Factory pipeline that runs on a Blob Created trigger, and I want it to grab the last blob added and copy it to the desired location. How do I dynamically generate the file path for this outcome? System Variables, Expressions and Functions. Answer 1: "@triggerBody().folderPath" and "@triggerBody().fileName" capture the path of the blob that fired the event trigger. You need to map your pipeline parameters to these two trigger properties. Please follow this link to do the

SQL Server complains about invalid json

Submitted by 走远了吗. on 2019-12-24 10:38:30
Question: I am writing an ETL tool using Azure Data Factory and Azure SQL Database. The Data Factory captures the output of a Mapping Data Flow and inserts it into the StatusMessage column of a SQL Server table (Audit.OperationsEventLog) as a string. The StatusMessage column is varchar(8000) and is intended to store data formatted as valid JSON.

SELECT * FROM Audit.OperationsEventLog lg CROSS APPLY OPENJSON(lg.StatusMessage) dt

When I query the JSON string from the table using the query above, it
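One way to catch bad rows before they reach StatusMessage is to validate each string the same way OPENJSON will have to parse it. A rough Python approximation of that check (note: `json.loads` also accepts bare scalars like `42`, which SQL Server's ISJSON would reject, so this is only a sketch of the idea):

```python
import json

def is_valid_json(status_message: str) -> bool:
    """Return True if the string parses as JSON; OPENJSON fails on anything else."""
    try:
        json.loads(status_message)
        return True
    except ValueError:
        return False

print(is_valid_json('{"rows": 42}'))   # well-formed object
print(is_valid_json("{'rows': 42}"))   # single quotes are not valid JSON
```

On the SQL side, `WHERE ISJSON(lg.StatusMessage) = 1` in the query's predicate can likewise skip the offending rows.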

On-premises SQL connection throwing SqlException in Data Factory custom activity

Submitted by 自古美人都是妖i on 2019-12-24 09:29:20
Question: I have added code for an Azure Data Factory custom activity to Azure Batch and pointed the Data Factory pipeline at the batch service. When I execute the code in my local environment, it works fine. But when I upload it to run in Azure Batch, it throws a SqlException: System.Data.SqlClient.SqlException: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is

How do we get Data Factory logging information?

Submitted by 你说的曾经没有我的故事 on 2019-12-24 08:29:01
Question: How do we get Data Factory logging information? Has Microsoft documented this anywhere? I need complete information when I run a pipeline, i.e. start time, end time, pipeline job ID, number of records inserted, deleted, and updated, errors, etc. Answer 1: ADF doesn't currently write to the Azure Activity Logs, meaning you can't access details using Azure Monitor. Currently the best way I have found to get this information is using PowerShell. For example: Get-AzureRmDataFactoryActivityWindow ` -DataFactoryName $ADFName

How to copy data in Azure Data Factory depending on the values?

Submitted by ⅰ亾dé卋堺 on 2019-12-24 06:07:51
Question: My problem is this: for example, I have a table with three columns in SQL Server, table1(id, number1, number2), and another table, table2(id, finalNumber). How can I do a conditional copy? I want to copy the bigger number of each row into table2(finalNumber). I tried a Lookup -> If Condition, but it doesn't work. Answer 1: From the example in the If Condition activity documentation, it can be used to reference the output data from Lookup activities, like @{activity('MyLookupActivity').output.firstRow.count}. You
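The answer above is truncated, but the row-wise transformation being asked for is just "keep the larger of the two numbers". A plain-Python sketch of that logic (table shapes taken from the question; the sample values are made up):

```python
def final_numbers(table1):
    """For each (id, number1, number2) row, produce (id, larger number) for table2."""
    return [(row_id, max(n1, n2)) for row_id, n1, n2 in table1]

table1 = [(1, 10, 7), (2, 3, 9)]
print(final_numbers(table1))  # [(1, 10), (2, 9)]
```

In SQL this would be a single `CASE WHEN number1 > number2 THEN number1 ELSE number2 END` in the copy source query, which avoids the Lookup/If Condition detour entirely.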

Azure Data Factory .NET SDK activity metrics

Submitted by 自古美人都是妖i on 2019-12-24 05:40:30
Question: Does anyone know how to (or whether it is possible to) access the metrics for each activity run shown under "Details" in the Azure Portal? The initial plan was to use the .NET SDK, but none of these metrics seem to be included. This is what I have managed to find so far:

var datasliceRunListResponse = client.DataSliceRuns.List( _resourceGroupName, dataFactoryName, Dataset_Destination, new DataSliceRunListParameters() { DataSliceStartTime = PipelineActivePeriodStartTime

How can I exclude rows in a Copy Data Activity in Azure Data Factory?

Submitted by 风流意气都作罢 on 2019-12-24 02:15:30
Question: I have built a pipeline with one Copy Data activity which copies data from an Azure Data Lake and outputs it to Azure Blob Storage. In the output, I can see that some of my rows do not have data, and I would like to exclude them from the copy. In the following example, the 2nd row does not have useful data: {"TenantId":"qa","Timestamp":"2019-03-06T10:53:51.634Z","PrincipalId":2,"ControlId":"729c3b6e-0442-4884-936c-c36c9b466e9d","ZoneInternalId":0,"IsAuthorized":true,"PrincipalName":"John",
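The entry is truncated before the answer, but one pre-filtering approach is to define "useful" rows by a set of required fields and drop everything else before (or instead of) the copy. A Python sketch (the choice of required field names here is an assumption, not from the original):

```python
import json

def has_useful_data(line: str, required=("PrincipalId", "ControlId")) -> bool:
    """Treat a JSON row as useful only if every required field is present and non-empty."""
    row = json.loads(line)
    return all(row.get(field) not in (None, "") for field in required)

rows = [
    '{"TenantId":"qa","PrincipalId":2,"ControlId":"729c3b6e","IsAuthorized":true}',
    '{"TenantId":"qa","PrincipalId":null,"ControlId":""}',
]
print([r for r in rows if has_useful_data(r)])  # keeps only the first row
```

Inside Data Factory itself, the same idea maps to a Filter condition in a Mapping Data Flow rather than a plain Copy Data activity.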