azure-data-factory

Run U-SQL Script from C# code with Azure Data Factory

霸气de小男生 submitted on 2019-12-12 03:56:47
Question: I am trying to run a U-SQL script on Azure from C# code. Everything (ADF, linked services, pipelines, datasets) is created on Azure after the code executes, but the U-SQL script is never run by ADF. I think there is an issue with the startTime and endTime configured in the pipeline code. I followed this article to build the console application: Create, monitor, and manage Azure data factories using Data Factory .NET SDK. Here is the URL of my complete C# project for download: https://1drv
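A common cause of this symptom in ADF v1 is that the pipeline's active period does not cover any slice of the output dataset, so nothing is ever scheduled. As a minimal sketch (all names and dates here are placeholder assumptions, not taken from the question), the pipeline definition the SDK submits needs start/end properties covering the intended run window:

```json
{
    "name": "USqlPipeline",
    "properties": {
        "activities": [ "... the U-SQL activity goes here ..." ],
        "start": "2019-12-01T00:00:00Z",
        "end": "2019-12-02T00:00:00Z",
        "isPaused": false
    }
}
```

If the window lies entirely in the future, or isPaused is true, the pipeline deploys successfully but no activity run is produced, which matches the behavior described.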

Not able to use SSIS package to Sync Data from MySql to AzureSql using ADF-V2

。_饼干妹妹 submitted on 2019-12-11 18:17:14
Question: Since we can run SSIS packages on ADF v2, I prepared an SSIS package to sync records between MySQL and Azure SQL, but I get the error below in the reports. I tried both ODBC and ADO.NET connections and get the same result when executing from the SSIS catalog. I am able to sync records when I execute the package from SSDT. Question: Can't we use SSIS in ADF v2 with anything other than the Azure cloud connectors? Please also suggest any steps I might be missing. Reference links used to implement the same: Link1

Upload ADF json files to my Data Factory

ぃ、小莉子 submitted on 2019-12-11 17:53:07
Question: I have a number of pipeline/linked service/dataset JSON files and I need to upload them to my Data Factory, as opposed to creating new versions and copying the text over. What's the simplest way to do this? Answer 1: If you are using version 1, you can use Visual Studio, as shown here: https://azure.microsoft.com/en-us/blog/azure-data-factory-visual-studio-extension-for-authoring-pipelines/ If you are using version 2, you can do this using PowerShell. First download and install the Azure SDK for
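For version 2, the Set-* cmdlets in the AzureRM.DataFactoryV2 module accept a definition file directly, so each JSON file can be pushed as-is. A minimal sketch; the resource group, factory name, object names, and file paths are placeholder assumptions:

```powershell
Login-AzureRmAccount

# Deploy each JSON definition file to the factory (placeholder names throughout)
Set-AzureRmDataFactoryV2LinkedService -ResourceGroupName "MyRG" -DataFactoryName "MyADF" `
    -Name "MyLinkedService" -DefinitionFile ".\linkedservice.json"
Set-AzureRmDataFactoryV2Dataset -ResourceGroupName "MyRG" -DataFactoryName "MyADF" `
    -Name "MyDataset" -DefinitionFile ".\dataset.json"
Set-AzureRmDataFactoryV2Pipeline -ResourceGroupName "MyRG" -DataFactoryName "MyADF" `
    -Name "MyPipeline" -DefinitionFile ".\pipeline.json"
```

Deploy in this order (linked services, then datasets, then pipelines), since each object references the one before it.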

azure data factory: how to merge all files of a folder into one file

被刻印的时光 ゝ submitted on 2019-12-11 17:38:56
Question: I need to create one big file by merging multiple files scattered across several subfolders in Azure Blob Storage. A transformation also needs to be done: each file contains a JSON array with a single element, so the final file will contain an array of all the JSON elements. The final purpose is to process that big file in a Hadoop MapReduce job. The layout of the original files is similar to this:

folder
  - month-01
    - day-01
      - files...
  - month-02
    - day-02
      - files...

Answer 1: I did a test based
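Once the blobs are available locally (downloaded or mounted), the merge-and-rewrap step itself is small; a hedged local sketch in PowerShell, assuming each file really holds a one-element JSON array and that ".\folder" and ".\merged.json" are placeholder paths:

```powershell
# Walk all subfolders (month-*/day-*) and take the single element from each file
$elements = Get-ChildItem -Path ".\folder" -Recurse -Filter "*.json" |
    ForEach-Object {
        # Each source file contains a JSON array with exactly one element
        (Get-Content -Path $_.FullName -Raw | ConvertFrom-Json)[0]
    }

# Emit one big file containing an array of all the elements
ConvertTo-Json -InputObject @($elements) -Depth 10 | Set-Content -Path ".\merged.json"
```

Inside ADF itself, a Copy activity with copyBehavior set to MergeFiles can concatenate the blobs, but it does not rewrap them into a single JSON array, so the transformation step still has to happen somewhere.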

Azure Data Factory V1

点点圈 submitted on 2019-12-11 17:17:48
Question: Is it possible to trigger a pipeline in ADF v1 using a PowerShell script? I found the command Resume-AzureRmDataFactoryPipeline, but it does not actually start the pipeline. Please advise. Answer 1: It really depends on what your pipeline does, but an alternative method is setting the status of a slice to Waiting, with the following PowerShell cmdlet:

```powershell
$StartDateTime = (Get-Date).AddDays(-7)
$ResourceGroupName = "YourRGName"
$DSName = "YourDatasetName"
$DataFactoryV1Name =
```
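For reference, the ADF v1 cmdlet that performs this rerun trick is Set-AzureRmDataFactorySliceStatus. A hedged sketch of the full pattern, with every value a placeholder (not the asker's real names):

```powershell
$StartDateTime = (Get-Date).AddDays(-7)

# Mark the slices in the window as Waiting so ADF v1 schedules them again
Set-AzureRmDataFactorySliceStatus -ResourceGroupName "YourRGName" `
    -DataFactoryName "YourADFName" -DatasetName "YourDatasetName" `
    -StartDateTime $StartDateTime -EndDateTime (Get-Date) `
    -Status "Waiting" -UpdateType "UpstreamInPipeline"
```

Setting -UpdateType to "UpstreamInPipeline" also resets the upstream slices, which is usually what is wanted when re-triggering an end-to-end run.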

Azure Data Factory On-Premises Copy Error

空扰寡人 submitted on 2019-12-11 16:54:43
Question: I am trying to schedule an on-premises copy job from a SQL Server. However, I am getting the following error when trying to enter the SQL Server credentials: Type=Microsoft.Data.Mashup.InternalMashupException, Message=..sorry, an error occurred during evaluation., Source=Microsoft.Data.Mashup, Type=Microsoft.Mas.. The data protection operation was unsuccessful. This may have been caused by not having the user profile loaded for the current thread's user context, which may be the case when

Multiple failed dependencies in Azure Data Factory activity 'dependsOn'

牧云@^-^@ submitted on 2019-12-11 16:44:37
Question: When there are multiple activity dependency ("dependsOn") conditions on an Azure Data Factory control activity, do they all need to be true for the activity to run? For example, if a clean-up activity should run when any other activity fails, there can be several dependencies with a "dependencyCondition" of "Failed":

```json
"dependsOn": [
    { "activity": "FirstActivity",  "dependencyConditions": [ "Failed" ] },
    { "activity": "SecondActivity", "dependencyConditions": [ "Failed" ] }
]
```

When there are

Does Incremental Sync with Azure Data Factory V2 support only Sql Server to Azure SQL

老子叫甜甜 submitted on 2019-12-11 16:04:02
Question: I was trying to design an incremental sync of data between MySQL and Azure SQL, following the referenced article, and while designing the pipeline for the new watermark I found that the Lookup component only supports SQL Server. Question: Is there a way to sync incrementally to the cloud from hosted MySQL to Azure SQL using ADF v1/v2? What other component can we use to select data from MySQL? I tried Stored Procedure, but it supports SQL Server only. Answer 1: We (the ADF team) are actively working on expanding the Lookup activity to
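Until Lookup covers more stores, one workaround is to push the watermark filter into the Copy activity's source query against MySQL, keeping the watermark itself in Azure SQL where Lookup already works. A hedged fragment; the source type name, table, and column names are assumptions and may differ by connector version:

```json
"source": {
    "type": "MySqlSource",
    "query": "SELECT * FROM source_table WHERE last_modify_time > '@{pipeline().parameters.OldWatermark}'"
}
```

The pipeline parameter is then populated from the watermark table before the copy runs, mirroring the pattern in the tutorial the question references.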

Copy output data from Custom Activity Powershell script

会有一股神秘感。 submitted on 2019-12-11 15:55:06
Question: I have created a Custom Activity on ADF v2 that runs a PowerShell script via the command "powershell .\script.ps1". Currently, the output of the script is saved to "StaName/adfjobs/activityRunId/stdout.txt", but I need to store the file in another container of the same storage account, for example "StaName/outputs/stdout.txt". What's the best way to do this? Create a Copy activity to copy the file, or is there a method to send data directly to "StaName/outputs/" through the
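One option that avoids an extra pipeline activity is to have the script itself (or a follow-up script step) copy the blob server-side with the Azure.Storage cmdlets. A sketch, where the account key variable and run ID are placeholders supplied by the caller:

```powershell
# Placeholder credentials; in practice pass these in via the custom activity's configuration
$ctx = New-AzureStorageContext -StorageAccountName "StaName" -StorageAccountKey $storageKey

# Server-side copy from the adfjobs container to the outputs container
Start-AzureStorageBlobCopy -Context $ctx -SrcContainer "adfjobs" `
    -SrcBlob "$activityRunId/stdout.txt" -DestContainer "outputs" -DestBlob "stdout.txt"
```

A Copy activity works too; the trade-off is an extra activity run per pipeline versus embedding storage credentials where the script can reach them.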

How to cancel pipeline — No cancel button is available

爷,独闯天下 submitted on 2019-12-11 15:49:07
Question: I have a pipeline running (8735cc10-80db-4401-8f9e-516d733b450e). The Activity Runs page shows the pipeline as running, but the Pipeline Runs page shows a status of Failed. Is there a way to cancel this pipeline run from the UI? Answer 1: The easiest way to do this is from PowerShell. Download the Azure SDK for PowerShell from here: https://azure.microsoft.com/en-us/downloads/ then run this:

```powershell
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionName "SubscName"
Stop
```
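For completeness, a hedged sketch of the ADF v2 cancel cmdlet, using the run ID from the question but placeholder resource group and factory names:

```powershell
# Cancel the specific pipeline run by its run ID (resource names are placeholders)
Stop-AzureRmDataFactoryV2PipelineRun -ResourceGroupName "MyRG" `
    -DataFactoryName "MyADF" -PipelineRunId "8735cc10-80db-4401-8f9e-516d733b450e"
```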