azure-data-factory-2

How to export a pipeline in Data Factory v2 or migrate it to another factory

馋奶兔 submitted on 2019-12-07 10:04:00
Question: I'm trying to export a pipeline created in Data Factory v2, or migrate it to another factory, but I can't find the option. Could you help me please?

Answer 1: As far as I know, you could look into continuous integration in Azure Data Factory. You can find the following statement in Continuous integration and deployment in Azure Data Factory: "For Azure Data Factory, continuous integration & deployment means moving Data Factory pipelines from one environment (development, test, production) to another." To do continuous…
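The answer is cut off above, but the practical point is that a Data Factory v2 pipeline is just a JSON resource: you can open it in the authoring UI, switch to the Code view, and copy that JSON into another factory (or into a Git repository for the CI/CD flow the answer references). Below is a minimal, hedged sketch of what such a pipeline definition looks like; the names and the single Copy activity are placeholders, not taken from the post.

```json
{
    "name": "MyPipeline",
    "properties": {
        "description": "Placeholder pipeline; copy this JSON (plus its datasets and linked services) to another factory or into source control",
        "activities": [
            {
                "name": "CopyBlobToSql",
                "type": "Copy",
                "inputs": [ { "referenceName": "SourceBlobDataset", "type": "DatasetReference" } ],
                "outputs": [ { "referenceName": "SinkSqlDataset", "type": "DatasetReference" } ],
                "typeProperties": {
                    "source": { "type": "BlobSource" },
                    "sink": { "type": "SqlSink" }
                }
            }
        ]
    }
}
```

Note that the pipeline JSON alone is not enough in the target factory; the referenced datasets and linked services have to be recreated there as well, or deployed together via the factory's ARM template.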

How to read files with .xlsx and .xls extensions in Azure Data Factory?

走远了吗. submitted on 2019-12-07 04:44:15
Question: I am trying to read an Excel file with the .xlsx extension in Azure Blob Storage through my Azure Data Factory dataset. It throws the following error: Error found when processing 'Csv/Tsv Format Text' source 'Filename.xlsx' with row number 3: found more columns than expected column count: 1. What are the right column and row delimiters for Excel files to be read in Azure Data Factory?

Answer 1: Excel files have a proprietary format and are not simple delimited files. As indicated here, Azure Data Factory does…

Enumerate blob names in Azure Data Factory v2

眉间皱痕 submitted on 2019-12-06 15:10:22
Question: I need to enumerate all the blob names that sit in an Azure Blob container and dump the list to a file in another blob storage. The part that I cannot master is the enumeration. Thanks.

Answer: The Get Metadata activity is what you want: https://docs.microsoft.com/en-us/azure/data-factory/control-flow-get-metadata-activity Use childItems to get all the files, and then use a ForEach activity to iterate over the childItems. Inside the ForEach activity, you may want to check whether each item is a file; you could use an If Condition activity and the following expression. Then in the "If true" activities, assuming you want to copy data,…
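A hedged sketch of the pattern described above, with placeholder dataset and activity names that are not from the original post: Get Metadata requests childItems, a ForEach iterates over them, and an If Condition keeps only the items whose type is File.

```json
{
    "name": "EnumerateBlobsPipeline",
    "properties": {
        "activities": [
            {
                "name": "GetFileList",
                "type": "GetMetadata",
                "typeProperties": {
                    "dataset": { "referenceName": "SourceBlobFolder", "type": "DatasetReference" },
                    "fieldList": [ "childItems" ]
                }
            },
            {
                "name": "ForEachBlob",
                "type": "ForEach",
                "dependsOn": [ { "activity": "GetFileList", "dependencyConditions": [ "Succeeded" ] } ],
                "typeProperties": {
                    "items": { "value": "@activity('GetFileList').output.childItems", "type": "Expression" },
                    "activities": [
                        {
                            "name": "IfItemIsFile",
                            "type": "IfCondition",
                            "typeProperties": {
                                "expression": { "value": "@equals(item().type, 'File')", "type": "Expression" },
                                "ifTrueActivities": [ ]
                            }
                        }
                    ]
                }
            }
        ]
    }
}
```

Each element of childItems exposes a name and a type (File or Folder), so @{item().name} gives the blob name inside the loop; the empty ifTrueActivities array is where the copy or append-to-file logic would go.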

Copy Data From Azure Blob Storage to AWS S3

霸气de小男生 submitted on 2019-12-06 02:25:38
Question: I am new to Azure Data Factory and have an interesting requirement. I need to move files from Azure Blob Storage to Amazon S3, ideally using Azure Data Factory. However, S3 isn't supported as a sink: https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-overview I also understand from a variety of comments I've read on here that you cannot directly copy from Blob Storage to S3; you would need to download the file locally and then upload it to S3. Does anyone know of any examples, in Data Factory, SSIS or an Azure Runbook, that can do such a thing? I suppose an option would be to write…

How to read files with .xlsx and .xls extensions in Azure Data Factory?

与世无争的帅哥 submitted on 2019-12-05 08:48:52
Question: I am trying to read an Excel file with the .xlsx extension in Azure Blob Storage through my Azure Data Factory dataset. It throws the following error: Error found when processing 'Csv/Tsv Format Text' source 'Filename.xlsx' with row number 3: found more columns than expected column count: 1. What are the right column and row delimiters for Excel files to be read in Azure Data Factory?

Answer: Excel files have a proprietary format and are not simple delimited files. As indicated here, Azure Data Factory does not have a direct option to import Excel files, e.g. you cannot create a Linked Service to an Excel file and…

Data Factory v2 - Generate a json file per row

允我心安 submitted on 2019-12-02 18:11:48
Question: I'm using Data Factory v2. I have a copy activity that has an Azure SQL dataset as input and an Azure Storage Blob as output. I want to write each row in my SQL dataset as a separate blob, but I don't see how I can do this. I see a copyBehavior setting in the copy activity, but that only works from a file-based source. Another possible setting is the filePattern in my dataset: "Indicate the pattern of data stored in each JSON file. Allowed values are: setOfObjects and arrayOfObjects." setOfObjects -…
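For reference, the filePattern setting quoted above belongs to the JSON-format blob dataset. A hedged sketch with placeholder names (this only shows where the setting lives; it is not a solution to the one-blob-per-row requirement in the question):

```json
{
    "name": "OutputJsonBlob",
    "properties": {
        "type": "AzureBlob",
        "linkedServiceName": { "referenceName": "AzureStorageLinkedService", "type": "LinkedServiceReference" },
        "typeProperties": {
            "folderPath": "output/",
            "format": {
                "type": "JsonFormat",
                "filePattern": "setOfObjects"
            }
        }
    }
}
```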

Azure Data Factory select property “status”: “Succeeded” from previous activity

旧时模样 submitted on 2019-12-02 17:56:14
Question: With Data Factory V2 I'm trying to implement a data copy flow from one Azure SQL database to another. I would like a conditional If Condition activity that depends on the success of the previous activities executed by the pipeline, but in the expression to be included in the If Condition activity I cannot select the output property "status": "Succeeded". Before the If Condition activity I have two copy data activities. I added an If Condition activity to my flow because the…
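The post is truncated before any answer, but the usual way to express "run this only if the previous copies succeeded" in Data Factory v2 is through dependency conditions on the If Condition activity rather than by reading a status property inside the expression. A hedged sketch with placeholder activity names:

```json
{
    "name": "IfBothCopiesSucceeded",
    "type": "IfCondition",
    "dependsOn": [
        { "activity": "CopyData1", "dependencyConditions": [ "Succeeded" ] },
        { "activity": "CopyData2", "dependencyConditions": [ "Succeeded" ] }
    ],
    "typeProperties": {
        "expression": { "value": "@equals(1, 1)", "type": "Expression" },
        "ifTrueActivities": [ ],
        "ifFalseActivities": [ ]
    }
}
```

With both dependencies set to Succeeded, the If Condition only runs after both copies succeed, so the expression itself can stay trivial. If the raw status is really needed, the Copy activity's run output exposes it under executionDetails (for example @activity('CopyData1').output.executionDetails[0].status), though that is easier to inspect in monitoring than to branch on.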

Modify global parameter ADF pipeline

丶灬走出姿态 submitted on 2019-12-02 14:38:10
Question: How can I modify the value of a global parameter declared in a pipeline of an ADF? Let's say I need to check whether or not a file in ADLS exists. I declare a Boolean global parameter, but according to my logic inside a U-SQL activity I need to modify its value. How can I do that? Thanks!

Answer 1: U-SQL's script parameter model only provides input parameters and no output parameters. If you want to communicate something back, you currently have to do this via a file. E.g., you write the file…
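The answer is truncated, but the pattern it begins to describe is: have the U-SQL script OUTPUT a small result file (for example a single true/false value) to ADLS, then have a later pipeline activity read that file to drive the control flow. A hedged sketch of the reading side using a Lookup activity; the activity, dataset, and file names are placeholders, not from the post:

```json
{
    "name": "ReadFlagFile",
    "type": "Lookup",
    "dependsOn": [ { "activity": "USqlWriteFlag", "dependencyConditions": [ "Succeeded" ] } ],
    "typeProperties": {
        "source": { "type": "AzureDataLakeStoreSource" },
        "dataset": { "referenceName": "FlagFileDataset", "type": "DatasetReference" },
        "firstRowOnly": true
    }
}
```

Downstream activities can then reference @activity('ReadFlagFile').output.firstRow in an expression to branch on the value the U-SQL script wrote.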

Data Factory v2 - Generate a json file per row

感情迁移 submitted on 2019-12-02 10:01:39
Question: I'm using Data Factory v2. I have a copy activity that has an Azure SQL dataset as input and an Azure Storage Blob as output. I want to write each row in my SQL dataset as a separate blob, but I don't see how I can do this. I see a copyBehavior setting in the copy activity, but that only works from a file-based source. Another possible setting is the filePattern in my dataset: "Indicate the pattern of data stored in each JSON file. Allowed values are: setOfObjects and arrayOfObjects. setOfObjects - Each file contains a single object, or line-delimited/concatenated multiple objects." When this option is…

How to provide a connection string dynamically for Azure Table storage/Blob storage in an Azure Data Factory linked service

扶醉桌前 submitted on 2019-12-02 09:23:45
Question: How do I dynamically change the connection string for Table storage or Blob storage in Azure Data Factory? Currently I can only see such an option for database-related datasets. How can I achieve the same for Table or Blob storage?

Answer: I believe this is what you wanted: https://docs.microsoft.com/en-us/azure/data-factory/parameterize-linked-services As the doc mentions, the UI only supports 8 linked service types. For the others, you can edit the JSON code directly, following the same pattern: { "name": "AzureBlobStorage12", "type": "Microsoft.DataFactory/factories/linkedservices", "properties": { "parameters": { "accountName": {…
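The JSON above is cut off. For reference, here is a complete hedged sketch of a Blob Storage linked service parameterized on the account name, following the pattern in the linked doc; the names are placeholders, and it assumes managed-identity authentication so that no account key appears in the definition (with key authentication you would parameterize the connectionString instead):

```json
{
    "name": "AzureBlobStorageParameterized",
    "type": "Microsoft.DataFactory/factories/linkedservices",
    "properties": {
        "type": "AzureBlobStorage",
        "parameters": {
            "accountName": { "type": "String" }
        },
        "typeProperties": {
            "serviceEndpoint": "https://@{linkedService().accountName}.blob.core.windows.net"
        }
    }
}
```

A dataset that uses this linked service then supplies the account name at reference time, for example: "linkedServiceName": { "referenceName": "AzureBlobStorageParameterized", "type": "LinkedServiceReference", "parameters": { "accountName": "mystorageaccount" } }.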