Azure Data Factory | incremental data load from SFTP to Blob


Question


I created a run-once Data Factory (V2) pipeline to load files (.lta.gz) from an SFTP server into an Azure Blob container to capture historical data. It worked beautifully. Every day there will be several new files on the SFTP server (which cannot be manipulated or deleted). So I want to create an incremental-load pipeline that checks daily for new files and, if there are any, copies them.

Does anyone have any tips on how to achieve this?


Answer 1:


Thanks for using Data Factory!

To incrementally load newly generated files from the SFTP server, you can leverage the GetMetadata activity to retrieve each file's lastModified property: https://docs.microsoft.com/en-us/azure/data-factory/control-flow-get-metadata-activity

Essentially, you author a pipeline containing the following activities (a minimal sketch follows the list):

  • GetMetadata (returns the list of files under a given folder)
  • ForEach (iterates through each file)
  • GetMetadata (returns lastModified for a given file)
  • IfCondition (compares lastModified with the trigger's WindowStartTime)
  • Copy (copies the file from source to destination)
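Below is a minimal sketch of that pattern in ADF's JSON authoring view. The dataset names (SftpFolderDataset, SftpFileDataset with a fileName parameter, BlobSinkDataset) and the WindowStartTime pipeline parameter are hypothetical placeholders, and the exact source/sink type names can vary with connector versions, so treat this as a starting point rather than a drop-in definition:

```json
{
    "name": "IncrementalSftpToBlobPipeline",
    "properties": {
        "parameters": {
            "WindowStartTime": { "type": "String" }
        },
        "activities": [
            {
                "name": "ListSftpFiles",
                "type": "GetMetadata",
                "typeProperties": {
                    "dataset": { "referenceName": "SftpFolderDataset", "type": "DatasetReference" },
                    "fieldList": [ "childItems" ]
                }
            },
            {
                "name": "ForEachFile",
                "type": "ForEach",
                "dependsOn": [ { "activity": "ListSftpFiles", "dependencyConditions": [ "Succeeded" ] } ],
                "typeProperties": {
                    "items": { "value": "@activity('ListSftpFiles').output.childItems", "type": "Expression" },
                    "activities": [
                        {
                            "name": "GetFileLastModified",
                            "type": "GetMetadata",
                            "typeProperties": {
                                "dataset": {
                                    "referenceName": "SftpFileDataset",
                                    "type": "DatasetReference",
                                    "parameters": { "fileName": "@item().name" }
                                },
                                "fieldList": [ "lastModified" ]
                            }
                        },
                        {
                            "name": "IfNewFile",
                            "type": "IfCondition",
                            "dependsOn": [ { "activity": "GetFileLastModified", "dependencyConditions": [ "Succeeded" ] } ],
                            "typeProperties": {
                                "expression": {
                                    "value": "@greater(ticks(activity('GetFileLastModified').output.lastModified), ticks(pipeline().parameters.WindowStartTime))",
                                    "type": "Expression"
                                },
                                "ifTrueActivities": [
                                    {
                                        "name": "CopyNewFile",
                                        "type": "Copy",
                                        "inputs": [ { "referenceName": "SftpFileDataset", "type": "DatasetReference", "parameters": { "fileName": "@item().name" } } ],
                                        "outputs": [ { "referenceName": "BlobSinkDataset", "type": "DatasetReference" } ],
                                        "typeProperties": {
                                            "source": { "type": "FileSystemSource" },
                                            "sink": { "type": "BlobSink" }
                                        }
                                    }
                                ]
                            }
                        }
                    ]
                }
            }
        ]
    }
}
```

If you schedule this with a tumbling window trigger, you would typically map @trigger().outputs.windowStartTime to the WindowStartTime parameter so each run only copies files modified since the previous window.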

Have fun building data integration flows using Data Factory!




Answer 2:


Since I posted my previous answer in May last year, many of you have contacted me asking for a pipeline sample that achieves the incremental file copy scenario using the GetMetadata-ForEach-GetMetadata-If-Copy pattern. This was important feedback: incremental file copy is a common scenario that we wanted to optimize further.

Today I would like to post an updated answer: we recently released a new feature that enables a much easier and more scalable approach to achieving the same goal.

You can now set modifiedDatetimeStart and modifiedDatetimeEnd on the SFTP dataset to specify a time-range filter so that only files created or modified during that period are extracted. This lets you achieve the incremental file copy with a single Copy activity: https://docs.microsoft.com/en-us/azure/data-factory/connector-sftp#dataset-properties
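As a rough illustration, an SFTP dataset with these two filters might look like the sketch below. The dataset and linked service names are hypothetical, and the dataset type and exact property placement can differ between connector versions (in newer versions the filters may sit on the copy activity's source settings instead), so check the linked documentation for your version:

```json
{
    "name": "SftpDailyDropDataset",
    "properties": {
        "type": "FileShare",
        "linkedServiceName": {
            "referenceName": "SftpLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "folderPath": "daily-drop",
            "modifiedDatetimeStart": "2019-12-17T00:00:00Z",
            "modifiedDatetimeEnd": "2019-12-18T00:00:00Z"
        }
    }
}
```

Using a dataset like this as the source of a single Copy activity picks up only files whose last-modified time falls within the window; in a scheduled pipeline you would typically parameterize the two datetime values with the trigger's window start and end times.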

This feature is enabled for these file-based connectors in ADF: AWS S3, Azure Blob Storage, FTP, SFTP, ADLS Gen1, ADLS Gen2, and the on-premises file system. Support for HDFS is coming very soon.

Further, to make it even easier to author an incremental copy pipeline, we now publish common pipeline patterns as solution templates. You can select one of the templates, fill in the linked service and dataset information, and click Deploy; it is that simple! https://docs.microsoft.com/en-us/azure/data-factory/solution-templates-introduction

You should be able to find the incremental file copy solution in the gallery: https://docs.microsoft.com/en-us/azure/data-factory/solution-template-copy-new-files-lastmodifieddate

Once again, thank you for using ADF, and happy data integration!



Source: https://stackoverflow.com/questions/50298122/azure-data-factory-incremental-data-load-from-sftp-to-blob
