Azure Data Factory | incremental data load from SFTP to Blob

冷暖自知 submitted on 2019-11-30 16:38:18

Thanks for using Data Factory!

To incrementally load newly generated files on SFTP server, you can leverage the GetMetadata activity to retrieve the LastModifiedDate property: https://docs.microsoft.com/en-us/azure/data-factory/control-flow-get-metadata-activity

Essentially you author a pipeline containing the following activities:

  • GetMetadata (return the list of files under a given folder)
  • ForEach (iterate through each file)
  • GetMetadata (return lastModifiedTime for a given file)
  • IfCondition (compare lastModifiedTime with the trigger WindowStartTime)
  • Copy (copy the file from source to destination)

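The activity chain above can be sketched as an ADF pipeline definition. This is an illustrative fragment, not a deployable template: the dataset names (`SftpFolderDataset`, `SftpFileDataset`), activity names, and the `WindowStartTime` parameter are assumptions, and the Copy activity's source/sink settings are omitted for brevity.

```json
{
  "name": "IncrementalCopyPipeline",
  "properties": {
    "parameters": { "WindowStartTime": { "type": "String" } },
    "activities": [
      {
        "name": "GetFileList",
        "type": "GetMetadata",
        "typeProperties": {
          "dataset": { "referenceName": "SftpFolderDataset", "type": "DatasetReference" },
          "fieldList": [ "childItems" ]
        }
      },
      {
        "name": "IterateFiles",
        "type": "ForEach",
        "dependsOn": [ { "activity": "GetFileList", "dependencyConditions": [ "Succeeded" ] } ],
        "typeProperties": {
          "items": { "value": "@activity('GetFileList').output.childItems", "type": "Expression" },
          "activities": [
            {
              "name": "GetFileDetail",
              "type": "GetMetadata",
              "typeProperties": {
                "dataset": { "referenceName": "SftpFileDataset", "type": "DatasetReference" },
                "fieldList": [ "lastModified" ]
              }
            },
            {
              "name": "IfNewFile",
              "type": "IfCondition",
              "dependsOn": [ { "activity": "GetFileDetail", "dependencyConditions": [ "Succeeded" ] } ],
              "typeProperties": {
                "expression": {
                  "value": "@greaterOrEquals(activity('GetFileDetail').output.lastModified, pipeline().parameters.WindowStartTime)",
                  "type": "Expression"
                },
                "ifTrueActivities": [
                  { "name": "CopyFile", "type": "Copy" }
                ]
              }
            }
          ]
        }
      }
    ]
  }
}
```

Note that the file-level GetMetadata and Copy activities would each need the folder/file name from the current ForEach item passed in as a dataset parameter (e.g. `@item().name`).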
Have fun building data integration flows using Data Factory!

Since I posted my previous answer in May last year, many of you have contacted me asking for a pipeline sample to achieve the incremental file copy scenario using the getMetadata-ForEach-getMetadata-If-Copy pattern. This has been important feedback, showing that incremental file copy is a common scenario that we want to further optimize.

Today I would like to post an updated answer - we recently released a new feature that allows a much easier and more scalable approach to achieve the same goal:

You can now set modifiedDatetimeStart and modifiedDatetimeEnd on the SFTP dataset to specify a time-range filter, so that only files created or modified during that period are extracted. This enables you to achieve the incremental file copy using a single Copy activity: https://docs.microsoft.com/en-us/azure/data-factory/connector-sftp#dataset-properties

This feature is enabled for these file-based connectors in ADF: AWS S3, Azure Blob Storage, FTP, SFTP, ADLS Gen1, ADLS Gen2, and on-prem file system. Support for HDFS is coming very soon.

Further, to make it even easier to author an incremental copy pipeline, we have now released common pipeline patterns as solution templates. You can select one of the templates, fill out the linked service and dataset info, and click deploy - it is that simple! https://docs.microsoft.com/en-us/azure/data-factory/solution-templates-introduction

You should be able to find the incremental file copy solution in the gallery: https://docs.microsoft.com/en-us/azure/data-factory/solution-template-copy-new-files-lastmodifieddate

Once again, thank you for using ADF and happy coding data integration with ADF!
