Process Azure Datalake store file using Azure function

Submitted by 前提是你 on 2021-01-28 01:40:22

Question


I am getting files in a particular folder on my Azure Data Lake Store at regular intervals. As soon as a file arrives, I want to process it further using an Azure Function. Is that possible?


Answer 1:


UPDATE: With Multi-Protocol Access for Azure Data Lake Storage, the storage extension should indeed work and some basic tests do confirm that.

There are open issues [1, 2] for official confirmation of support.
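As a rough illustration of what multi-protocol access unlocks, here is a minimal sketch of a Blob-triggered function using the Azure Functions Python v2 programming model. The container name `incoming` and the app setting `DataLakeConnection` are placeholders and not part of the original answer; this is an assumption-laden sketch, not a confirmed recipe.

```python
import logging
import azure.functions as func

app = func.FunctionApp()

# Blob trigger pointed at a container in an ADLS Gen2 (hierarchical namespace)
# account. With multi-protocol access the standard storage extension can fire
# when a new file lands. "incoming" and "DataLakeConnection" are placeholders.
@app.blob_trigger(arg_name="newfile",
                  path="incoming/{name}",
                  connection="DataLakeConnection")
def process_new_file(newfile: func.InputStream) -> None:
    logging.info("Processing %s (%s bytes)", newfile.name, newfile.length)
    data = newfile.read()
    # ... process the file contents here ...
```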


Though Azure Data Lake Storage (ADLS) Gen2 is built upon Azure Blob Storage, there are a couple of known issues and differences which are documented.

Because of these differences, I believe we can't use the existing bindings available for Blob storage or Event Grid.

But you could still have a Function triggered by a Timer, for example, and use the ADLS Gen2 REST API to read/update files.
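A minimal sketch of that timer-based approach, assuming the Python v2 programming model and the azure-storage-file-datalake SDK as a convenience wrapper over the Gen2 REST API; the app setting `DataLakeConnection`, the file system `incoming`, and the folder `drop` are placeholders.

```python
import os
import logging
import azure.functions as func
from azure.storage.filedatalake import DataLakeServiceClient

app = func.FunctionApp()

# Polls a folder every 5 minutes and reads any files found there.
@app.schedule(schedule="0 */5 * * * *", arg_name="timer")
def poll_datalake(timer: func.TimerRequest) -> None:
    service = DataLakeServiceClient.from_connection_string(
        os.environ["DataLakeConnection"])             # placeholder app setting
    fs = service.get_file_system_client("incoming")   # placeholder file system
    for path in fs.get_paths(path="drop"):            # placeholder folder
        if path.is_directory:
            continue
        file_client = fs.get_file_client(path.name)
        data = file_client.download_file().readall()
        logging.info("Read %s (%d bytes)", path.name, len(data))
        # ... process data, then move or delete the file so it isn't re-read ...
```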

Also, depending on your use case, you might want to look into the other integrations that ADLS Gen2 supports, namely HDInsight, Azure Databricks, and SQL Data Warehouse.



Source: https://stackoverflow.com/questions/55874266/process-azure-datalake-store-file-using-azure-function
