Databricks and Azure Files

Submitted by 与世无争的帅哥 on 2019-12-24 06:41:22

Question


I need to access Azure Files from Azure Databricks. According to the documentation, Azure Blobs are supported, but I need this code to work with Azure Files:

dbutils.fs.mount(
  source = "wasbs://<your-container-name>@<your-storage-account-name>.file.core.windows.net",
  mount_point = "/mnt/<mount-name>",
  extra_configs = {"<conf-key>":dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>")})

Or is there another way to mount or access Azure Files from an Azure Databricks cluster? Thanks.


Answer 1:


On Azure, you can generally mount an Azure Files file share to Linux via the SMB protocol. I tried to follow the official tutorial Use Azure Files with Linux by running its commands from a Python notebook, but it failed.
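For reference, the mount attempt from that tutorial looks roughly like the sketch below; `<storage-account>`, `<share-name>`, and `<storage-account-key>` are placeholders you would substitute yourself. Running this from a Databricks notebook (for example in a `%sh` cell) is exactly what fails, since the cluster nodes do not allow CIFS/SMB mounts:

```shell
# Sketch of the SMB mount from "Use Azure Files with Linux".
# Fails on Databricks cluster nodes, which do not permit CIFS mounts.
sudo mkdir -p /mnt/myfileshare
sudo mount -t cifs //<storage-account>.file.core.windows.net/<share-name> /mnt/myfileshare \
    -o vers=3.0,username=<storage-account>,password=<storage-account-key>,dir_mode=0777,file_mode=0777,serverino
```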

It seems that Azure Databricks does not allow this. I also searched the Databricks community for mounting NFS, SMB, Samba, etc., and found no discussion of it.

So the only way to access files in Azure Files is to install the azure-storage package and use the Azure Files SDK for Python directly on Azure Databricks.
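A minimal sketch of that approach, assuming the legacy `azure-storage-file` package is installed on the cluster (e.g. via the Databricks library UI) and using placeholder account, key, and share names. The helper names here are my own, not part of the SDK:

```python
def list_share_files(account_name, account_key, share_name, directory=None):
    """List the names of files and directories in an Azure Files share."""
    # Imported inside the function so the sketch can be defined without the
    # package present; on a real cluster a top-level import is fine.
    from azure.storage.file import FileService

    file_service = FileService(account_name=account_name, account_key=account_key)
    return [item.name for item in
            file_service.list_directories_and_files(share_name, directory_name=directory)]


def download_share_file(account_name, account_key, share_name, file_name, local_path):
    """Download a single file from the share root to local driver storage."""
    from azure.storage.file import FileService

    file_service = FileService(account_name=account_name, account_key=account_key)
    # directory_name=None means the file sits at the root of the share.
    file_service.get_file_to_path(share_name, None, file_name, local_path)
```

In a notebook you could then call, for example, `list_share_files("<storage-account>", dbutils.secrets.get(scope="<scope-name>", key="<key-name>"), "<share-name>")`, keeping the account key in a Databricks secret rather than in the notebook itself.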



Source: https://stackoverflow.com/questions/55617970/databricks-and-azure-files
