Error logging in python not working with azure databricks

Submitted on 2021-02-10 05:37:11

Question


A related question about this problem was never answered.

I tried implementing error logging with Python in Azure Databricks. If I run the code below in plain Python (PyCharm) it works as expected, but in Azure Databricks (Python) it neither creates the file nor writes any content to it. I tried creating the file in Azure Data Lake Storage Gen2, and I passed the path through the Data Lake Storage Gen2 mount point.

Can you please help me understand why this Python code does not work as expected in Azure Databricks (Python)?

# importing module
import logging

dbutils.fs.mkdirs('/dbfs/mnt/sales/region/country/sample/newfile.txt')

# Create and configure logger
logging.basicConfig(filename="/dbfs/mnt/sales/region/country/sample/newfile.txt",
                    format='%(asctime)s %(message)s',
                    filemode='a')

# Creating an object
logger = logging.getLogger()

# Setting the threshold of logger to DEBUG
logger.setLevel(logging.DEBUG)

# Test messages
logger.debug("Harmless debug Message")
logger.info("Just an information")
logger.warning("Its a Warning")
logger.error("Did you try to divide by zero")
logger.critical("Internet is down")


If I open the file, I expect output like the below, which is what happens with plain Python but not with Azure Databricks (Python):

2019-06-06 00:19:23,881 Harmless debug Message
2019-06-06 00:19:23,881 Just an information
2019-06-06 00:19:23,881 Its a Warning
2019-06-06 00:19:23,881 Did you try to divide by zero
2019-06-06 00:19:23,881 Internet is down
2019-06-06 00:19:33,447 Harmless debug Message
2019-06-06 00:19:33,447 Just an information
2019-06-06 00:19:33,447 Its a Warning
2019-06-06 00:19:33,447 Did you try to divide by zero
2019-06-06 00:19:33,447 Internet is down

Answer 1:


In Databricks, you have mounted a Blob storage container (or an ADLS Gen2 file system) at the path /dbfs/mnt/sales. Files backed by blob storage do not support random writes, and the Python logging library fails silently when its append fails.

https://docs.databricks.com/data/databricks-file-system.html#local-file-api-limitations

To test this:

# this works
with open('/dbfs/mnt/container-name/my-app.log', 'w') as fid:
    fid.write('this is a message')

# this fails, since append to existing file is a random write operation
with open('/dbfs/mnt/container-name/my-app.log', 'a') as fid:
    fid.write('this message will not work')
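One workaround (a sketch, not part of the original answer) is to point the log at local driver storage such as /tmp, where append-mode writes work, and copy the finished file to the mount afterwards as a single whole-file write. On Databricks the copy target would be the /dbfs mount path (e.g. '/dbfs/mnt/sales/...') or `dbutils.fs.cp`; the paths and logger name below are illustrative.

```python
import logging
import os
import shutil
import tempfile

# Log to local disk first -- append-mode writes are supported here.
local_log = os.path.join(tempfile.mkdtemp(), "my-app.log")

logger = logging.getLogger("sales")
logger.setLevel(logging.DEBUG)
handler = logging.FileHandler(local_log, mode="a")
handler.setFormatter(logging.Formatter("%(asctime)s %(message)s"))
logger.addHandler(handler)

logger.debug("Harmless debug Message")
logger.error("Did you try to divide by zero")

# Flush and close the handler so the file is complete before copying.
handler.close()
logger.removeHandler(handler)

# On Databricks this destination would be the blob-backed mount, e.g.
# '/dbfs/mnt/sales/region/country/sample/newfile.txt'. Copying the whole
# file at once is not a random write, so it succeeds where append fails.
destination = os.path.join(tempfile.mkdtemp(), "newfile.txt")
shutil.copy(local_log, destination)
```

Copying once at the end (or periodically) trades real-time log visibility for compatibility with blob-backed storage.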


Source: https://stackoverflow.com/questions/56466855/error-logging-in-python-not-working-with-azure-databricks
