Databricks SQL Server connection across multiple notebooks

Submitted by 不羁岁月 on 2020-06-17 09:45:14

Question


I found some resources on how to pass variables across pySpark Databricks notebooks. I'm curious whether we can pass a SQL Server connection as well, e.g. define the host/database/port/user/password in Notebook A and reuse that connection in Notebook B.


Answer 1:


Take a look at this part of the Databricks documentation: https://docs.databricks.com/notebooks/notebook-workflows.html#pass-structured-data. This lets you pass one or more strings across notebooks, but you'll still have to create the connection itself in Notebook B manually.
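As a minimal sketch of that approach: Notebook A returns its connection parameters as a JSON string via `dbutils.notebook.exit`, and Notebook B retrieves them with `dbutils.notebook.run` and rebuilds the JDBC URL. The host/database names below are hypothetical, and the `dbutils` calls (which only exist on Databricks) are shown in comments so the parsing logic itself runs anywhere:

```python
import json

# --- Notebook A (hypothetical path: path/to/notebookA) ---
# Collect the connection parameters and hand them back as JSON.
# On Databricks, the last line of Notebook A would be:
#   dbutils.notebook.exit(json.dumps(conn_params))
conn_params = {
    "host": "myserver.database.windows.net",  # hypothetical server
    "port": 1433,
    "database": "mydb",
    "user": "etl_user",
}
result = json.dumps(conn_params)  # simulates Notebook A's exit value

# --- Notebook B ---
# On Databricks you would obtain the string with:
#   result = dbutils.notebook.run("path/to/notebookA", 60)
params = json.loads(result)
jdbc_url = (
    f"jdbc:sqlserver://{params['host']}:{params['port']};"
    f"database={params['database']}"
)
print(jdbc_url)
```

Note that only strings cross the notebook boundary this way, so a live connection object cannot be passed; Notebook B must open its own connection from the parameters. Passwords should come from a Databricks secret scope rather than the JSON payload.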

Another option: create a Notebook A that defines the connection variables, and `%run` it before executing code in Notebook B (more details here: https://forums.databricks.com/questions/154/can-i-run-one-notebook-from-another-notebook.html). `%run` executes the target notebook inline, so its variables become available in the calling notebook. Basically, you need a cell in Notebook B containing only:

%run path/to/notebookA
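A sketch of the `%run` approach, with hypothetical names: Notebook A only defines the connection variables, and Notebook B uses them directly after the `%run` cell. The `spark.read` portion is commented out because it requires a live SparkSession and database on Databricks:

```python
# --- Notebook A (hypothetical path: path/to/notebookA) ---
# Defines connection variables only; %run makes them visible to the caller.
jdbc_hostname = "myserver.database.windows.net"  # hypothetical server
jdbc_port = 1433
jdbc_database = "mydb"

# --- Notebook B, in a cell AFTER `%run path/to/notebookA` ---
jdbc_url = f"jdbc:sqlserver://{jdbc_hostname}:{jdbc_port};database={jdbc_database}"

# On Databricks you could then read a table over JDBC, e.g.:
# df = (spark.read.format("jdbc")
#       .option("url", jdbc_url)
#       .option("dbtable", "dbo.my_table")   # hypothetical table
#       .option("user", user)
#       .option("password", password)        # ideally from a secret scope
#       .load())
print(jdbc_url)
```

Keep in mind that `%run` must be the only content in its cell, and it shares the variable namespace rather than returning a value, which is what makes this pattern convenient for shared connection settings.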


Source: https://stackoverflow.com/questions/61897657/databricks-sql-server-connection-across-multiple-notebooks
