I am able to create a UDF and register it with Spark using spark.udf.register. However, this registration is per session only. How can I register Python UDF functions automatically, so they are available whenever a new session starts?
Actually, you can create a permanent function, but not from a notebook: you need to create it from a class packaged in a JAR file. See:
https://docs.databricks.com/spark/latest/spark-sql/language-manual/create-function.html
CREATE [TEMPORARY] FUNCTION [db_name.]function_name AS class_name [USING resource, ...]
resource: (JAR|FILE|ARCHIVE) file_uri
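For example (a sketch only: the database name, class name, and JAR path below are hypothetical, and the class must be a JVM UDF implementation, e.g. a Hive UDF, since a plain Python function cannot be referenced this way):

```sql
-- Permanent function backed by a class in a JAR; visible to all sessions
CREATE FUNCTION my_db.my_upper AS 'com.example.udf.MyUpper'
USING JAR 'dbfs:/FileStore/jars/my-udfs.jar';
```

Because the function is stored in the metastore, any new session that can read my_db can call my_upper without re-registering it.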