Functions from a custom module not working in PySpark, but they work when entered in interactive mode
Question: I have a module that I've written containing functions that act on PySpark DataFrames. They do a transformation on columns in the DataFrame and then return a new DataFrame. Here is an example of the code, shortened to include only one of the functions:

    from pyspark.sql import functions as F
    from pyspark.sql import types as t
    import pandas as pd
    import numpy as np

    metadta = pd.DataFrame(pd.read_csv("metadata.csv"))  # this contains metadata on my dataset

    def str2num(text):
        if type(text) == None or
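For context, a module-level function of this kind usually wraps the plain Python helper in a PySpark UDF and applies it with withColumn, returning a new DataFrame. Below is a minimal sketch of that pattern; it is not the asker's actual str2num (which is cut off above), and the completed null check, the convert_text_column wrapper, and the column name text_col are assumptions for illustration only:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql import types as t

    def str2num(text):
        # Assumed completion: treat missing/empty or non-numeric strings as 0.0
        if text is None or text == "":
            return 0.0
        try:
            return float(text)
        except ValueError:
            return 0.0

    # Register the plain Python helper as a UDF with an explicit return type
    str2num_udf = F.udf(str2num, t.DoubleType())

    def convert_text_column(df, col_name):
        # Returns a new DataFrame with the text column replaced by its numeric form
        return df.withColumn(col_name, str2num_udf(F.col(col_name)))

    if __name__ == "__main__":
        spark = SparkSession.builder.appName("str2num-example").getOrCreate()
        df = spark.createDataFrame([("1.5",), (None,), ("abc",)], ["text_col"])
        convert_text_column(df, "text_col").show()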