Submit a Python project as a Dataproc job
**Question**

I have a Python project whose folder has the structure

```
main_directory
├── lib
│   └── lib.py
└── run
    └── script.py
```

`script.py` is

```python
from pyspark.sql import SparkSession

from lib.lib import add_two

spark = SparkSession \
    .builder \
    .master('yarn') \
    .appName('script') \
    .getOrCreate()

print(add_two(1, 2))
```

and `lib.py` is

```python
def add_two(x, y):
    return x + y
```

I want to launch this as a Dataproc job on GCP. I have checked online, but I have not understood well how to do it. I am trying to launch the script with

```
gcloud dataproc jobs submit pyspark --cluster=
```
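One common way to handle a local dependency like `lib` with `gcloud dataproc jobs submit pyspark` is to zip the package and pass it via the `--py-files` flag, so Spark distributes it to the executors. A minimal sketch, where the cluster name `my-cluster` and region `us-central1` are placeholders, not values from the question:

```shell
# From main_directory: package the lib/ folder so the
# `from lib.lib import add_two` import resolves on the cluster.
cd main_directory
zip -r libs.zip lib

# Submit the driver script and ship the zipped dependency with it.
# --cluster and --region are hypothetical values; substitute your own.
gcloud dataproc jobs submit pyspark run/script.py \
    --cluster=my-cluster \
    --region=us-central1 \
    --py-files=libs.zip
```

For the zipped import to work, `lib/` should contain an `__init__.py` so Python treats it as a package.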
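The reason `--py-files` makes the import work is that Spark places the zip on `sys.path` for the driver and executors, and Python can import packages directly out of a zip archive. The mechanism can be mimicked locally without Spark; the sketch below builds a throwaway `lib` package in a temp directory (hypothetical content mirroring the question's `lib.py`), zips it, and imports from the zip:

```python
import os
import sys
import tempfile
import zipfile

# Build a throwaway lib/ package mirroring the question's layout.
tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "lib")
os.makedirs(pkg)
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("")
with open(os.path.join(pkg, "lib.py"), "w") as f:
    f.write("def add_two(x, y):\n    return x + y\n")

# Zip it the same way `zip -r libs.zip lib` would.
zip_path = os.path.join(tmp, "libs.zip")
with zipfile.ZipFile(zip_path, "w") as z:
    z.write(os.path.join(pkg, "__init__.py"), "lib/__init__.py")
    z.write(os.path.join(pkg, "lib.py"), "lib/lib.py")

# This is essentially what --py-files does on each Spark node:
# the archive is prepended to sys.path, so the import resolves.
sys.path.insert(0, zip_path)
from lib.lib import add_two

print(add_two(1, 2))  # → 3
```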