How to limit the resources of a PySpark cluster used from a Jupyter Notebook?

Asked by 小鲜肉 on 2020-12-18 16:35

I want to restrict the resource usage of my PySpark code running in a Jupyter Notebook. I tried:

%%configure -f
{"driverMemory": "1000M", "executorMemory": "1000M"}
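
For context, %%configure is the sparkmagic magic and only works on a Livy-backed kernel; the JSON body must use double quotes, and -f forcibly recreates the session with the new settings. If the notebook instead builds its own SparkSession, a minimal sketch of the equivalent resource caps looks like the following (the app name and the specific values are illustrative placeholders, not from this question):

    # Minimal sketch: capping driver/executor resources when the notebook
    # creates the SparkSession itself (values below are illustrative).
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("resource-limited-notebook")      # placeholder app name
        # Driver memory only takes effect if set before the driver JVM
        # starts, so run this before any other Spark code in the notebook.
        .config("spark.driver.memory", "1000m")
        .config("spark.executor.memory", "1000m")  # memory per executor
        .config("spark.executor.cores", "1")       # cores per executor
        .config("spark.cores.max", "2")            # total cores (standalone mode)
        .getOrCreate()
    )

On YARN, the total-core cap would instead be controlled by limiting executors, e.g. spark.dynamicAllocation.maxExecutors.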


        