How to run script in Pyspark and drop into IPython shell when done?

Submitted by 喜欢而已 on 2019-12-22 05:53:19

Question


I want to run a spark script and drop into an IPython shell to interactively examine data.

Running either of:

$ IPYTHON=1 pyspark --master local[2] myscript.py

or

$ IPYTHON=1 spark-submit --master local[2] myscript.py

exits IPython as soon as the script finishes.

This seems like it should be simple, but I can't find how to do it anywhere.


Answer 1:


If you launch the IPython shell with:

$ IPYTHON=1 pyspark --master local[2]

you can do:

>>> %run myscript.py

then all of the script's variables will remain in the workspace. You can also debug step by step with:

>>> %run -d myscript.py

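Under the hood, %run behaves much like the standard-library runpy module: it executes a script and makes its resulting namespace available. A minimal sketch of that behavior (no Spark required; the script path and contents here are hypothetical stand-ins for myscript.py):

```python
import os
import runpy
import tempfile

# Write a stand-in for myscript.py (hypothetical contents).
script = "result = sum(range(10))\n"
path = os.path.join(tempfile.mkdtemp(), "myscript.py")
with open(path, "w") as f:
    f.write(script)

# runpy.run_path executes the file and returns its globals dict,
# roughly what IPython's %run injects into your interactive workspace.
ns = runpy.run_path(path)
print(ns["result"])  # 45
```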


Answer 2:


Launch the IPython shell using IPYTHON=1 pyspark, then run execfile('/path/to/myscript.py'); that should run your script inside the shell and return you to the prompt when it finishes. Note that execfile only exists in Python 2.
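Since execfile was removed in Python 3, a Python 3 pyspark shell needs an equivalent. A common replacement is exec over the file's compiled contents, sketched below (the script path and contents are hypothetical stand-ins for /path/to/myscript.py):

```python
import os
import tempfile

# Stand-in for /path/to/myscript.py (hypothetical contents).
path = os.path.join(tempfile.mkdtemp(), "myscript.py")
with open(path, "w") as f:
    f.write("answer = 6 * 7\n")

# Python 3 replacement for execfile(path): compile() preserves the
# filename for tracebacks, and running exec at the top level puts the
# script's names into the current namespace, like execfile did.
with open(path) as f:
    exec(compile(f.read(), path, "exec"))

print(answer)  # 42
```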



Source: https://stackoverflow.com/questions/25934778/how-to-run-script-in-pyspark-and-drop-into-ipython-shell-when-done
