How to install libraries to python in zeppelin-spark2 in HDP

Submitted by 懵懂的女人 on 2020-01-01 07:27:31

Question


I am using HDP Version: 2.6.4

Can you provide step-by-step instructions on how to install libraries into the following Python directory under spark2?

The sc.version (spark version) returns

res0: String = 2.2.0.2.6.4.0-91

The relevant spark2 interpreter property and value are as follows:

zeppelin.pyspark.python:    /usr/local/Python-3.4.8/bin/python3.4

The Python version and currently installed libraries are:

%spark2.pyspark

import pip
import sys

# list the packages visible to this interpreter's Python
installed_packages_list = sorted(
    ["%s==%s" % (i.key, i.version) for i in pip.get_installed_distributions()])

print("--")
print(sys.version)
print("--")
print(installed_packages_list)

--
3.4.8 (default, May 30 2018, 11:05:04) 
[GCC 4.4.7 20120313 (Red Hat 4.4.7-18)]
--
['pip==9.0.1', 'setuptools==28.8.0']

Update 1: using pip install [package name] actually leads to two problems:

1) HDP is pointing at Python 2.6 rather than Python 3.4.8

2) pip3 is not there for some reason
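
To confirm which interpreter a %spark2.pyspark paragraph actually runs (and whether it really is the Python 2.6 one), a minimal check using only the standard library is:

%spark2.pyspark
import sys
# full path of the Python binary this Zeppelin paragraph is running under
print(sys.executable)
print(sys.version)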

Therefore, I am thinking of installing Miniconda, pointing Zeppelin at it, and installing all the packages in conda to prevent conflicts between Python 2.6 and 3.4.8.
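
A minimal sketch of that approach, assuming an illustrative install prefix of /opt/miniconda3 and pandas as an example package (the installer URL and prefix may differ on your sandbox):

# download and run the Miniconda installer (batch mode, custom prefix)
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
bash Miniconda3-latest-Linux-x86_64.sh -b -p /opt/miniconda3

# install the packages you need into that environment
/opt/miniconda3/bin/pip install pandas

After that, set zeppelin.pyspark.python to /opt/miniconda3/bin/python in the spark2 interpreter settings and restart the interpreter so pyspark picks up the new environment.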


Answer 1:


Open your terminal, type pip, and press the TAB key; the pip versions available on your sandbox will be listed. Use pip3 to install the packages you require. The command is the same as usual: pip3 install "packageName". This makes the package available to the Python 3 installation you want to use in Zeppelin.
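
For example, assuming pandas is the package you need (any package name works the same way), either of the following should work; the second form pins the install to the exact interpreter configured in zeppelin.pyspark.python, which avoids the Python 2.6 / 3.4.8 mix-up mentioned in the question:

# install with pip3 if it is available on the sandbox
pip3 install pandas

# or install against the exact Python that Zeppelin uses
/usr/local/Python-3.4.8/bin/python3.4 -m pip install pandas

Restart the spark2 interpreter in Zeppelin afterwards so the new package is picked up.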



Source: https://stackoverflow.com/questions/50603891/how-to-install-libraries-to-python-in-zeppelin-spark2-in-hdp
