After installing sparknlp, cannot import sparknlp

Submitted by 纵然是瞬间 on 2019-12-06 13:02:26

I figured it out. The jar files that were correctly loaded contained only the compiled Scala classes; the Python wrapper code was not on Python's import path. Once I put the Python files containing the wrapper code in a location I could import from, everything worked.
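As a minimal sketch of that fix: the ~/spark-nlp path below is hypothetical and assumes you cloned the spark-nlp repository, whose python/ directory holds the sparknlp wrapper package.

import os
import sys

# Hypothetical location: assumes the spark-nlp repo was cloned to ~/spark-nlp;
# its python/ directory contains the sparknlp wrapper package.
sys.path.append(os.path.join(os.path.expanduser("~"), "spark-nlp", "python"))

import sparknlp  # should now resolve to the wrapper code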

You can load the SparkNLP package in PySpark with the command:

pyspark --packages JohnSnowLabs:spark-nlp:1.3.0
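For context, the same dependency can be requested from a standalone script instead of the pyspark shell, using Spark's standard spark.jars.packages option with the same coordinates; a sketch (the app name is arbitrary):

from pyspark.sql import SparkSession

# Ask Spark to resolve the same package at session startup,
# mirroring the --packages flag above.
spark = (SparkSession.builder
         .appName("sparknlp-demo")
         .config("spark.jars.packages", "JohnSnowLabs:spark-nlp:1.3.0")
         .getOrCreate())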

But this doesn't tell Python where to find the bindings. Following the instructions for a similar report here, this can be fixed either by adding the jar itself to your PYTHONPATH (a jar is a zip archive, and Python can import modules from zip files on sys.path as long as the jar bundles the .py sources):

export PYTHONPATH="$HOME/.ivy2/jars/JohnSnowLabs_spark-nlp-1.3.0.jar:$PYTHONPATH"

or by extending sys.path at runtime to include every jar that --packages downloaded:

import sys, glob, os
# Add every jar downloaded by --packages (Ivy caches them here) to the import path.
sys.path.extend(glob.glob(os.path.join(os.path.expanduser("~"), ".ivy2/jars/*.jar")))
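Either way, a quick check (a hypothetical session, not part of the original answer) confirms the wrapper package is importable:

import sparknlp
print(sparknlp.__file__)  # should point inside the jar or the python/ directory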