spark-packages

After installing sparknlp, cannot import sparknlp

Submitted by [亡魂溺海] on 2019-12-10 11:33:20
Question: The following ran successfully on a Cloudera CDSW cluster gateway.

    import pyspark
    from pyspark.sql import SparkSession

    spark = (SparkSession
        .builder
        .config("spark.jars.packages", "JohnSnowLabs:spark-nlp:1.2.3")
        .getOrCreate())

It produces this output:

    Ivy Default Cache set to: /home/cdsw/.ivy2/cache
    The jars for the packages stored in: /home/cdsw/.ivy2/jars
    :: loading settings :: url = jar:file:/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
    JohnSnowLabs#spark-nlp added as a dependency
    ::
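For context, spark.jars.packages only asks Ivy to resolve the JVM jar for the session; importing sparknlp on the Python side typically also requires the spark-nlp Python package to be present in the driver's Python environment. Below is a minimal sketch of a setup that covers both sides; it assumes pip install spark-nlp (or an equivalent) has been run on the CDSW gateway and is not something shown in the original post.

    import pyspark
    from pyspark.sql import SparkSession

    # Resolve the spark-nlp JVM package from spark-packages via Ivy when the session starts.
    spark = (SparkSession
        .builder
        .config("spark.jars.packages", "JohnSnowLabs:spark-nlp:1.2.3")
        .getOrCreate())

    # The jar alone does not make the Python module importable; this import succeeds only
    # if the spark-nlp Python package is installed for the driver's Python interpreter
    # (assumption: installed via `pip install spark-nlp` or shipped on PYTHONPATH).
    import sparknlp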
