Method showString([class java.lang.Integer, class java.lang.Integer, class java.lang.Boolean]) does not exist in PySpark

青春惊慌失措 2020-12-07 03:33

This is the snippet:

from pyspark import SparkContext
from pyspark.sql.session import SparkSession

sc = SparkContext(         
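
The snippet is cut off at this point. For context, a minimal sketch of the kind of code that typically triggers this error, assuming the post goes on to build a DataFrame and call show(); the master/appName arguments, the sample data, and the name df are hypothetical:

from pyspark import SparkContext
from pyspark.sql.session import SparkSession

sc = SparkContext("local[*]", "repro")   # hypothetical arguments; the original line is truncated
spark = SparkSession(sc)

# Any DataFrame will do; this one is made up for illustration.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# show() forwards to the JVM method showString(...), which is where the
# reported "Method showString(...) does not exist" error surfaces.
df.show()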


        
2 Answers
  • 2020-12-07 03:40

    On the spark-shell console, enter the variable name to see its data type. As an alternative, you can press Tab twice after typing the variable name followed by a dot, and it will show the functions that can be applied to it. Example for a DataFrame object:

    res23: org.apache.spark.sql.DataFrame = [order_id: string, book_name: string ... 1 more field]
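
    From the PySpark side, a similar quick check works with plain Python built-ins (df here is just an assumed name for whatever DataFrame you are inspecting):

    # In the pyspark shell:
    type(df)   # e.g. <class 'pyspark.sql.dataframe.DataFrame'>
    dir(df)    # lists the attributes/methods available on the object, including show
    df         # the repr prints the schema, e.g. DataFrame[order_id: string, book_name: string ...]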
    
  • 2020-12-07 03:57

    This is an indicator of a Spark version mismatch. Before Spark 2.3, the show method took only two arguments:

    def show(self, n=20, truncate=True):
    

    Since 2.3, it takes three arguments:

    def show(self, n=20, truncate=True, vertical=False):
    

    In your case the Python client seems to invoke the latter, while the JVM backend uses the older version.
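
    For reference, here is roughly how the Python side forwards show to the JVM in 2.3 (a paraphrase, so treat the details as approximate). With the three-argument signature, Py4J ends up looking for showString(Integer, Integer, Boolean) on the JVM DataFrame, which is exactly the method a 2.2 backend does not have:

    # Approximate sketch of pyspark.sql.DataFrame.show in Spark 2.3:
    def show(self, n=20, truncate=True, vertical=False):
        if isinstance(truncate, bool) and truncate:
            # (int, int, bool) -> JVM lookup for showString(Integer, Integer, Boolean);
            # a Spark 2.2 JVM only knows showString(Integer, Integer), hence the error.
            print(self._jdf.showString(n, 20, vertical))
        else:
            print(self._jdf.showString(n, int(truncate), vertical))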

    Since SparkContext initialization underwent significant changes in 2.4 (which would have caused a failure already in SparkContext.__init__), you're likely using:

    • 2.3.x Python library.
    • 2.2.x JARs.

    You can confirm that by checking the versions directly from your session. Python:

    sc.version
    

    vs. JVM:

    sc._jsc.version()
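
    Putting the two together in a single check (just a convenience sketch; sc is the already-created SparkContext):

    python_side = sc.version        # version of the PySpark library
    jvm_side = sc._jsc.version()    # version of the Spark JARs backing the JVM
    print(python_side, jvm_side)
    if python_side != jvm_side:
        print("PySpark package and Spark JARs are out of sync")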
    

    Problems like this are usually a result of a misconfigured PYTHONPATH (either directly, or by using pip-installed PySpark on top of pre-existing Spark binaries) or SPARK_HOME.
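
    A rough way to see which artifacts are actually being picked up (the environment variable names are standard; whether they point at mismatched installs depends on your setup):

    import os
    import pyspark

    print(pyspark.__version__)             # pip-installed Python package version
    print(pyspark.__file__)                # where that package lives on PYTHONPATH
    print(os.environ.get("SPARK_HOME"))    # distribution whose JARs the JVM side loads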
