ValueError: Cannot convert column into bool


Either use udf:

from pyspark.sql.functions import udf

@udf("integer")
def calc_dif(x, y):
    # rows that do not satisfy the condition get None, i.e. null
    if x > y and x == 1:
        return x - y
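
Applied to a DataFrame, the decorated function can be called on Columns directly (a minimal usage sketch, assuming a SparkSession named spark; the column names _1 and _2 match the full example further down):

from pyspark.sql.functions import col

l = [(2, 1), (1, 1)]
df = spark.createDataFrame(l)

# the decorator already wrapped calc_dif as a udf, so this builds a Column
df.withColumn("dif", calc_dif(col("_1"), col("_2"))).show()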

or case when (recommended):

from pyspark.sql.functions import when

def calc_dif(x, y):
    # builds a Column expression; rows that do not match yield null
    return when((x > y) & (x == 1), x - y)
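
Here calc_dif returns a Column expression rather than a value, so it plugs straight into withColumn with no udf involved (a minimal sketch, assuming the same two-column df as in the full example below):

from pyspark.sql.functions import col

# rows that do not satisfy the condition get null, just like the udf version
df.withColumn("dif", calc_dif(col("_1"), col("_2"))).show()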

The first one computes on Python objects row by row; the second builds an expression on Spark Columns, which avoids the per-row Python overhead of a udf.

Spark is complaining because you pass your calc_dif function the whole Column objects, not the actual data of the respective rows, so the expression (x > y) and (x == 1) tries to convert a Column into a Python bool. You need to wrap your calc_dif function in a udf:

from pyspark.sql.types import IntegerType
from pyspark.sql.functions import udf

l = [(2, 1), (1,1)]
df = spark.createDataFrame(l)

def calc_dif(x, y):
    # wrapped in a udf, calc_dif is called once per row of the DataFrame;
    # x and y are the values of the two columns in that row
    if x > y and x == 1:
        return x - y

udf_calc = udf(calc_dif, IntegerType())

dfNew = df.withColumn("calc", udf_calc("_1", "_2"))
dfNew.show()

# neither row satisfies (x > y) and (x == 1), so calc_dif returns None (null)
+---+---+----+
| _1| _2|calc|
+---+---+----+
|  2|  1|null|
|  1|  1|null|
+---+---+----+
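
For reference, calling the unwrapped Python function on Columns reproduces the error, because the and operator tries to coerce a Column into a Python bool (the exact message below may vary between Spark versions):

# passes Column objects into the plain function instead of row values
calc_dif(df["_1"], df["_2"])
# ValueError: Cannot convert column into bool: please use '&' for 'and',
# '|' for 'or', '~' for 'not' when building DataFrame boolean expressions.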

For anyone who runs into a similar error: I was passing an RDD where a Pandas object was needed and got the same error. It was easily solved with a .toPandas() call.
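
A minimal sketch of that fix, assuming a Spark DataFrame named df (an RDD of Rows would first need to become a DataFrame, e.g. via toDF()):

# collect the Spark DataFrame into a pandas.DataFrame on the driver
pandas_df = df.toPandas()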
