Spark assign value if null to column (python)


Question


Assuming that I have the following data

+--------------------+-----+--------------------+
|              values|count|             values2|
+--------------------+-----+--------------------+
|              aaaaaa|  249|                null|
|              bbbbbb|  166|                  b2|
|              cccccc| 1680|           something|
+--------------------+-----+--------------------+

If there is a null value in the values2 column, how can I assign the value from the values column to it? The result should be:

+--------------------+-----+--------------------+
|              values|count|             values2|
+--------------------+-----+--------------------+
|              aaaaaa|  249|              aaaaaa|
|              bbbbbb|  166|                  b2|
|              cccccc| 1680|           something|
+--------------------+-----+--------------------+

I thought of something like the following, but it doesn't work:

df.na.fill({"values2":df['values']}).show()

I found this way to solve it, but there should be something more straightforward:

from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

def change_null_values(a, b):
    # Keep b if it is not null/empty, otherwise fall back to a
    if b:
        return b
    else:
        return a

udf_change_null = udf(change_null_values, StringType())

df.withColumn("values2", udf_change_null("values", "values2")).show()

Answer 1:


You can use coalesce: https://spark.apache.org/docs/1.6.2/api/python/pyspark.sql.html#pyspark.sql.functions.coalesce

from pyspark.sql.functions import coalesce

df.withColumn('values2', coalesce(df.values2, df.values)).show()
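
As a side note, df.na.fill / fillna only accepts literal replacement values (or a dict of them), not another column, which is why the attempt in the question fails. Below is a minimal, self-contained sketch of the coalesce approach, assuming an active SparkSession named spark and recreating the example data from the question:

from pyspark.sql import SparkSession
from pyspark.sql.functions import coalesce

spark = SparkSession.builder.getOrCreate()

# Recreate the example data; None is the null entry in values2
df = spark.createDataFrame(
    [("aaaaaa", 249, None), ("bbbbbb", 166, "b2"), ("cccccc", 1680, "something")],
    ["values", "count", "values2"],
)

# coalesce returns the first non-null argument per row,
# so nulls in values2 are replaced by the value from values
df.withColumn("values2", coalesce(df.values2, df.values)).show()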



Answer 2:


You can use the Column methods .isNull() and .isNotNull():

from pyspark.sql.functions import col

df.where(col("values2").isNull()).show()

df.where(col("values2").isNotNull()).show()

This answer comes from this answer - I just don't have enough reputation to add a comment.
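
For completeness, here is a hedged sketch (an alternative formulation, not part of the original answer) showing how .isNull() can be combined with when/otherwise to perform the same fill on the question's DataFrame:

from pyspark.sql.functions import when, col

# Where values2 is null take the value from values, otherwise keep values2
df.withColumn(
    "values2",
    when(col("values2").isNull(), col("values")).otherwise(col("values2")),
).show()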



Source: https://stackoverflow.com/questions/39344250/spark-assign-value-if-null-to-column-python
