How to remove nulls with array_remove Spark SQL Built-in Function

攒了一身酷 2020-12-16 17:33

Spark 2.4 introduced useful new Spark SQL functions involving arrays, but I was a little puzzled when I found out that the result of select array_remove(array(1, 2, 3, null, 3), null) is null rather than [1, 2, 3, 3]. Is there a way to remove the nulls themselves from an array with the built-in functions?
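For reference, a minimal reproduction sketch (assuming a Spark 2.4+ session named spark in the Scala shell); because the element argument is null, the comparison inside array_remove is null and the whole result collapses to null:

    // array_remove compares each element against the second argument;
    // comparing anything with NULL yields NULL, so the whole expression is NULL.
    spark.sql("SELECT array_remove(array(1, 2, 3, NULL, 3), NULL) AS result").show()
    // +------+
    // |result|
    // +------+
    // |  null|
    // +------+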

3 Answers
  •  长情又很酷
    2020-12-16 18:06

    You can do something like this in Spark 2:

    import org.apache.spark.sql.functions._
    import org.apache.spark.sql._
    
    /**
      * Array without nulls
      * For complex types, you are responsible for passing in a nullPlaceholder of the same type as elements in the array
      */
    def non_null_array(columns: Seq[Column], nullPlaceholder: Any = "רכוב כל יום"): Column =
      array_remove(array(columns.map(c => coalesce(c, lit(nullPlaceholder))): _*), nullPlaceholder)
    

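    A hypothetical usage sketch of the helper above (the sample DataFrame, the column names a, b, c, and the row contents are assumptions, not part of the original answer):

    import spark.implicits._

    // one row with a null in the middle column
    val df = Seq(("x", null: String, "z")).toDF("a", "b", "c")

    // nulls are first replaced by the placeholder, and the placeholder is then removed
    df.select(non_null_array(Seq(col("a"), col("b"), col("c"))).as("arr")).show()
    // +------+
    // |   arr|
    // +------+
    // |[x, z]|
    // +------+
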
    In Spark 3, there is a new filter function for arrays, so you can do:

    df.select(filter(col("array_column"), x => x.isNotNull))
    
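    For completeness, the same idea is available through the SQL higher-order function filter (a sketch; the lambda syntax below should also work on Spark 2.4, where the SQL form of filter already exists, while the Scala functions.filter shown above arrived with Spark 3):

    // keep only the non-null elements via a SQL lambda
    spark.sql("SELECT filter(array(1, 2, 3, NULL, 3), x -> x IS NOT NULL) AS arr").show()
    // +------------+
    // |         arr|
    // +------------+
    // |[1, 2, 3, 3]|
    // +------------+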
