multiple conditions for filter in spark data frames

醉酒成梦 2020-12-03 04:41

I have a data frame with four fields. One of the fields is named Status, and I am trying to use an OR condition in .filter on the dataframe. I tried the queries below, but had no luck.

11 Answers
  •  长情又很酷
    2020-12-03 05:15

    Another way is to use the expr function with a where clause:

    import org.apache.spark.sql.functions.expr
    
    val df2 = df1.where(expr("col1 = 'value1' and col2 = 'value2'"))
    

    It works the same as chaining Column conditions in .filter.
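    Since the question specifically asked for an OR condition, here is a minimal self-contained sketch showing both styles: the Column-based `||` operator and the same condition as a SQL string inside `expr`. The sample data and the values "Open"/"Closed" are hypothetical; only the column name Status comes from the question.

    ```scala
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, expr}

    // Local session for illustration only
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("filter-or-example")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical four-field frame reduced to the two relevant columns
    val df = Seq(("a", "Open"), ("b", "Closed"), ("c", "Pending"))
      .toDF("id", "Status")

    // Column-based OR: use || between Column expressions
    // (note === for equality, and parentheses around each comparison)
    val viaColumns = df.filter(col("Status") === "Open" || col("Status") === "Closed")

    // SQL-string OR: the same predicate inside expr(...)
    val viaExpr = df.where(expr("Status = 'Open' or Status = 'Closed'"))

    spark.stop()
    ```

    Both variants keep the rows where Status is "Open" or "Closed"; which one to use is mostly a readability choice, though the `expr` string is convenient when the predicate is built dynamically.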
