How to “negative select” columns in Spark's DataFrame

野的像风 · 2020-12-15 05:35

I can't figure it out, but guess it's simple. I have a Spark DataFrame df with columns "A", "B" and "C". Now let's say I have an Array containing the names of its columns, column_names = Array("A", "B", "C"), and I'd like to select all columns except one of them (say "B").
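
For reference, a minimal sketch of the setup (the column names come from the question; the sample row and the SparkSession boilerplate are assumptions):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().master("local[*]").getOrCreate()
    import spark.implicits._

    // Sample data is assumed; only the column names "A", "B", "C" come from the question
    val df = Seq((1, 2, 3)).toDF("A", "B", "C")
    val column_names = df.columns // Array("A", "B", "C")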

9 Answers
  •  一整个雨季 · 2020-12-15 05:40

    You were almost there: map the filtered array to col and unpack the resulting array with ": _*":

    import org.apache.spark.sql.functions.col  // needed to turn column names into Columns
    df.select(column_names.filter(_ != "B").map(col): _*)
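
    As a side note, not part of the original answer: on Spark 1.4 and later, the drop method expresses the same "negative select" directly:

    // Returns a new DataFrame without column "B"; names that don't exist are silently ignored
    df.drop("B")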
    
