How to “negative select” columns in Spark's DataFrame

野的像风 2020-12-15 05:35

I can't figure it out, but I guess it's simple. I have a Spark DataFrame df. This df has columns "A", "B" and "C". Now let's say I have an Array containing the names of the columns I want to exclude, and I want to select all the remaining columns.

9 Answers
  •  青春惊慌失措
    2020-12-15 05:54

    In PySpark you can do:

    df.select(list(set(df.columns) - set(["B"])))
    

    Using more than one line, you can also do:

    cols = df.columns
    cols.remove("B")
    df.select(cols)
    
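    One caveat with the set-difference approach above: `set` does not preserve the original column order, so the selected columns may come back shuffled. A minimal order-preserving sketch (assuming PySpark is installed; the `SparkSession` setup and sample data here are hypothetical, for illustration only):

    ```python
    from pyspark.sql import SparkSession

    # Hypothetical local session and sample data for demonstration.
    spark = SparkSession.builder.master("local[1]").appName("demo").getOrCreate()
    df = spark.createDataFrame([(1, 2, 3)], ["A", "B", "C"])

    to_drop = {"B"}
    # List comprehension keeps the columns in their original order,
    # unlike set(df.columns) - set(["B"]).
    keep = [c for c in df.columns if c not in to_drop]
    df.select(keep).show()

    spark.stop()
    ```

    PySpark also has a built-in for this: `df.drop("B")` returns a new DataFrame without the named column(s) and keeps the remaining columns in order.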
