Apply same function to all fields of spark dataframe row

礼貌的吻别 · 2020-12-09 11:16

I have a dataframe with on the order of 1000 columns (the exact number varies).

I want to make all values upper case.

Here is the approach I have thought of; can you suggest a better way?

2 Answers
  •  挽巷
     2020-12-09 12:10

    If you simply want to apply the same function to all columns, something like this should be enough:

    import org.apache.spark.sql.functions.{col, upper}
    
    val df = sc.parallelize(
      Seq(("a", "B", "c"), ("D", "e", "F"))).toDF("x", "y", "z")
    df.select(df.columns.map(c => upper(col(c)).alias(c)): _*).show
    
    // +---+---+---+
    // |  x|  y|  z|
    // +---+---+---+
    // |  A|  B|  C|
    // |  D|  E|  F|
    // +---+---+---+
    

    or in Python

    from pyspark.sql.functions import col, upper
    
    df = sc.parallelize([("a", "B", "c"), ("D", "e", "F")]).toDF(("x", "y", "z"))
    df.select(*(upper(col(c)).alias(c) for c in df.columns)).show()
    
    ##  +---+---+---+
    ##  |  x|  y|  z|
    ##  +---+---+---+
    ##  |  A|  B|  C|
    ##  |  D|  E|  F|
    ##  +---+---+---+
    

    See also: SparkSQL: apply aggregate functions to a list of column
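    One caveat when the ~1000 columns mix types: Spark's `upper` implicitly casts its argument to a string, so applying it to every column turns numeric columns into string columns. If you want to upper-case only the string columns and keep the rest untouched, you can build the expression list from `df.dtypes`. A minimal sketch (the helper name `upper_exprs` is just for illustration):

    ```python
    def upper_exprs(dtypes):
        """Given df.dtypes-style (name, type) pairs, upper-case only the
        string columns and pass every other column through unchanged."""
        return [
            f"upper({name}) AS {name}" if dtype == "string" else name
            for name, dtype in dtypes
        ]

    # Usage, assuming an existing DataFrame `df`:
    # df.selectExpr(*upper_exprs(df.dtypes))
    ```

    `selectExpr` accepts SQL expression strings, so this avoids constructing a `Column` object per field while still producing one `select` over all columns.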
