Spark: “Truncated the string representation of a plan since it was too large.” Warning when using manually created aggregation expression

上瘾入骨i  2020-12-23 10:00

I am trying to build for each of my users a vector containing the average number of records per hour of day. Hence the vector has to have 24 dimensions.

My original
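For context, here is a hypothetical sketch (not the poster's original code, which is cut off above) of one way such a 24-slot hourly vector can be built with manually created aggregation expressions, assuming an input DataFrame named events with user_id and timestamp columns; a wide plan like this is exactly the kind whose string representation Spark truncates in the warning:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    val spark = SparkSession.builder().appName("hourly-vectors").getOrCreate()  // placeholder app name
    val events = spark.read.parquet("events.parquet")  // assumed input with user_id and timestamp

    // Count records per user, per calendar day, per hour of day.
    val hourlyCounts = events.groupBy(
        col("user_id"),
        to_date(col("timestamp")).as("day"),
        hour(col("timestamp")).as("hour"))
      .count()

    // 24 manually created aggregation expressions, one per hour slot.
    // avg ignores nulls, so each column averages the per-day counts for that hour
    // (days with no records in that hour are simply not included in the average).
    val hourlyAvgs = (0 until 24).map { h =>
      avg(when(col("hour") === h, col("count"))).as(s"h$h")
    }

    val vectors = hourlyCounts.groupBy(col("user_id"))
      .agg(hourlyAvgs.head, hourlyAvgs.tail: _*)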

2 answers
  •  陌清茗 (OP)
     2020-12-23 10:20

    This config, along with many others, has been moved to SQLConf: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala

    It can be set in the Spark config file, on the command line, or at runtime on an existing session, for example:

    spark.conf.set("spark.sql.debug.maxToStringFields", 1000)
    
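    Equivalent ways to set the same property (a sketch; the config-file path and app name below are placeholders):

    // In conf/spark-defaults.conf:
    //   spark.sql.debug.maxToStringFields  1000
    //
    // On the command line:
    //   spark-submit --conf spark.sql.debug.maxToStringFields=1000 ...
    //
    // Or when building the session in code:
    import org.apache.spark.sql.SparkSession
    val spark = SparkSession.builder()
      .appName("example")  // placeholder app name
      .config("spark.sql.debug.maxToStringFields", "1000")
      .getOrCreate()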
