SparkSQL error: Exception in thread "main" org.apache.spark.sql.catalyst.errors.package$TreeNodeException

Anonymous (unverified), submitted 2019-12-03 00:26:01

Error message:


18/06/13 13:17:59 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
18/06/13 13:24:08 ERROR KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
18/06/13 13:24:27 WARN Utils: Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.debug.maxToStringFields' in SparkEnv.conf.
Exception in thread "main" org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
Exchange hashpartitioning(caller_no#215, called_no#216, tdr_id#187L, refid#188L, egci#219, ecgi#189, call_identify#190L, direction#191L, starttime#192, starttime_ms#193L, start_time#194, endtime_ms#195L, codec_type#196L, avg_codecrate#197, mos#198, rtcp_jitter#199L, rtcp_total_packet_num#200L, rtcp_loss_packet_num#201L, ipmos#202, rtcp_delay#203L, rtp_jitter#204L, rtp_total_packet_num#205L, rtp_loss_packet_num#206L, rtp_burst_loss_packet_rate#207, ... 62 more fields)
+- *HashAggregate(keys=[caller_no#215, called_no#216, tdr_id#187L, refid#188L, egci#219, ecgi#189, call_identify#190L, direction#191L, starttime#192, starttime_ms#193L, start_time#194, endtime_ms#195L, codec_type#196L, avg_codecrate#197, mos#198, rtcp_jitter#199L, rtcp_total_packet_num#200L, rtcp_loss_packet_num#201L, ipmos#202, rtcp_delay#203L, rtp_jitter#204L, rtp_total_packet_num#205L, rtp_loss_packet_num#206L, rtp_burst_loss_packet_rate#207, ... 61 more fields], functions=[], output=[caller_no#215, called_no#216, tdr_id#187L, refid#188L, egci#219, ecgi#189, call_identify#190L, direction#191L, starttime#192, starttime_ms#193L, start_time#194, endtime_ms#195L, codec_type#196L, avg_codecrate#197, mos#198, rtcp_jitter#199L, rtcp_total_packet_num#200L, rtcp_loss_packet_num#201L, ipmos#202, rtcp_delay#203L, rtp_jitter#204L, rtp_total_packet_num#205L, rtp_loss_packet_num#206L, rtp_burst_loss_packet_rate#207, ... 61 more fields])

(stack trace frames omitted)

Caused by: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
Exchange hashpartitioning(msisdn#1085, 200)
+- *Filter (((isnotnull(MSISDN#1085) && NOT (MSISDN#1085 = )) && isnotnull(TIME#1091)) && NOT (TIME#1091 = ))

(stack trace frames omitted)

Caused by: java.lang.RuntimeException: Expected only partition pruning predicates: (((isnotnull(P_CITY#1076) && isnotnull(P_HOUR#1075)) && (P_HOUR#1075 = 2018061212)) && (P_CITY#1076 = 579))
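
As an aside, the two WARN lines at the top of the log are unrelated to the exception. The `spark.yarn.jars` / `spark.yarn.archive` warning can only be fixed at submit time (via `spark-submit --conf` or `spark-defaults.conf`, since the YARN client uploads libraries before the app starts), but the plan-truncation warning can be addressed in the session builder. A minimal sketch, assuming Spark 2.x (the app name and value are illustrative):

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: silences the plan-truncation WARN seen in the log above.
val spark = SparkSession.builder()
  .appName("tdr-analysis") // illustrative name, not from the original job
  // Raise the field limit used when rendering large plans in logs.
  // This is the Spark 2.x key from the warning; in Spark 3.x it was
  // renamed to spark.sql.debug.maxToStringFields.
  .config("spark.debug.maxToStringFields", "200")
  .enableHiveSupport()
  .getOrCreate()
```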

(stack trace frames omitted)

Cause:

The last `Caused by` is the real failure. Spark's Hive table scan validates that every predicate pushed down for partition pruning references only the table's partition columns; here that check rejects `(P_HOUR = 2018061212) AND (P_CITY = 579)` even though these look like partition predicates. That usually means the attribute IDs inside the predicates no longer match the attributes of the relation actually being scanned. The most commonly reported trigger is joining two DataFrames derived from the same partitioned Hive table (a self-join), where plan transformation remaps the attribute IDs on one side (the symptom tracked as SPARK-17709, reportedly fixed in later 2.x releases). Typical remedies: upgrade Spark, or alias/re-read the table for each side of the join so each scan resolves independent attributes. A case mismatch between the partition column names written in the query (upper-case `P_HOUR`/`P_CITY` here) and the lower-case names stored in the metastore has also been reported to trip the same check.

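The "Expected only partition pruning predicates" error is most often reported for self-joins on a partitioned Hive table (SPARK-17709). A commonly reported workaround is to alias, or separately re-read, each side of the join so the analyzer assigns independent attribute IDs to the two scans. A minimal sketch, assuming a Hive table `tdr_table` partitioned by `p_hour`/`p_city` (the table and column names are illustrative, not taken from the original job):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().enableHiveSupport().getOrCreate()

// Sketch only: illustrative table/column names.
val tdr = spark.table("tdr_table")
  .where("p_hour = '2018061212' AND p_city = '579'")

// Problematic pattern on affected Spark 2.0.x versions: joining two
// DataFrames derived from the same partitioned scan.
// val joined = tdr.join(tdr, "msisdn")   // may throw TreeNodeException

// Workaround: re-read the table and alias each side so each scan
// resolves its own, independent attributes.
val left = tdr.alias("l")
val right = spark.table("tdr_table")
  .where("p_hour = '2018061212' AND p_city = '579'")
  .alias("r")

val joined = left.join(right, left("msisdn") === right("msisdn"))
```

This is a sketch under the assumptions above, not a guaranteed fix; upgrading to a Spark release containing the SPARK-17709 fix is the cleaner solution.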