Spark error - Decimal precision 39 exceeds max precision 38 (I know it's a duplicate but I have no solution)

Happy的楠姐 2021-01-23 00:08

When trying to load data from Oracle into Spark via JDBC, there seems to be precision loss in a decimal field; as I understand it, Spark supports at most DECIMAL(38,18). The field fr
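A common workaround for this error is Spark's `customSchema` JDBC option (available since Spark 2.3), which overrides the type Spark infers for specific columns so that an Oracle NUMBER that would map to precision 39 is read as a type Spark can hold. The sketch below only builds the option map; the column name `AMOUNT`, the connection URL, and the credentials are hypothetical placeholders, not taken from the question.

```python
# Sketch of the customSchema workaround; AMOUNT, the URL, and the
# credentials are hypothetical placeholders.

def jdbc_read_options(url, table, user, password):
    """Build the option map for spark.read.format("jdbc").options(**opts).load().

    `customSchema` overrides the inferred type so the offending Oracle NUMBER
    column is read as DECIMAL(38, 18), within Spark's 38-digit maximum.
    Columns not listed keep their inferred types.
    """
    return {
        "url": url,
        "dbtable": table,
        "user": user,
        "password": password,
        "customSchema": "AMOUNT DECIMAL(38, 18)",
    }

opts = jdbc_read_options(
    "jdbc:oracle:thin:@//db-host:1521/ORCL",  # hypothetical
    "SALES", "scott", "tiger",                # hypothetical
)
print(opts["customSchema"])

# With a live SparkSession you would then call:
#   df = spark.read.format("jdbc").options(**opts).load()
```

An alternative, if you can tolerate reduced scale, is to cast on the Oracle side by passing a subquery as `dbtable`, e.g. `"(SELECT CAST(AMOUNT AS NUMBER(19, 4)) AMOUNT, ... FROM SALES) t"`, so the driver never reports a precision above 38.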
