How to use spark with large decimal numbers?

不打扰是莪最后的温柔 submitted on 2021-02-19 03:51:46

Question


My database has numeric values that can be as large as 256-bit unsigned integers. However, Spark's DecimalType has a limit of Decimal(38,18).

When I try to do calculations on the column, an exception is thrown:

java.lang.IllegalArgumentException: requirement failed: Decimal precision 39 exceeds max precision 38

Is there any third-party library or workaround that solves this issue? Or is Spark designed only for numbers that fit in Decimal(38,18)?
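One common workaround (not from the original post) is to store such values as strings in the DataFrame and perform the arithmetic in a Python UDF, since Python integers have arbitrary precision. A minimal sketch, assuming PySpark; the column names and UDF registration shown in the comments are hypothetical:

```python
# Workaround sketch: keep 256-bit values as decimal strings and do the
# arithmetic with Python's arbitrary-precision ints inside a UDF.
# This gives up native Decimal operations but avoids the 38-digit limit.

def add_big(a: str, b: str) -> str:
    """Add two arbitrarily large non-negative integers given as decimal strings."""
    return str(int(a) + int(b))

# In Spark this function would be wrapped as a UDF, e.g.:
#
#   from pyspark.sql.functions import udf
#   from pyspark.sql.types import StringType
#
#   add_big_udf = udf(add_big, StringType())
#   df = df.withColumn("total", add_big_udf(df["amount_a"], df["amount_b"]))

# 2**256 - 1 is the largest 256-bit unsigned value (78 decimal digits);
# adding 1 to it overflows Decimal(38,18) but is fine here.
print(add_big(str(2**256 - 1), "1"))  # prints 2**256 as a decimal string
```

The trade-off is that string-backed columns lose Spark's built-in numeric optimizations (sorting, aggregation pushdown), so this approach fits best when the oversized values are carried through rather than heavily aggregated.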

Source: https://stackoverflow.com/questions/53074721/how-to-use-spark-with-large-decimal-numbers
