How to increase the default precision and scale when loading data from Oracle using Spark SQL

Submitted by 懵懂的女人 on 2019-12-02 11:16:06

Question


I am trying to load data from an Oracle table where a few columns hold floating-point values; sometimes they hold up to DecimalType(40,20), i.e. 20 digits after the decimal point. Currently I load those columns using

var local_ora_df: DataFrameReader = ora_df
local_ora_df
  .option("partitionColumn", "FISCAL_YEAR")
  .option("schema", schema)
  .option("dbtable", query)
  .load()

The loaded columns keep only 10 digits after the decimal point, i.e. decimal(38,10) (nullable = true). What should I do to get more digits after the decimal point when reading from Oracle with Spark SQL?


Answer 1:


We can use .option("customSchema", "data DECIMAL(38, 15)") to increase the scale to 15 digits after the decimal point. The customSchema option overrides the types Spark infers from the JDBC source for the columns named in it; all other columns keep their inferred types.
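For context, here is a minimal self-contained sketch of how this option fits into a JDBC read. The connection URL, credentials, table name, and column names (AMOUNT, RATE) are placeholders, not details from the original post:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("OracleDecimalRead").getOrCreate()

val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL") // placeholder connection string
  .option("user", "scott")                               // placeholder credentials
  .option("password", "tiger")
  .option("dbtable", "FISCAL_DATA")                      // placeholder table name
  // customSchema overrides Spark's inferred types for the named
  // columns only; unnamed columns keep their inferred types.
  .option("customSchema", "AMOUNT DECIMAL(38, 15), RATE DECIMAL(38, 15)")
  .load()

df.printSchema()  // AMOUNT and RATE now show as decimal(38,15)

Note that Spark's DecimalType caps precision at 38, so an Oracle NUMBER(40,20) cannot be mapped to DecimalType(40,20); DECIMAL(38, 20) is the widest type with scale 20 that Spark can represent.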



Source: https://stackoverflow.com/questions/54780779/how-to-increase-the-default-precision-and-scale-while-loading-data-from-oracle-u
