Convert value depending on a type in SparkSQL via case matching of type

眼角桃花 2021-01-11 23:20

Is it possible to match on a parametric type in Scala? Let's say I have a function that receives two parameters: a value and a type. I would like to use case matching on the type to convert the value accordingly.
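
In other words, something along these lines (a rough sketch with illustrative names, not actual code from the question; it assumes the value is read from a Spark SQL Row by column index):

    import org.apache.spark.sql.Row
    import org.apache.spark.sql.types._

    // Hypothetical shape of the function in question: dispatch on the Spark SQL
    // DataType to decide how to read the value out of the Row.
    def convert(row: Row, index: Int, dataType: DataType): Any = dataType match {
      case IntegerType => row.getAs[Int](index)
      case LongType    => row.getAs[Long](index)
      case StringType  => row.getAs[String](index)
      case other       => throw new RuntimeException(s"Unsupported type: $other")
    }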

3 Answers
  •  遥遥无期
    2021-01-12 00:19

    This could be something specific to the code I'm working on, or it may vary by SQL vendor, but I found that DecimalType doesn't have a single underlying type: sometimes I get a Spark Decimal and other times a Java BigDecimal. Calling getAs[Decimal] when the value is a BigDecimal throws an exception, and so does getAs[BigDecimal] when it's a Decimal.

    To handle this, I had to do some more sniffing of the runtime class after matching DecimalType:

      // Requires: import org.apache.spark.sql.types.{Decimal, DecimalType}
      case d: DecimalType =>
        // Oddly, a column that matches DecimalType can be of several different
        // classes: getAs[Decimal] throws when the value is a java.math.BigDecimal,
        // and getAs[BigDecimal] throws when it is a Spark Decimal, so decide
        // based on the runtime class of the instance.
        val decimal = row.get(index) match {
          case bigDecimal: java.math.BigDecimal => Decimal(bigDecimal)
          case decimal: Decimal => decimal
          case _ =>
            throw new RuntimeException("Encountered unexpected decimal type")
        }
    

    From there you can do whatever you need to do knowing that decimal is of type Decimal.
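
    Sketched as a small standalone helper (my own naming; it assumes the same row and index values used in the match above), the normalization could look like this:

        import org.apache.spark.sql.Row
        import org.apache.spark.sql.types.Decimal

        // Sketch: normalize whichever runtime class shows up into a Spark Decimal.
        def asSparkDecimal(row: Row, index: Int): Decimal = row.get(index) match {
          case bd: java.math.BigDecimal => Decimal(bd) // JVM BigDecimal -> Spark Decimal
          case d: Decimal               => d           // already a Spark Decimal
          case other => throw new RuntimeException(
            s"Encountered unexpected decimal type: ${other.getClass.getName}")
        }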
