How to subtract a vector from a scalar in Scala?

Submitted by 与世无争的帅哥 on 2021-02-11 14:20:00

Question


I have a parquet file that contains two columns (id, features). I want to subtract a scalar from the features and divide the result by another scalar. I tried:

df.withColumn("features", ((df("features")-constant1)/constant2))

but it gives me this error:

requirement failed: The number of columns doesn't match. Old column names (2): id, features. New column names (1): features

How can I solve this?


Answer 1:


My Scala Spark code for this is below. The only way I found to do arithmetic on the Spark ML vector datatype is to cast it to a string first. I then used a UDF to perform the subtraction and division.

import spark.implicits._
import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.sql.functions._

// Sample data: single-element dense vectors in the "features" column
var df = Seq(
  (1, Vectors.dense(35)),
  (2, Vectors.dense(45)),
  (3, Vectors.dense(4.5073)),
  (4, Vectors.dense(56)))
  .toDF("id", "features")
df.show()

val constant1 = 10
val constant2 = 2

// UDF that subtracts constant1 and divides by constant2,
// wrapping the result back into a dense vector
val performComputation = (s: Double, val1: Int, val2: Int) => {
  Vectors.dense((s - val1) / val2)
}
val performComputationUDF = udf(performComputation)

df.printSchema()

// Cast the vector column to a string, strip the surrounding
// brackets, and cast the remaining text to Double
df = df.withColumn("features",
  regexp_replace(df.col("features").cast("String"),
    "[\\[\\]]", "").cast("Double")
)

df = df.withColumn("features",
  performComputationUDF(df.col("features"),
    lit(constant1), lit(constant2))
)
df.show(20, false)

// Write the result, overwriting any existing output
df.write
  .mode("overwrite")
  .parquet("file:///usr/local/spark/dataset/output1/")

Result

+---+----------+
|id |features  |
+---+----------+
|1  |[12.5]    |
|2  |[17.5]    |
|3  |[-2.74635]|
|4  |[23.0]    |
+---+----------+
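As an alternative (not from the original answer), a UDF can take the ML Vector type directly, which avoids the string cast and also handles vectors with more than one element. A sketch, assuming `df` is the DataFrame built above and Spark with spark-mllib is on the classpath; `shiftScale` and `shiftScaleUDF` are hypothetical names:

```scala
import org.apache.spark.ml.linalg.{Vector, Vectors}
import org.apache.spark.sql.functions.{lit, udf}

// Element-wise transform: (x - c1) / c2 for every entry of the vector
def shiftScale(values: Array[Double], c1: Double, c2: Double): Array[Double] =
  values.map(x => (x - c1) / c2)

// Wrap it in a UDF that receives the ML Vector column directly,
// so no string round-trip is needed
val shiftScaleUDF = udf { (v: Vector, c1: Double, c2: Double) =>
  Vectors.dense(shiftScale(v.toArray, c1, c2))
}

// Assumes `df` is the (id, features) DataFrame from the answer above
val result = df.withColumn("features",
  shiftScaleUDF(df.col("features"), lit(10.0), lit(2.0))
)
```

Because the UDF receives the Vector itself, `regexp_replace` is unnecessary and multi-element feature vectors are transformed the same way.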


Source: https://stackoverflow.com/questions/58358330/how-to-subtract-vector-from-scalar-in-scala
