How to change all columns to double type in a Spark DataFrame


Question


I am trying to change all the columns of a Spark DataFrame to double type, but I want to know if there is a better way of doing it than just looping over the columns and casting each one.


Answer 1:


With this dataframe:

df = spark.createDataFrame(
  [
    (1,2),
    (2,3),
  ],
  ["foo","bar"]
)

df.show()
+---+---+
|foo|bar|
+---+---+
|  1|  2|
|  2|  3|
+---+---+

The for loop is probably the easiest and most natural solution:

from pyspark.sql import functions as F

# cast each column to double, replacing it in place with withColumn
for col in df.columns:
  df = df.withColumn(
    col,
    F.col(col).cast("double")
  )

df.show()
+---+---+
|foo|bar|
+---+---+
|1.0|2.0|
|2.0|3.0|
+---+---+
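show() only prints the values; to confirm that the column types actually changed, you can inspect the schema:

df.printSchema()
root
 |-- foo: double (nullable = true)
 |-- bar: double (nullable = true)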

Of course, you can also build the cast expressions with a Python comprehension:

# a single select applies every cast in one projection
df.select(
  *(
    F.col(col).cast("double").alias(col)
    for col
    in df.columns
  )
).show()

+---+---+
|foo|bar|
+---+---+
|1.0|2.0|
|2.0|3.0|
+---+---+

If you have a lot of columns, the second solution is a little better: the select builds one projection over all the columns, while each withColumn call in the loop adds another projection to the query plan.
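If you are on Spark 3.3 or later, there is also DataFrame.withColumns, which accepts a dict of column expressions and applies all the casts in a single call, giving you the compact plan without writing the select by hand:

# one call, one dict: column name -> cast expression
df = df.withColumns(
  {col: F.col(col).cast("double") for col in df.columns}
)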



Source: https://stackoverflow.com/questions/54399188/how-to-change-all-columns-to-double-type-in-a-spark-dataframe
