Select spark dataframe column with special character in it using selectExpr

Question


I am in a scenario where my column name is Município, with an accent on the letter í.

My selectExpr command is failing because of it. Is there a way to fix it? Basically I have something like the following expression:

.selectExpr("...CAST (Município as string) as Município...")

What I really want is to keep the column with the same name it came with, so that in the future I won't have this kind of problem with different tables/files.

How can I make a Spark dataframe accept accents or other special characters?


Answer 1:


You can wrap your column name in backticks. For example, if you had the following schema:

df.printSchema()
#root
# |-- Município: long (nullable = true)

Reference the column name containing the special character by wrapping it in backticks:

df2 = df.selectExpr("CAST (`Município` as string) as `Município`")
df2.printSchema()
#root
# |-- Município: string (nullable = true)
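
If you want to avoid quoting inside SQL strings altogether, the same cast can be done with the DataFrame API, where the accented name is passed as a plain Python string and needs no backticks. Here is a minimal, self-contained sketch (the sample data and local SparkSession are assumptions for illustration):

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.master("local[1]").getOrCreate()

# Hypothetical sample data, just to reproduce the schema from the question
df = spark.createDataFrame([(1,), (2,)], ["Município"])

# col() accepts the accented name as-is, so no escaping is needed
df2 = df.select(col("Município").cast("string").alias("Município"))
df2.printSchema()
#root
# |-- Município: string (nullable = true)

This keeps the original column name untouched, so the same code works for any table or file regardless of accents in the headers.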


Source: https://stackoverflow.com/questions/57963605/select-spark-dataframe-column-with-special-character-in-it-using-selectexpr
