How to refer to a map column in a spark-sql query?
Question:

```scala
scala> val map1 = spark.sql("select map('p1', 's1', 'p2', 's2')")
map1: org.apache.spark.sql.DataFrame = [map(p1, s1, p2, s2): map<string,string>]

scala> map1.show()
+--------------------+
| map(p1, s1, p2, s2)|
+--------------------+
|[p1 -> s1, p2 -> s2]|
+--------------------+

scala> spark.sql("select element_at(map1, 'p1')")
org.apache.spark.sql.AnalysisException: cannot resolve '`map1`' given input columns: []; line 1 pos 18;
'Project [unresolvedalias('element_at('map1, p1), None)]
```

How can I refer to the map column in a spark-sql query?
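The error occurs because `map1` is a Scala `val` holding a DataFrame, not a relation or column that the SQL parser can resolve; `spark.sql` only sees registered tables/views. A minimal sketch of two common fixes, assuming Spark 2.4+ (where `element_at` is available) and an active `SparkSession` named `spark`; the alias `m` and the view name `map1_view` are illustrative choices, not from the original post:

```scala
// Give the map column a short alias so it is easy to reference later.
val map1 = spark.sql("select map('p1', 's1', 'p2', 's2') as m")

// Option 1: register the DataFrame as a temp view, then query it by name in SQL.
map1.createOrReplaceTempView("map1_view")
spark.sql("select element_at(m, 'p1') from map1_view").show()

// Option 2: stay in the DataFrame API and use a SQL expression on the column.
map1.selectExpr("element_at(m, 'p1')").show()

// Option 3: access the map key directly with Column syntax.
map1.select(map1("m")("p1")).show()
```

All three return `s1` for key `p1`; the difference is only whether the lookup is expressed through a registered view, `selectExpr`, or the Column API.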