Dynamically bind variable/parameter in Spark SQL?

情深已故 2020-12-31 05:38

How do I bind a variable in Apache Spark SQL? For example:

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.sql("SELECT * FROM src WHERE col1 = ${VAL1}").collect().foreach(println)


        
4 Answers
  •  滥情空心
    2020-12-31 06:15

    Spark SQL (as of the 1.6 release) does not support bind variables.

    P.S. What Ashrith is suggesting is not a bind variable: you are constructing a new query string every time, so every time Spark has to parse the query, create an execution plan, and so on. The purpose of bind variables (in RDBMS systems, for example) is to cut the time spent creating the execution plan, which can be costly when there are many joins. Spark would need a special API to "parse" a query once and then "bind" variables into it, and Spark does not have this functionality (as of today, the Spark 1.6 release).
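
    For illustration, here is a minimal Scala sketch of that string-construction workaround (not a true bind variable); the table name "src" comes from the question, while the column "col1" and the value assigned to VAL1 are assumptions:

        val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
        val VAL1 = 42  // hypothetical value substituted into the query text
        // The query string is rebuilt here, so Spark re-parses it and
        // re-creates the execution plan on every call:
        sqlContext.sql(s"SELECT * FROM src WHERE col1 = $VAL1").collect().foreach(println)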

    Update 8/2018: as of Spark 2.3 there are (still) no bind variables in Spark.
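
    In Spark 2.x the same workaround would go through a SparkSession instead of a HiveContext; this sketch assumes a session named "spark" and reuses the hypothetical col1/VAL1 from above:

        val VAL1 = 42  // still plain string interpolation, not a bound parameter
        spark.sql(s"SELECT * FROM src WHERE col1 = $VAL1").collect().foreach(println)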
