How do you bind a variable in Apache Spark SQL? For example:

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.sql("SELECT * FROM src WHE
I verified this in both the Spark 2.x shell and the Thrift server (via beeline). You can bind a variable in a Spark SQL query with the set command.
Query without bind variable:
select count(1) from mytable;
Query with bind variable (parameterized):
1. Spark SQL shell
set key_tbl=mytable; -- set mytable as the value of key_tbl, to be referenced as ${key_tbl}
select count(1) from ${key_tbl};

2. Spark shell

spark.sql("set key_tbl=mytable")
spark.sql("select count(1) from ${key_tbl}").collect()
With or without the bind parameter, the query returns identical results.
Note: don't put quotes around the value of the key, since it's a table name here.
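For completeness, the same substitution works for literal values; there the quotes belong in the query at the point of use, not around the set value. A minimal sketch for the Spark SQL shell, assuming a hypothetical string column dt on mytable:

```sql
set date_val=2019-01-01;                                -- no quotes around the value
select count(1) from mytable where dt = '${date_val}';  -- quote at the use site instead
```

If quotes were included in the set value itself, they would be substituted verbatim into the query text, which is rarely what you want.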
Let me know if there are any questions.