For example, I have a few Hive HQL statements that I want to pass into Spark SQL:
set parquet.compression=SNAPPY;
create ...
I worked on a scenario where I needed to read a SQL file and run all the semicolon-separated queries present in that file.
One simple way to do it is like this:
val hsc = new org.apache.spark.sql.hive.HiveContext(sc)
val sql_file = "/hdfs/path/to/file.sql"
// wholeTextFiles returns (path, contents) pairs; take the contents of the first file
val file = sc.wholeTextFiles(sql_file)
val queries = file.take(1)(0)._2
// Split on ';', drop blank fragments (e.g. after a trailing semicolon), and run each statement
queries.split(';').map(_.trim).filter(_.nonEmpty).foreach(query => hsc.sql(query))
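For Spark 2.x and later, the same approach works with SparkSession instead of HiveContext. A minimal sketch, assuming Hive support is enabled and reusing the same (hypothetical) HDFS path; SET statements like set parquet.compression=SNAPPY go through spark.sql the same way DDL does:

import org.apache.spark.sql.SparkSession

// SparkSession with Hive support replaces HiveContext in Spark 2.x+
val spark = SparkSession.builder()
  .enableHiveSupport()
  .getOrCreate()

// wholeTextFiles returns (path, contents) pairs; take the contents of the first file
val script = spark.sparkContext.wholeTextFiles("/hdfs/path/to/file.sql").take(1)(0)._2

// Run each non-empty statement; SET commands and DDL are both executed eagerly
script.split(';').map(_.trim).filter(_.nonEmpty).foreach(stmt => spark.sql(stmt))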