How to execute .sql file in spark using python
Question:

```python
from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext

conf = SparkConf().setAppName("Test").set("spark.driver.memory", "1g")
sc = SparkContext(conf=conf)
sqlContext = SQLContext(sc)
results = sqlContext.sql("/home/ubuntu/workload/queryXX.sql")
```

When I run this with `python test.py`, it fails with:

```
py4j.protocol.Py4JJavaError: An error occurred while calling o20.sql.
: java.lang.RuntimeException: [1.1] failure: ``with'' expected but `/' found

/home
```
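The error occurs because `sqlContext.sql()` expects a SQL query string, not a file path: Spark tries to parse the path `/home/ubuntu/...` as SQL and fails on the leading `/`. One way to handle this is to read the file yourself and pass its contents to `sqlContext.sql()`. A minimal sketch (the helper name `load_sql_statements` is my own; it assumes statements are separated by semicolons and that no semicolons appear inside string literals):

```python
def load_sql_statements(path):
    """Read a .sql file and return its non-empty statements,
    split on semicolons. Assumes no semicolons occur inside
    string literals in the queries."""
    with open(path) as f:
        text = f.read()
    return [stmt.strip() for stmt in text.split(";") if stmt.strip()]


# In the Spark job, pass each statement's *text* to sqlContext.sql():
#
# from pyspark import SparkConf, SparkContext
# from pyspark.sql import SQLContext
#
# conf = SparkConf().setAppName("Test").set("spark.driver.memory", "1g")
# sc = SparkContext(conf=conf)
# sqlContext = SQLContext(sc)
# for stmt in load_sql_statements("/home/ubuntu/workload/queryXX.sql"):
#     results = sqlContext.sql(stmt)
```

If the file holds a single query, `sqlContext.sql(open(path).read())` is enough; the splitting only matters when the file contains several statements, since `sql()` executes one statement at a time.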