How to specify a subquery in the "dbtable" option of a Spark JDBC application when reading data from a table on Greenplum? [duplicate]
This question already has an answer here: How to give table name in spark-jdbc application for reading data on an RDBMS database? (1 answer)

I am trying to read data from a table on Greenplum into HDFS using Spark. To read the Greenplum table, I passed a subquery in the options as shown below:

    val execQuery = s"(select ${allColumns}, 0 as ${flagCol} from dbanscience.xx_lines where year=2017 and month=12) as xx_lines_periodYear"
    println("ExecQuery: " + execQuery)
    val dataDF = spark
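The snippet is cut off at the read call. A minimal sketch of how such a JDBC read typically continues, assuming a hypothetical Greenplum connection URL, credentials, and the standard PostgreSQL JDBC driver (Greenplum is wire-compatible with PostgreSQL); none of these values come from the original question:

    val dataDF = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://gp-host:5432/warehouse") // hypothetical Greenplum connection URL
      .option("driver", "org.postgresql.Driver")                 // assumed PostgreSQL JDBC driver
      .option("dbtable", execQuery)                              // the aliased subquery built above
      .option("user", "gp_user")                                 // hypothetical credentials
      .option("password", "gp_password")
      .load()

With Spark's JDBC source, whatever is passed as "dbtable" is placed in the FROM clause of the query Spark generates, so a parenthesized subquery with an alias, as in execQuery above, is accepted in place of a plain table name.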