Custom SQL using Spark Big Query Connector

Submitted by 那年仲夏 on 2020-12-15 05:32:46

Question


I have some custom SQL for reading data from BigQuery. How can I execute it? I tried passing it with the query option, but that does not work: the connector ignores the query option and reads the full table.

    Dataset<Row> testDS = session.read().format("bigquery")
            //.option("table", <TABLE>)
            .option("query", <QUERY>)
            .option("project", <PROJECT_ID>)
            .option("parentProject", <PROJECT_ID>)
            .load();

Answer 1:


That's because the query option is not available in the connector. See https://github.com/GoogleCloudDataproc/spark-bigquery-connector/README.md for the full list of supported options.

There are a couple of options you have:

  • Create a view with your custom SQL, and read from the view.
  • Create a temporary table with the results of the query, read from it, and then delete the table.
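The first option can be sketched as follows. This is a minimal example, not a definitive implementation: the view name `my_project.my_dataset.my_view` is a placeholder, and it assumes you have already created the view over your custom SQL in BigQuery. Reading views through the connector requires the viewsEnabled option, and depending on the connector version you may also need to set materializationDataset so the connector has somewhere to materialize the view.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class ReadBigQueryView {
    public static void main(String[] args) {
        SparkSession session = SparkSession.builder()
                .appName("bigquery-view-read")
                .getOrCreate();

        // Read from a view that wraps the custom SQL.
        // "my_project.my_dataset.my_view" is a placeholder name.
        Dataset<Row> ds = session.read().format("bigquery")
                .option("table", "my_project.my_dataset.my_view")
                .option("viewsEnabled", "true")
                // Dataset where the connector may materialize the view
                // before reading it (an assumption about your setup):
                .option("materializationDataset", "my_dataset")
                .load();

        ds.show();
    }
}
```

The second option works the same way, except you point the table option at the temporary table you populated with your query results and drop it afterwards.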


Source: https://stackoverflow.com/questions/64985050/custom-sql-using-spark-big-query-connector
