what's SparkSQL SQL query to write into JDBC table?

Submitted by 房东的猫 on 2019-12-10 06:25:03

Question


This is about SQL queries in Spark.

For reads, we can register a JDBC source as a table:

CREATE TEMPORARY TABLE jdbcTable
USING org.apache.spark.sql.jdbc
OPTIONS (dbtable ...);

For writes, what is the SQL query that writes data to a remote JDBC table?

NOTE: I want a pure SQL query, i.e. a statement that can be passed to HiveContext.sql(...) in Spark SQL and that writes to JDBC.


Answer 1:


You can write the DataFrame over JDBC as follows:

df.write.jdbc(url, "TEST.BASICCREATETEST", new Properties)
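A more complete sketch of the same call, with imports; the connection URL, credentials, and save mode below are placeholders, not part of the original answer:

```scala
import java.util.Properties
import org.apache.spark.sql.SaveMode

// Hypothetical connection details; replace with your own.
val url = "jdbc:mysql://localhost:3306/test"
val props = new Properties()
props.setProperty("user", "username")
props.setProperty("password", "secret")

// Write the DataFrame to the remote table over JDBC,
// creating the table if it does not exist.
df.write
  .mode(SaveMode.Overwrite)   // or Append / ErrorIfExists / Ignore
  .jdbc(url, "TEST.BASICCREATETEST", props)
```

Note this is the DataFrame API, not a pure SQL statement, so it does not strictly satisfy the "HiveContext.sql(...) only" constraint in the question.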



Answer 2:


An INSERT OVERWRITE TABLE statement will write to your database over the JDBC connection:

DROP TABLE IF EXISTS jdbcTemp;
CREATE TABLE jdbcTemp
USING org.apache.spark.sql.jdbc
OPTIONS (...);

INSERT OVERWRITE TABLE jdbcTemp
SELECT * FROM my_spark_data;
DROP TABLE jdbcTemp;
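A filled-in sketch of the same pattern; the URL, credentials, and table names in the OPTIONS clause are hypothetical and must be replaced with your own:

```sql
DROP TABLE IF EXISTS jdbcTemp;
CREATE TABLE jdbcTemp
USING org.apache.spark.sql.jdbc
OPTIONS (
  url      'jdbc:mysql://localhost:3306/test',  -- hypothetical JDBC URL
  dbtable  'TARGET_TABLE',                      -- hypothetical remote table
  user     'username',
  password 'secret'
);

INSERT OVERWRITE TABLE jdbcTemp
SELECT * FROM my_spark_data;
DROP TABLE jdbcTemp;
```

Since every statement here is plain SQL, each one can be passed to HiveContext.sql(...), which matches the constraint in the question.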



Answer 3:


Yes, you can. If you want to save a DataFrame into an existing table, you can use

df.insertIntoJDBC(url, table, overwrite)

and if you want to create a new table to hold the DataFrame, you can use

df.createJDBCTable(url, table, allowExisting)
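Note that insertIntoJDBC and createJDBCTable are old Spark 1.3-era methods that were deprecated in later releases in favor of the DataFrameWriter API; on newer Spark versions the rough equivalents are (connection details are placeholders):

```scala
import java.util.Properties

// Roughly equivalent DataFrameWriter calls on newer Spark versions.
df.write.mode("append").jdbc(url, table, new Properties())  // insert into an existing table
df.write.mode("ignore").jdbc(url, table, new Properties())  // create the table, skip if it exists
```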



Answer 4:


// sc is an existing SparkContext.
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
sqlContext.sql("LOAD DATA LOCAL INPATH 'examples/src/main/resources/kv1.txt' INTO TABLE src")
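This answer only loads data into a Hive table; to actually push the result out to a JDBC table purely through HiveContext.sql(...), it could be combined with the INSERT OVERWRITE pattern from the earlier answer. A sketch, with hypothetical connection values in OPTIONS:

```scala
// Register a JDBC-backed table, then insert into it -- all via sql().
sqlContext.sql("""
  CREATE TABLE jdbcTemp
  USING org.apache.spark.sql.jdbc
  OPTIONS (url 'jdbc:mysql://localhost:3306/test', dbtable 'SRC_COPY',
           user 'username', password 'secret')
""")
sqlContext.sql("INSERT OVERWRITE TABLE jdbcTemp SELECT * FROM src")
```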


Source: https://stackoverflow.com/questions/36169319/whats-sparksql-sql-query-to-write-into-jdbc-table
