Sqoop Import from Hive to Hive

Submitted by 扶醉桌前 on 2019-12-23 04:54:37

Question


Can we import tables from one Hive data source to another Hive data source using Sqoop?

The query looks like this:

sqoop import \
  --connect jdbc:hive2://localhost:10000/default \
  --driver org.apache.hive.jdbc.HiveDriver \
  --username root --password root \
  --table student1 \
  -m 1 \
  --target-dir hdfs://localhost:9000/user/dummy/hive2result

Right now it's throwing the exception below:

15/07/19 19:50:18 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Method not supported
java.sql.SQLException: Method not supported
    at org.apache.hive.jdbc.HiveResultSetMetaData.isSigned(HiveResultSetMetaData.java:141)
    at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:290)
    at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:240)
    at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:226)
    at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
    at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1773)
    at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1578)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:601)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

Answer 1:


Sqoop is not a tool for transferring data from one Hive instance to another. It sounds like your requirement is to transfer Hive data from one cluster to another cluster, which can be achieved using hadoop distcp. The name Sqoop itself stands for "SQL-to-Hadoop" (and vice versa).
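The cluster-to-cluster copy described above can be sketched with distcp. Note that the NameNode addresses and warehouse paths below are placeholders, not taken from the question; this command requires two live HDFS clusters to actually run:

```shell
# Hedged sketch: copy a Hive table's warehouse directory between clusters.
# "source-nn", "dest-nn", and the paths are illustrative placeholders.
hadoop distcp \
  hdfs://source-nn:8020/user/hive/warehouse/student1 \
  hdfs://dest-nn:8020/user/hive/warehouse/student1
```

distcp runs the copy as a MapReduce job, so it parallelizes well for large tables; after the copy you still need matching table definitions on the destination (see the steps below).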

If you want to migrate multiple databases and tables from one Hive instance to another, the best approach is to transfer the data using hadoop distcp and then run the DDLs on the second Hive instance. If you don't have the DDLs handy, there is no need to worry:

1. Take a dump of the metastore database.
2. Open the dump file in a text editor and replace the old HDFS URI with the new HDFS URI.
3. Import the MySQL dump into the metastore of the second Hive instance.
4. Refresh the tables.

An example is given in this blog post: https://amalgjose.wordpress.com/2013/10/11/migrating-hive-from-one-hadoop-cluster-to-another-cluster-2/
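The URI rewrite in the steps above can be done with sed instead of a text editor. In this hedged sketch, the host names, port, database name, and file names are illustrative assumptions only; the mysqldump/mysql steps are shown as comments because they need the real metastore hosts:

```shell
# Hedged sketch of the metastore migration steps; all names are placeholders.

# (Step 1, on the source cluster's metastore host -- not run here:)
#   mysqldump -u root -p metastore > metastore_dump.sql

# A stand-in dump line so the rewrite below can be demonstrated locally:
printf 'LOCATION "hdfs://old-nn:8020/user/hive/warehouse/student1"\n' > metastore_dump.sql

# Step 2: swap the old NameNode URI for the new one in every table LOCATION.
sed 's|hdfs://old-nn:8020|hdfs://new-nn:8020|g' metastore_dump.sql > metastore_dump_new.sql

# (Step 3, on the destination cluster's metastore host -- not run here:)
#   mysql -u root -p metastore < metastore_dump_new.sql
```

Using `|` as the sed delimiter avoids having to escape the slashes in the HDFS URIs.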



Source: https://stackoverflow.com/questions/31502364/sqoop-import-from-hive-to-hive
