Does Sqoop support dynamic partitioning with Hive?

白昼怎懂夜的黑 submitted on 2019-12-25 01:36:13

Question


Does Sqoop support dynamic partitioning with Hive? I tried the options below, but --hive-partition-key and --hive-partition-value only work for static partitioning.

For example:

sqoop import --connect "jdbc:mysql://quickstart.cloudera:3306/prac" --username root --password cloudera --hive-import --query "select id,name,ts from student where city='Mumbai' and \$CONDITIONS " --hive-partition-key city --hive-partition-value 'Mumbai' --hive-table prac.student --target-dir /user/mangesh/sqoop_import/student_temp5 --split-by id


Answer 1:


You can use HCatalog imports; this requires Sqoop 1.4.4 or later.

sqoop import \
  --connect "jdbc:oracle:SERVERDETAILS" \
  --username <User Name> \
  --password <Password> \
  --table <Database.TableName> \
  --fields-terminated-by ',' \
  --split-by <Column Name> \
  --hcatalog-database <Hive Database Name> \
  --hcatalog-table <Hive Table Name>

For more details, see:

http://sqoop.apache.org/docs/1.4.4/SqoopUserGuide.html#_sqoop_hcatalog_integration
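Applied to the question's MySQL example, a minimal sketch might look like the following. This is an assumption-laden illustration, not a verified recipe: it assumes the Hive table is pre-created with city as a partition column, that the source MySQL table student contains a city column, and that ORC is an acceptable storage format; adjust names and formats to your environment.

# Assumed setup: Hive table prac.student partitioned by city (hypothetical DDL).
hive -e "CREATE TABLE prac.student (id INT, name STRING, ts TIMESTAMP)
         PARTITIONED BY (city STRING) STORED AS ORC"

# HCatalog import: the partition value for each row is taken from its city
# column, so several partitions can be populated in a single run (dynamic
# partitioning), unlike --hive-partition-key/--hive-partition-value.
sqoop import \
  --connect "jdbc:mysql://quickstart.cloudera:3306/prac" \
  --username root \
  --password cloudera \
  --table student \
  --hcatalog-database prac \
  --hcatalog-table student \
  --split-by id

Because the partition keys are read from the data itself, no --hive-partition-key or --hive-partition-value flags are needed here.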



Source: https://stackoverflow.com/questions/46263147/does-sqoop-support-dynamic-partitioning-with-hive
