Question
Does Sqoop support dynamic partitioning with Hive? I tried the options --hive-partition-key and --hive-partition-value, but they only work for static partitioning.
For example: sqoop import --connect "jdbc:mysql://quickstart.cloudera:3306/prac" --username root --password cloudera --hive-import --query "select id,name,ts from student where city='Mumbai' and \$CONDITIONS " --hive-partition-key city --hive-partition-value 'Mumbai' --hive-table prac.student --target-dir /user/mangesh/sqoop_import/student_temp5 --split-by id
Answer 1:
You can use HCatalog imports, which support dynamic partitioning; you need Sqoop 1.4.4 or later for this.
sqoop import \
--connect "jdbc:oracle:SERVERDETAILS" \
--username <User Name> \
--password <Password> \
--table <Database.TableName> \
--fields-terminated-by ',' \
--split-by <Column Name> \
--hcatalog-database <Hive Database Name> \
--hcatalog-table <Hive Table Name>
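Adapting this to the MySQL example from the question, a sketch of the HCatalog variant might look like the following. This assumes the Hive table prac.student already exists and is declared as PARTITIONED BY (city STRING); with HCatalog, the partition value for each row is taken from the city column in the query output, so every distinct city lands in its own partition (dynamic partitioning) and no --hive-partition-key/--hive-partition-value flags are needed.

```shell
# Hypothetical adaptation of the question's import to HCatalog.
# Assumes prac.student is defined in Hive as PARTITIONED BY (city STRING);
# the partition column must appear in the SELECT list.
sqoop import \
  --connect "jdbc:mysql://quickstart.cloudera:3306/prac" \
  --username root \
  --password cloudera \
  --query "select id, name, ts, city from student where \$CONDITIONS" \
  --split-by id \
  --hcatalog-database prac \
  --hcatalog-table student
```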
For more details, see:
http://sqoop.apache.org/docs/1.4.4/SqoopUserGuide.html#_sqoop_hcatalog_integration
Source: https://stackoverflow.com/questions/46263147/does-sqoop-support-dynamic-partitioning-with-hive