Question
I am trying to export data from HDFS to MySQL through Sqoop. I can run Sqoop from the shell and it works fine, but when I invoke it through Oozie it raises the following error and fails. I have also included the jars. There is no descriptive log.
Sqoop script:
export --connect jdbc:mysql://localhost/bigdata --username root --password cloudera --verbose --table AGGREGATED_METRICS --input-fields-terminated-by '\0001' --export-dir /bigdata/aggregated_metrics
error:
Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
Answer 1:
The errors you see in Oozie usually do not provide much detail. To find out what went wrong, take the job_id from the Oozie action and search for it in the JobTracker logs; there you will find a more detailed description.
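As a sketch of that lookup, assuming the Oozie CLI is installed and pointed at a local server (the workflow ID below is a placeholder, as is the JobTracker host):

```shell
# List the workflow's actions; the "Ext ID" column holds the Hadoop job id
# (e.g. job_201207121234_0042) launched for the sqoop action.
oozie job -oozie http://localhost:11000/oozie -info 0000001-120712101558778-oozie-oozi-W

# Search for that job id in the JobTracker web UI (default port 50030,
# e.g. http://jobtracker-host:50030) and open the logs of the failed
# task attempt to see the real stack trace.
```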
Answer 2:
I believe you need to install the Sqoop jars, with all of their dependencies, into Oozie (either into the shared library directory or into your particular workflow's lib directory).
Answer 3:
I had the same problem. It went away when I added the mysql-connector-java.jar library to the lib directory inside the Oozie project root directory, alongside the job.properties and workflow.xml files.
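For reference, the layout described above looks roughly like this (the application directory name is an assumption); jars under lib/ are added to the action's classpath automatically:

```
sqoop-export-app/                  # Oozie application directory in HDFS
├── job.properties                 # usually kept locally, passed via `oozie job -config`
├── workflow.xml                   # the workflow definition with the sqoop action
└── lib/
    └── mysql-connector-java.jar   # JDBC driver picked up by the sqoop action
```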
Answer 4:
Yes, adding mysql-connector-java-*.jar to the workflow lib directory solves the problem, but it is tedious to copy the jar for every Sqoop job.
Adding mysql-connector-java-*.jar once to the share/lib/sqoop directory in HDFS is better.
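A sketch of that one-time setup, assuming the Oozie share library lives at the conventional /user/oozie/share/lib path (adjust to your installation; the driver version in the filename is illustrative):

```shell
# Copy the JDBC driver into the Oozie share library once, for all workflows.
hadoop fs -put mysql-connector-java-5.1.21.jar /user/oozie/share/lib/sqoop/

# Each workflow must then opt in to the share library in its job.properties:
#   oozie.use.system.libpath=true
```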
Answer 5:
The error message is not detailed enough here to diagnose. One other thing to try: in the workflow's sqoop action, use
\0001
instead of
'\0001'
(the single quotes are stripped by the shell when you run Sqoop manually, but Oozie passes them to Sqoop literally).
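One way to sidestep shell-quoting issues entirely is to pass each token of the command as its own <arg> element in the sqoop action. A sketch of such a workflow.xml fragment, with \0001 unquoted (the action name, the ok/error transitions, and the ${jobTracker}/${nameNode} properties are illustrative assumptions):

```xml
<action name="sqoop-export">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <arg>export</arg>
        <arg>--connect</arg>
        <arg>jdbc:mysql://localhost/bigdata</arg>
        <arg>--username</arg>
        <arg>root</arg>
        <arg>--password</arg>
        <arg>cloudera</arg>
        <arg>--table</arg>
        <arg>AGGREGATED_METRICS</arg>
        <arg>--input-fields-terminated-by</arg>
        <arg>\0001</arg>
        <arg>--export-dir</arg>
        <arg>/bigdata/aggregated_metrics</arg>
    </sqoop>
    <ok to="end"/>
    <error to="fail"/>
</action>
```

Because each <arg> is passed to Sqoop as-is, no shell ever interprets the value, so quoting the field terminator becomes unnecessary.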
Source: https://stackoverflow.com/questions/11555344/sqoop-export-fail-through-oozie