I have two Hive scripts which look like this:
Script A:
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;
You can store these configuration parameters in a common file and load them in each of your scripts using the source command:
source /tmp/common_init.hql;
You can also generate this file for each workflow from the database.
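As a minimal sketch: the path /tmp/common_init.hql comes from this answer, the SET statements are the ones from the question, and the table names in the final query are illustrative placeholders, not anything from the original scripts:

-- /tmp/common_init.hql: shared configuration sourced by every script
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

-- script_A.hql: pull in the shared settings first, then run the script body
source /tmp/common_init.hql;
-- placeholder query; 'target' and 'staging' are hypothetical table names
INSERT OVERWRITE TABLE target PARTITION (dt) SELECT * FROM staging;

If the settings differ per workflow, the generated init file is the only thing that changes; the scripts themselves stay untouched.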
You should be able to use hive -i config.hql -f script_A.hql, where config.hql would contain your dynamic settings. The -i flag lets you pass an initialization script that is executed before the actual Hive file passed to -f. I'm not super familiar with how AWS kicks off Hive jobs in steps, but presumably you can edit the step's submission arguments to include the -i option.
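As a sketch, assuming config.hql contains the SET statements from the question, the invocation from a shell would be:

# config.hql runs first, then script_A.hql executes with those settings in effect
hive -i config.hql -f script_A.hql

On EMR the same effect would presumably come from adding -i config.hql to the Hive step's arguments, but the exact syntax depends on how the step is submitted.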