Use airflow hive operator and output to a text file

Submitted by 一世执手 on 2020-01-14 22:26:02

Question


Hi, I want to execute a Hive query using the Airflow Hive operator and output the result to a file. I don't want to use INSERT OVERWRITE here.

hive_ex = HiveOperator(
    task_id='hive-ex',
    hql='/sql/hive-ex.sql',
    hiveconfs={
        'DAY': '{{ ds }}',
        'YESTERDAY': '{{ yesterday_ds }}',
        'OUTPUT': '{{ file_path }}.csv',
    },
    dag=dag
)

What is the best way to do this?

I know how to do this using the bash operator, but want to know if we can use the Hive operator:

hive_ex = BashOperator(
    task_id='hive-ex',
    bash_command=(
        'hive -f hive.sql --hiveconf DAY={{ ds }} '
        '>> {{ file_path }}/file_{{ ds }}.json'
    ),
    dag=dag
)

Answer 1:


Since this is a pretty custom use case, the best way is to extend the Hive operator (or create your own Hive2CSVOperator). The implementation depends on whether you access Hive through the CLI or through HiveServer2.

Hive CLI

I would first try configuring the Hive CLI connection and adding the hive_cli_params, as per the Hive CLI hook code, and if that doesn't work, extend the hook (which would give you access to everything).
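As a sketch of the CLI route: HiveCliHook.run_cli returns the CLI's output as a string, so a small wrapper can dump it to a file. This assumes Airflow 1.x-style imports and an already configured 'hive_cli_default' connection; adjust names to your environment.

```python
# Sketch only: assumes Airflow 1.x module layout and a configured
# 'hive_cli_default' connection.
from airflow.hooks.hive_hooks import HiveCliHook

def hive_to_file(hql, output_path):
    """Run a query through the Hive CLI and write its stdout to a file."""
    hook = HiveCliHook(hive_cli_conn_id='hive_cli_default')
    output = hook.run_cli(hql=hql)  # returns the CLI output as a string
    with open(output_path, 'w') as f:
        f.write(output)
```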

HiveServer2

There is a separate hook for this case (link). It is a bit more convenient because it has a get_results method (source) and a to_csv method (source).

The execute method in the operator code could then look similar to this:

def execute(self, context):
    ...
    self.hook = HiveServer2Hook(...)

    # to_csv lives on the hook itself (it opens its own connection),
    # not on the DB-API connection object
    self.hook.to_csv(hql=self.hql, csv_filepath=self.output_filepath, ...)
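Fleshed out, a minimal custom operator built on that hook could look like the sketch below. The class name, parameter names, and the 'hiveserver2_default' connection id are illustrative, and it assumes Airflow 1.x-style imports.

```python
# Sketch only: assumes Airflow 1.x module layout and a configured
# HiveServer2 connection.
from airflow.hooks.hive_hooks import HiveServer2Hook
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults

class Hive2CSVOperator(BaseOperator):
    """Run an HQL statement over HiveServer2 and write the result to CSV."""
    template_fields = ('hql', 'csv_filepath')  # allow {{ ds }} etc. in both

    @apply_defaults
    def __init__(self, hql, csv_filepath,
                 hiveserver2_conn_id='hiveserver2_default', *args, **kwargs):
        super(Hive2CSVOperator, self).__init__(*args, **kwargs)
        self.hql = hql
        self.csv_filepath = csv_filepath
        self.hiveserver2_conn_id = hiveserver2_conn_id

    def execute(self, context):
        hook = HiveServer2Hook(hiveserver2_conn_id=self.hiveserver2_conn_id)
        # to_csv fetches the results and writes the CSV in one call
        hook.to_csv(hql=self.hql, csv_filepath=self.csv_filepath)
```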



Answer 2:


You need Airflow hooks. See Hooks and HiveHook; there is a to_csv method, or you can use the get_records method and then write the file yourself.
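Since get_records returns plain Python rows (a list of tuples), the "do it yourself" route is just the standard csv module. A sketch, with hard-coded rows standing in for the hook's result:

```python
import csv

def records_to_csv(records, header, filepath):
    """Write rows (e.g. from HiveServer2Hook.get_records) to a CSV file."""
    with open(filepath, 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(records)

# Stand-in rows; in a real task these would come from hook.get_records(hql)
records_to_csv([(1, 'a'), (2, 'b')], ['id', 'val'], 'out.csv')
```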



Source: https://stackoverflow.com/questions/52322905/use-airflow-hive-operator-and-output-to-a-text-file
