Migrate Hive table to Google BigQuery

Submitted by ☆樱花仙子☆ on 2020-02-11 08:52:52

Question


I am trying to design a data pipeline to migrate my Hive tables into BigQuery. Hive runs on an on-premise Hadoop cluster. This is my current design; it is actually very simple, just a shell script (a rough sketch follows the steps below):

for each table source_hive_table {

  • INSERT OVERWRITE TABLE target_avro_hive_table SELECT * FROM source_hive_table;
  • Move the resulting Avro files into Google Cloud Storage using distcp
  • Create the first BQ table: bq load --source_format=AVRO your_dataset.something something.avro
  • Handle any casting issues from BigQuery itself, by selecting from the table just written and handling each cast manually

}
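
Put together, the per-table loop might look like the sketch below. This is a minimal sketch, not the actual script: the table list (tables.txt), the <table>_avro staging-table naming, the bucket name, and the HDFS warehouse path are all assumptions, and writing to gs:// paths with distcp requires the Cloud Storage connector to be installed on the cluster.

    #!/usr/bin/env bash
    set -euo pipefail

    BUCKET="gs://my-staging-bucket"   # hypothetical staging bucket
    DATASET="your_dataset"            # hypothetical BigQuery dataset

    while read -r TABLE; do
      # 1. Rewrite the source table as Avro into a staging Hive table.
      hive -e "INSERT OVERWRITE TABLE ${TABLE}_avro SELECT * FROM ${TABLE};"

      # 2. Copy the resulting Avro files from HDFS to Cloud Storage
      #    (assumed warehouse path; adjust to your Hive configuration).
      hadoop distcp "hdfs:///user/hive/warehouse/${TABLE}_avro" \
                    "${BUCKET}/${TABLE}/"

      # 3. Load the Avro files into BigQuery.
      bq load --source_format=AVRO \
          "${DATASET}.${TABLE}" "${BUCKET}/${TABLE}/*.avro"
    done < tables.txt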

Do you think this makes sense? Is there a better way, perhaps using Spark? I am not happy with the way I am handling the casting; I would like to avoid creating the BigQuery table twice.


Answer 1:


Yes, your migration logic makes sense.

I personally prefer to do the CAST for specific types directly in the initial Hive query that generates your Avro data. For instance, the "decimal" type in Hive maps to the Avro type {"type":"bytes","logicalType":"decimal","precision":10,"scale":2}.

BQ will then take only the primitive type (here "bytes") and ignore the logicalType. That is why I find it easier to cast directly in Hive (here to "double"). The same problem occurs with the Hive "date" type.
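
A minimal sketch of that approach, assuming a hypothetical source table orders with a DECIMAL(10,2) column price and a DATE column order_date, plus a matching Avro-backed staging table orders_avro:

    # Do the casts inside the Avro-generating Hive query itself;
    # table and column names here are hypothetical.
    hive -e "
    INSERT OVERWRITE TABLE orders_avro
    SELECT order_id,
           CAST(price AS DOUBLE)      AS price,       -- decimal -> double
           CAST(order_date AS STRING) AS order_date   -- date -> string
    FROM orders;
    "

This way the Avro schema contains plain double and string fields, BigQuery loads them as FLOAT and STRING, and no second table or post-load cast is needed.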



Source: https://stackoverflow.com/questions/46958916/migrate-hive-table-to-google-bigquery
