How to load and store nvarchar

Submitted by 自闭症网瘾萝莉.ら on 2019-12-23 05:47:07

Question


Stack: HDP-2.3.2.0-2950 installed using Ambari 2.1. The steps I am following:

  1. Load SQL server tables onto HDFS using Sqoop
  2. Create EXTERNAL tables in Hive

I didn't specify anything pertaining to charset/Unicode/UTF-8 while executing the sqoop import commands, and the import was successful.

While creating the Hive external table, I was wondering which data type to select for the nvarchar columns in the original SQL Server table. Now I am worried that this may also need to be addressed in Sqoop during the import.

  1. I couldn't find any relevant charset/nvarchar options for the Sqoop import
  2. In Hive, can varchar/string blindly be used in place of nvarchar?
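On the Hive side, there is no nvarchar type; Hive's string type holds Unicode text (Hive text files are read as UTF-8), so mapping nvarchar columns to string is the usual choice. A sketch of the external-table DDL, with hypothetical column names, types, and HDFS location:

```sql
-- Hypothetical table/column names and location, for illustration only.
-- SQL Server nvarchar columns map to Hive STRING; since Hive text data
-- is UTF-8, Unicode content survives the round trip.
CREATE EXTERNAL TABLE table_name (
  col1 STRING,   -- was nvarchar in SQL Server
  col2 INT,
  col3 STRING,
  col4 STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/hive/warehouse/table_name';
```

STRING avoids the truncation risk of Hive's varchar(n), which silently cuts values to n characters.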

Answer 1:


Sqoop does not understand the nvarchar type, so cast it to varchar in the import query, e.g.

select
    CAST(col1 AS varchar(8000)) AS col1,  -- give an explicit length: in SQL Server, CAST(... AS varchar) with no length defaults to 30 characters and silently truncates
    col2,
    col3,
    col4
from table_name
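Such a cast can be applied directly in a Sqoop free-form query import. A minimal sketch, assuming hypothetical server, database, table, and column names (none of these values come from the question):

```shell
# Hypothetical free-form query import; connection string, credentials,
# table and column names are placeholders. The CAST converts nvarchar to
# varchar before the data leaves SQL Server. The \$CONDITIONS token is
# required by Sqoop for splitting the query across mappers.
sqoop import \
  --connect "jdbc:sqlserver://sqlserver-host:1433;databaseName=mydb" \
  --username myuser -P \
  --query "SELECT CAST(col1 AS varchar(8000)) AS col1, col2, col3, col4 FROM table_name WHERE \$CONDITIONS" \
  --split-by col2 \
  --target-dir /user/hive/warehouse/table_name
```

With `--query`, Sqoop requires `--split-by` (or `-m 1`) because it cannot infer a primary key from a free-form query.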


来源:https://stackoverflow.com/questions/37033391/how-to-load-and-store-nvarchar
