Unable to write data to Hive using Spark

Submitted by 拟墨画扇 on 2019-12-11 17:09:41

Question


I am using Spark 1.6 and creating a HiveContext from the SparkContext. When I save data into Hive, I get an error. I am using the Cloudera VM: Hive runs inside the VM while Spark runs on my own system, and I can reach the VM by its IP address. I have started the Thrift server and HiveServer2 on the VM, and I have used the Thrift server URI for hive.metastore.uris.

import org.apache.spark.sql.SaveMode
import org.apache.spark.sql.hive.HiveContext

// HiveContext on top of the existing SparkContext (sc), pointed at the VM's metastore
val hiveContext = new HiveContext(sc)
hiveContext.setConf("hive.metastore.uris", "thrift://IP:9083")
............
............
// Append the DataFrame into the existing Hive table "test"
df.write.mode(SaveMode.Append).insertInto("test")

I get the following error:

FAILED: SemanticException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
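
For reference, a minimal self-contained version of this flow is sketched below; the DataFrame construction and the schema of the existing "test" table are assumptions, since that part of the original code is elided.

import org.apache.spark.SparkContext
import org.apache.spark.sql.SaveMode
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext()   // configured via spark-submit; spark-shell provides sc already
val hiveContext = new HiveContext(sc)
hiveContext.setConf("hive.metastore.uris", "thrift://IP:9083")

// Assumed sample data; the real DataFrame is built in the elided part of the original code
import hiveContext.implicits._
val df = sc.parallelize(Seq((1, "a"), (2, "b"))).toDF("id", "value")

// Appends into an existing Hive table named "test" whose schema matches the DataFrame
df.write.mode(SaveMode.Append).insertInto("test")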

Answer 1:


Most likely hive-site.xml is not available inside the Spark conf folder. I have added the details below.

Add hive-site.xml to the Spark configuration folder by creating a symlink that points to hive-site.xml in the Hive configuration folder:

sudo ln -s /usr/lib/hive/conf/hive-site.xml /usr/lib/spark/conf/hive-site.xml

After the above steps, restarting spark-shell should help.
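
Once spark-shell has restarted with hive-site.xml in place, a quick sanity check is to confirm that the session can see the metastore's databases and the target table. A minimal sketch, assuming the same table name "test" as in the question:

import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)   // sc is provided by spark-shell
// Should list the databases from the Cloudera metastore, not just "default"
hiveContext.sql("SHOW DATABASES").show()
// The target table should appear here before attempting the write
hiveContext.tables().show()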



Source: https://stackoverflow.com/questions/46197712/unable-to-write-data-on-hive-using-spark
