Unable to use an existing Hive permanent UDF from Spark SQL

走远了吗 · Submitted on 2019-12-02 23:47:07

The issue is that Spark 2.0 cannot execute functions whose JARs are located on HDFS.
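For context, a minimal sketch of the setup that triggers this; the function name, class, jar path, and the some_column/some_table identifiers are illustrative placeholders, not taken from the original report:

import org.apache.spark.sql.SparkSession

// The permanent function is assumed to have been created in Hive beforehand,
// with its jar on HDFS, e.g.:
//   CREATE FUNCTION functionName AS 'com.test.HiveUDF'
//   USING JAR 'hdfs:///user/home/dir1/functions.jar';

val spark = SparkSession.builder()
  .appName("PermanentHiveUdfIssue")
  .enableHiveSupport()
  .getOrCreate()

// On Spark 2.0 this call fails: the HDFS-hosted jar is not loaded onto
// the classpath, so the UDF class cannot be resolved.
spark.sql("SELECT functionName(some_column) FROM some_table").show()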

See also: Spark SQL: Thriftserver unable to run a registered Hive UDTF

One workaround is to define the function as a temporary function in the Spark job, with the jar path pointing to a path on the local edge node, and then call the function within that same Spark job:

CREATE TEMPORARY FUNCTION functionName AS 'com.test.HiveUDF' USING JAR '/user/home/dir1/functions.jar'
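As a minimal sketch, the same workaround issued from within a Spark job via spark.sql, reusing the placeholders above (some_column and some_table are assumed names):

// spark: an existing SparkSession with Hive support, as in the sketch above.
// Register the UDF as a temporary function from a local edge-node path,
// then call it within the same job/session.
spark.sql(
  "CREATE TEMPORARY FUNCTION functionName AS 'com.test.HiveUDF' " +
  "USING JAR '/user/home/dir1/functions.jar'")
spark.sql("SELECT functionName(some_column) FROM some_table").show()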
sun_007

This will work in a Spark-on-YARN environment; however, as suggested, you need to launch spark-shell --jars <path-to-your-hive-udf>.jar with the jar on the local filesystem, not on HDFS.
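A sketch of that flow, assuming the jar lives on the local filesystem of the node launching the shell (the path is a placeholder); with the jar already on the classpath via --jars, the USING JAR clause can typically be omitted:

// Launch the shell with the jar taken from the LOCAL filesystem, not hdfs://
//   spark-shell --jars /local/path/to/functions.jar
// Inside the shell, the UDF class is on the classpath, so registration
// no longer needs a jar reference:
spark.sql("CREATE TEMPORARY FUNCTION functionName AS 'com.test.HiveUDF'")
spark.sql("SELECT functionName(some_column) FROM some_table").show()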
