Kinit with Spark when connecting to Hive

Submitted by 风格不统一 on 2019-12-24 06:13:40

Question


I am trying to connect to Hive (the Hadoop cluster uses Kerberos authentication) from Spark, which is running standalone.

Can someone let me know how to do kinit in a Spark program so that I can connect to Hive?

UPDATE: My Spark runs on a different cluster from Hadoop.


Answer 1:


Assuming you have a spark-shell open and you don't want to exit it and then re-kinit, you could do something like this:

import java.io.PrintWriter

// Reset your Kerberos login (destroys any existing ticket cache)
val p1 = Runtime.getRuntime.exec("kdestroy")
p1.waitFor

// Execute kinit and feed it the password on its stdin
val p = Runtime.getRuntime.exec("kinit")
val stdin = p.getOutputStream
val pw = new PrintWriter(stdin)
val pwd = get_password() // get_password() is your own function that reads the password from a file, or wherever
pw.println(pwd) // you could hard-code your password here, but plain-text passwords are generally frowned upon
pw.close()
p.waitFor
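A variant of the same shell-out idea that avoids handling the password in your program at all is to call kinit with a keytab. This is a minimal sketch, assuming kinit is on the PATH and reusing the placeholder principal and keytab path from the other answer:

```scala
import scala.sys.process._

// -kt tells kinit to authenticate non-interactively from a keytab
// (principal and keytab path below are placeholders)
val exitCode = Seq(
  "kinit", "-kt", "/path/to/key/mykey.keytab", "myuser@KERBEROS.SERVER.COM"
).!

if (exitCode != 0)
  sys.error(s"kinit failed with exit code $exitCode")
```

Because no password crosses stdin, this version is also easier to run from a scheduled job that periodically renews the ticket.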



Answer 2:


You should run the kinit command before starting the spark-shell or submitting a Spark app (spark-submit). The other option is to get the Kerberos ticket using a keytab and run your Spark program like this:

bash -c "kinit -kt /path/to/key/mykey.keytab myuser@KERBEROS.SERVER.COM; spark-shell"
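If the Hadoop client libraries are already on your Spark classpath, an alternative to running kinit externally is to log in from the keytab programmatically via Hadoop's UserGroupInformation API, before creating your Hive context. A minimal sketch, assuming the same placeholder principal and keytab path as the command above:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation

val conf = new Configuration()
conf.set("hadoop.security.authentication", "kerberos")
UserGroupInformation.setConfiguration(conf)

// Acquire the Kerberos ticket from the keytab instead of shelling out to kinit
// (principal and keytab path below are placeholders)
UserGroupInformation.loginUserFromKeytab(
  "myuser@KERBEROS.SERVER.COM",
  "/path/to/key/mykey.keytab"
)
```

This keeps the login inside the Spark driver process, which can be convenient when Spark runs on a different cluster from Hadoop, as in the question.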

Regards,



Source: https://stackoverflow.com/questions/43722598/kinit-with-spark-when-connecting-to-hive
