kerberos

Will the HBase Kerberos token expire?

喜夏-厌秋 submitted on 2019-12-12 06:24:13
Question: I have a Spark Streaming application, and for every batch I need to insert data into an HBase cluster that is protected by Kerberos. I found a solution: on the driver side I create a connection, obtain a token from that connection, and pass it to the executors. On the executor side I decode it and get the token, and in this way I can insert data into HBase successfully. This seems to work, but my concern is: will the token expire? If so, how do I handle it? My code snippet is val ugi
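For reference, a minimal sketch of the driver-to-executor token handoff described above, using standard HBase/Hadoop APIs (TokenUtil, UserGroupInformation, Token); the principal, keytab path, and streaming plumbing are placeholders, not taken from the question. Note that the delegation token itself does expire: its maximum lifetime is bounded by HBase's hbase.auth.token.max.lifetime setting (on the order of days by default), so a long-running streaming job eventually has to obtain a fresh token, e.g. by re-running the driver-side step periodically.

  // Driver side: log in from a keytab, then ask HBase for a delegation token.
  import org.apache.hadoop.hbase.HBaseConfiguration
  import org.apache.hadoop.hbase.client.ConnectionFactory
  import org.apache.hadoop.hbase.security.token.{AuthenticationTokenIdentifier, TokenUtil}
  import org.apache.hadoop.security.UserGroupInformation
  import org.apache.hadoop.security.token.Token

  val hbaseConf = HBaseConfiguration.create()
  UserGroupInformation.setConfiguration(hbaseConf)
  UserGroupInformation.loginUserFromKeytab("user@EXAMPLE.COM", "/path/user.keytab")
  val conn = ConnectionFactory.createConnection(hbaseConf)
  val token = TokenUtil.obtainToken(conn)        // HBase delegation token
  val tokenStr = token.encodeToUrlString()       // plain string, safe to ship to executors
  conn.close()

  // Executor side: rebuild the token and attach it to the current user,
  // after which normal HBase client calls authenticate with it.
  val restored = new Token[AuthenticationTokenIdentifier]()
  restored.decodeFromUrlString(tokenStr)
  UserGroupInformation.getCurrentUser.addToken(restored)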

Configure Sentry to show/hide different databases for different users

前提是你 submitted on 2019-12-12 05:16:10
Question: I have a cluster running CDH 5.7.0 with the following setup: Hadoop with Kerberos, Hive with LDAP authentication, and Hive with Sentry authorization (rules stored in a JDBC Derby database). My goal is to restrict which databases users can see in my system. E.g.: User-A should only see database DB-A when executing show databases, and User-B should only see database DB-B when executing show databases. I followed the article https://blog.cloudera.com/blog/2013/12/how-to-get-started-with-sentry-in-hive/
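To make the goal concrete, the Sentry model that the linked article walks through expresses this with roles granted to groups; a rough sketch of the Beeline statements (role, group, and database names are invented for illustration, and the users must already map to those groups):

  CREATE ROLE role_a;
  GRANT ALL ON DATABASE db_a TO ROLE role_a;
  GRANT ROLE role_a TO GROUP group_a;

  CREATE ROLE role_b;
  GRANT ALL ON DATABASE db_b TO ROLE role_b;
  GRANT ROLE role_b TO GROUP group_b;

With Sentry authorization active in HiveServer2, show databases is filtered to the databases on which the caller's roles hold a privilege, which is exactly the show/hide behaviour described above.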

Bad HTTP response returned from the server. Code 500

╄→尐↘猪︶ㄣ submitted on 2019-12-12 04:52:29
Question: I have a problem using pywinrm on Linux to get a PowerShell session. I have read several posts and questions about this, but none of them solves my problem. The error is in the Kerberos authentication. This is my krb5.conf:

  [libdefaults]
  default_realm = DOMAIN.COM.BR
  ticket_lifetime = 24000
  clock-skew = 300
  dns_lookup_kdc = true

  # [realms]
  # LABCORP.CAIXA.GOV.BR = {
  # kdc = DOMAIN.COM.BR
  # kdc = DOMAIN.COM.BR
  # admin_server = DOMAIN.COM.BR
  # default_domain =

Java SSO using SPNEGO

一个人想着一个人 submitted on 2019-12-12 04:35:03
Question: I'm a newbie in this topic. I need help implementing Java SSO authentication for a web application running on Tomcat 6.0.29. I have read about SPNEGO and tried the examples helloKDC.java and hello_spnego.jsp at http://spnego.sourceforge.net/, which worked well, but I don't know what steps I have to follow to implement the solution. I.e., is it enough to get the name of the remote user, or do I have to do something else to verify the user's identity and keep it in the session of the current user? Source: https:/
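As a rough orientation (not taken from the question itself): the spnego.sourceforge.net library is normally wired into the web application as a servlet filter, so the step after the helloKDC test is usually a web.xml fragment along these lines. The class name is the library's documented filter; the init-params shown are only a subset of the ones the library requires, and the account name and file locations are placeholders:

  <filter>
    <filter-name>SpnegoHttpFilter</filter-name>
    <filter-class>net.sourceforge.spnego.SpnegoHttpFilter</filter-class>
    <init-param>
      <param-name>spnego.krb5.conf</param-name>
      <param-value>krb5.conf</param-value>
    </init-param>
    <init-param>
      <param-name>spnego.login.conf</param-name>
      <param-value>login.conf</param-value>
    </init-param>
    <init-param>
      <param-name>spnego.preauth.username</param-name>
      <param-value>tomcat-service-account</param-value>
    </init-param>
    <!-- further required params (login modules, basic-auth fallback, etc.) omitted -->
  </filter>
  <filter-mapping>
    <filter-name>SpnegoHttpFilter</filter-name>
    <url-pattern>/*</url-pattern>
  </filter-mapping>

Once the filter has negotiated the ticket, request.getRemoteUser() returns the authenticated Kerberos principal; the identity has already been verified against the KDC, so the application only needs to map that name onto its own users and keep it in the HTTP session.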

SPNEGO on IBM WebSphere Portal 6.1 with https

余生颓废 submitted on 2019-12-12 04:27:47
Question: I configured IBM WebSphere Portal 6.1 on WAS 7: SPNEGO, SSL with a self-signed certificate, the default HTTP transport (without a web server), and changed the default ports 10039 and 10029 to 80 and 443. After that, SPNEGO works fine over http, but over https the standard login form is displayed. Where might the mistake be? Answer 1: Did you take a look at this document: WebSphere Portal Windows SSO w/SPNEGO. Mapping the user to the Kerberos Service Principal Name (SPN): when you run the setspn and the ktpass commands there should
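For context, and without knowing the rest of the truncated answer, the SPN mapping it refers to is usually created on the Active Directory side with commands roughly like the following (host name, realm, and service account are placeholders):

  setspn -A HTTP/portal.example.com SVC_PORTAL
  ktpass /princ HTTP/portal.example.com@EXAMPLE.COM /mapuser SVC_PORTAL /pass <password> /out portal_http.keytab /ptype KRB5_NT_PRINCIPAL

The SPN is tied to the fully qualified host name the browser requests, not to the port or scheme, so one thing worth checking is that the https URL is reached under exactly the same host name that the SPN and the server-side SPNEGO configuration were set up for.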

How to run a Spark test from IntelliJ (or other IDE)

时光总嘲笑我的痴心妄想 submitted on 2019-12-12 04:22:37
Question: I am trying to create a test for some Spark code. The following code fails when getting a SparkSession object. NOTE: the test runs fine when run from the CLI: gradle my_module:build

  @Test def myTest(): Unit = {
    val spark = SparkSession.builder().master("local[2]").getOrCreate()
    ...
  }

Error:

  java.lang.IllegalArgumentException: Can't get Kerberos realm
  ...
  Caused by: java.lang.reflect.InvocationTargetException
  ...
  Caused by: KrbException: Cannot locate default realm

My set-up: IntelliJ +
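One workaround commonly suggested for this in-IDE symptom (a sketch, not necessarily the fix accepted for this question): give the JVM an explicit Kerberos realm before Spark's Hadoop classes initialize, either by pointing at a krb5.conf or, for a purely local test that needs no real Kerberos, by setting the standard JDK realm/KDC system properties. The property names below are the stock java.security.krb5.* ones; the values are placeholders.

  // In the test, before building the SparkSession:
  System.setProperty("java.security.krb5.conf", "/etc/krb5.conf")
  // ...or, when no KDC is actually needed for a local[2] run:
  System.setProperty("java.security.krb5.realm", "EXAMPLE.COM")
  System.setProperty("java.security.krb5.kdc", "kdc.example.com")

  val spark = SparkSession.builder().master("local[2]").getOrCreate()

The fact that the Gradle CLI build works while IntelliJ fails usually points to an environment difference between the two JVMs (for example which krb5.conf, if any, gets picked up), so adjusting the IDE run configuration's environment is the other common route.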

Renewing a connection to Apache Phoenix (using Kerberos) fails after exactly 10 hours

喜欢而已 submitted on 2019-12-12 04:16:07
Question: I have a Java application that can run SQL select statements against Apache Phoenix. For this I'm using a principal with a keytab to create the connection. This is the class that supports the connection:

  public class PhoenixDriverConnect {
      private static Connection conn;
      private static final Logger logger = LoggerFactory.getLogger(PhoenixDriverConnect.class);

      private PhoenixDriverConnect(String DB_URL) {
          GetProperties getProperties = new GetProperties();
          try {
              Class.forName
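The 10-hour figure matches a typical Kerberos ticket lifetime, so the usual suspicion with this symptom is that the TGT obtained at start-up expires and is never refreshed. Below is a minimal sketch of the keytab-based relogin that is commonly added before each (re)connection attempt, written in Scala for brevity with a placeholder principal and keytab path; it uses only standard Hadoop UserGroupInformation calls and is not the code from the question:

  import org.apache.hadoop.conf.Configuration
  import org.apache.hadoop.security.UserGroupInformation

  // Once, at application start-up: log in from the keytab.
  val conf = new Configuration()
  conf.set("hadoop.security.authentication", "kerberos")
  UserGroupInformation.setConfiguration(conf)
  UserGroupInformation.loginUserFromKeytab("user@EXAMPLE.COM", "/path/user.keytab")

  // Before each Phoenix connection or query: re-login from the keytab if the TGT
  // is close to expiring, so connections created after hour 10 still authenticate.
  UserGroupInformation.getLoginUser.checkTGTAndReloginFromKeytab()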

How to successfully make a hive jdbc call inside a mapper in MR job where the cluster is secured by Kerberos

|▌冷眼眸甩不掉的悲伤 submitted on 2019-12-12 04:08:59
Question: I am writing a utility that is a MapReduce job where the reducer makes calls to various databases, Hive being one of them. Our cluster is kerberized. I am doing kinit before kicking off the MR job, but when the reducer runs it fails with the error "No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)". This indicates that it doesn't have a valid ticket. I tried to get a delegation token for the Hive service in the MR driver, but it failed because the Hive service
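For reference, the delegation-token approach the question is attempting usually looks roughly like the sketch below: the driver (which still holds the kinit ticket) asks HiveServer2 for a delegation token over JDBC and stores it in the job's Credentials, so the reducers can authenticate without a TGT. This is a hedged sketch in Scala; the JDBC URL, principal, and credential alias are placeholders, and the exact alias the task-side client expects varies by Hive version.

  import java.sql.DriverManager
  import org.apache.hadoop.io.Text
  import org.apache.hadoop.mapreduce.Job
  import org.apache.hadoop.security.UserGroupInformation
  import org.apache.hadoop.security.token.{Token, TokenIdentifier}
  import org.apache.hive.jdbc.HiveConnection

  // Driver side, while the kinit ticket is still valid: ask HiveServer2 for a
  // delegation token over JDBC.
  Class.forName("org.apache.hive.jdbc.HiveDriver")
  val url = "jdbc:hive2://hs2.example.com:10000/default;principal=hive/_HOST@EXAMPLE.COM"
  val hiveConn = DriverManager.getConnection(url).asInstanceOf[HiveConnection]
  val owner = UserGroupInformation.getCurrentUser.getShortUserName
  val tokenStr = hiveConn.getDelegationToken(owner, "hive/_HOST@EXAMPLE.COM")
  hiveConn.close()

  // Store it in the job's Credentials so every task inherits it. The alias text
  // is a placeholder; it must match whatever the task-side Hive client looks up.
  val job = Job.getInstance()
  val hiveToken = new Token[TokenIdentifier]()
  hiveToken.decodeFromUrlString(tokenStr)
  job.getCredentials.addToken(new Text("hive.delegation.token.alias"), hiveToken)

On the reducer side the JDBC URL then typically carries ;auth=delegationToken instead of the principal, so the driver's token is used rather than a Kerberos TGT.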

Accessing a kerberized remote HBase cluster from Spark

不问归期 submitted on 2019-12-12 03:47:48
Question: I'm attempting to read data from a kerberized HBase instance from Spark using the Hortonworks SPARK-ON-HBASE connector. My cluster configuration essentially looks like this: I am submitting my Spark jobs from a client machine to a remote Spark standalone cluster, and that job is attempting to read data from a separate HBase cluster. If I bypass the standalone cluster by running Spark with master=local[*] directly on my client, I can access the remote HBase cluster with no problem as long as I
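For background (a sketch under assumptions, not the connector's own mechanism): a Spark standalone cluster, unlike YARN, has no facility for shipping Kerberos delegation tokens to executors, which is why the local[*] run can use the client's own ticket while the standalone submission cannot. One workaround is to have every executor authenticate itself from a keytab before touching HBase; rdd below stands for whatever RDD is being processed, and the principal and keytab path are placeholders:

  import org.apache.hadoop.hbase.HBaseConfiguration
  import org.apache.hadoop.hbase.client.ConnectionFactory
  import org.apache.hadoop.security.UserGroupInformation

  rdd.foreachPartition { partition =>
    // Executor side: log in before any HBase client call.
    val hbaseConf = HBaseConfiguration.create()
    UserGroupInformation.setConfiguration(hbaseConf)
    UserGroupInformation.loginUserFromKeytab("user@EXAMPLE.COM", "/local/path/user.keytab")
    val conn = ConnectionFactory.createConnection(hbaseConf)
    try {
      partition.foreach { row =>
        // HBase reads/writes for `row` go here, using conn
      }
    } finally {
      conn.close()
    }
  }

The keytab then has to be present on every worker (for example distributed with spark-submit --files), which is a security trade-off compared with token-based approaches.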

How to do Kerberos authentication on a Flink standalone installation?

十年热恋 submitted on 2019-12-12 03:26:17
Question: I have a standalone Flink installation on top of which I want to run a streaming job that writes data into an HDFS installation. The HDFS installation is part of a Cloudera deployment and requires Kerberos authentication in order to read and write HDFS. Since I found no documentation on how to make Flink connect to a Kerberos-protected HDFS, I had to make some educated guesses about the procedure. Here is what I did so far: I created a keytab file for my user. In my Flink job, I added
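As a pointer for anyone hitting the same gap in the documentation: in more recent Flink releases this is handled through configuration rather than code in the job. A sketch of the relevant flink-conf.yaml entries follows (the keys are Flink's documented Kerberos settings, introduced around Flink 1.2, so they may postdate the setup in the question; principal and keytab path are placeholders):

  security.kerberos.login.keytab: /path/to/user.keytab
  security.kerberos.login.principal: user@EXAMPLE.COM

On a standalone cluster the keytab has to be readable at that path on every node; Flink then logs in from it at start-up and its Hadoop filesystem connectors, including the HDFS sinks, pick up the credentials without any Kerberos handling inside the job itself.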