kerberos

Spark Structured Streaming with secured Kafka throwing : Not authorized to access group exception

Submitted by 扶醉桌前 on 2020-01-24 07:43:24
Question: In order to use structured streaming in my project, I am testing Spark 2.2.0 and Kafka 0.10.1 integration with Kerberos on my Hortonworks 2.6.3 environment, running the sample code below to check the integration. I am able to run the program in IntelliJ in Spark local mode with no issues, but when the same program is moved to yarn cluster/client mode on the Hadoop cluster it throws the exception below. I know I can configure a Kafka ACL for the group id, but Spark structured streaming generates new…
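
Because each structured-streaming query invents a fresh `spark-kafka-source-<uuid>` consumer group, a literal group ACL cannot cover it; the usual workaround on Kafka 0.10.x is a wildcard group ACL. A hedged sketch — the principal name, topic, and ZooKeeper host are placeholders, not taken from the question:

```shell
# Allow the Spark principal to consume the topic under ANY consumer group,
# since the auto-generated spark-kafka-source-<uuid> group ids cannot be
# enumerated in advance (User:spark, test-topic, zk-host are illustrative).
kafka-acls.sh --authorizer-properties zookeeper.connect=zk-host:2181 \
  --add --allow-principal User:spark \
  --consumer --topic test-topic --group '*'
```

This is an admin fragment that needs a live secured Kafka cluster; run it as a Kafka superuser.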

A single-node KDC: a Kerberos-authenticated HDFS development environment

Submitted by 喜欢而已 on 2020-01-24 04:36:42
For development I needed an HDFS environment with Kerberos authentication that approximates production. A single-node HDFS is simple, but adding Kerberos is not: the configuration is complex, and Java clients easily fail authentication. So here is a quick recipe for standing up a Kerberos-authenticated HDFS environment for development and testing. Start from a CentOS 6.10 minimal install and install Kerberos first: yum -y install krb5-libs krb5-server krb5-workstation ; echo '192.168.127.131 myli' >> /etc/hosts # hostname — use the machine's IP, not 127.x ; echo '192.168.127.131 kerberos.example.com' >> /etc/hosts ; kdb5_util create -r EXAMPLE.COM -s # in another terminal, run cat /dev/sda > /dev/urandom to feed the entropy pool and speed this up; set a new password ; kadmin.local -q "addprinc admin/admin" # the administrator; set a new password ; /etc/init.d/krb5kdc start ; /etc/init.d/kadmin start ; kadmin.local -q 'addprinc…
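
Once the KDC is running, the Java-client side usually needs a service principal with a keytab rather than a password. A minimal sketch of that next step, assuming the `myli` host and `EXAMPLE.COM` realm from the recipe above (principal name and keytab path are illustrative):

```shell
# Create a service principal with a random key, export it to a keytab,
# then authenticate non-interactively and inspect the ticket cache.
kadmin.local -q 'addprinc -randkey hdfs/myli@EXAMPLE.COM'
kadmin.local -q 'ktadd -k /etc/security/keytabs/hdfs.keytab hdfs/myli@EXAMPLE.COM'
kinit -kt /etc/security/keytabs/hdfs.keytab hdfs/myli@EXAMPLE.COM
klist  # the cache should now hold a krbtgt/EXAMPLE.COM@EXAMPLE.COM ticket
```

These are admin commands against the live KDC built above and must run on that host as root.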

ASP.NET delegation

Submitted by 流过昼夜 on 2020-01-23 08:11:11
Question: I am making a .NET Web API that gets data by calling a SQL Server. The user is authenticated via Windows Authentication (Kerberos). I would like the user's credentials to be passed to the SQL Server via delegation, but the SQL Server sees an anonymous user. This is what I have done: IIS application: Windows Authentication and ASP.NET impersonation enabled. Anonymous and forms authentication disabled. "Enable kernel-mode authentication" is checked. Providers: Negotiate, Kerberos. Use app pool
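
When the SQL Server sees an anonymous user despite this setup, a common cause is a missing or duplicate SPN, which makes authentication silently fall back to NTLM so the credential cannot hop. A hedged sketch of the standard check (host names and the service account are placeholders, not from the question):

```shell
# Register the SQL Server SPN on its service account (-S refuses duplicates),
# then query it to confirm exactly one registration exists.
setspn -S MSSQLSvc/sqlhost.corp.example.com:1433 CORP\sqlServiceAccount
setspn -Q MSSQLSvc/sqlhost.corp.example.com:1433
```

Run in an elevated prompt on a domain-joined machine; the app-pool account must additionally be trusted for delegation to that SPN in Active Directory.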

Configuring Kerberos authentication and authorization for Spark SQL Thrift Server

Submitted by 血红的双手。 on 2020-01-21 21:55:28
  Please credit the source when reposting: http://www.cnblogs.com/xiaodf/   A previous post described implementing authentication and authorization for HiveServer2 with Kerberos + Sentry; this post covers authentication and authorization when operating on Hive databases over JDBC through Spark SQL. ThriftServer is a JDBC/ODBC interface: users connect to it over JDBC/ODBC to query SparkSQL data. On startup, ThriftServer launches a SparkSQL application whose resources are shared by all JDBC/ODBC clients, so different users can share data; it also opens a listener that waits for JDBC client connections and submitted queries. Configuring ThriftServer therefore requires at least its hostname and port, and, if Hive data is used, the Hive metastore URIs. Prerequisites — the experiments in this post assume: (1) CDH has Kerberos authentication enabled and Sentry installed; (2) Hive permissions are controlled by the Sentry service; (3) HDFS ACL / Sentry permission synchronization is enabled, so permission changes made on Hive tables via SQL are propagated to the corresponding HDFS files.
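
The JDBC connection described above can be exercised from the command line with beeline. A minimal sketch, assuming an illustrative Thrift Server host, port, realm, and user (the server principal in the URL is the server's Kerberos identity, not the connecting user's):

```shell
# Obtain a ticket as the end user, then connect to the Kerberized
# Spark Thrift Server; host, port, realm, and user are placeholders.
kinit user1@EXAMPLE.COM
beeline -u "jdbc:hive2://thrift-host:10001/default;principal=hive/thrift-host@EXAMPLE.COM"
```

With Sentry in control, queries issued over this connection succeed or fail according to the roles granted to `user1`'s group.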

Kerberos: check sum failed issue

Submitted by 故事扮演 on 2020-01-16 19:23:09
Question: I am seeing a "KrbException: Checksum failed" exception. It looks like a Kerberos issue, but I am not able to figure it out. Any pointers on how to resolve it would be great! Thanks in advance. Machine details: lsb_release -a No LSB modules are available. Distributor ID: Ubuntu Description: Ubuntu 12.04.4 LTS Release: 12.04 java -version java version "1.7.0_55" OpenJDK Runtime Environment (IcedTea 2.4.7) (7u55-2.4.7-1ubuntu1~0.12.04.2) OpenJDK 64-Bit Server VM (build 24.51-b03, mixed mode) 2014-06-17 22…
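
"Checksum failed" under Java 7 frequently comes down to an encryption-type or key-version mismatch between the keytab and the KDC (for AES-256, the JCE unlimited-strength policy files must also be installed). A hedged diagnostic sketch — the keytab path and principal are placeholders:

```shell
# Show the enctypes and key version numbers stored in the keytab (-k keytab,
# -t timestamps, -e enctypes), then ask the KDC for the current kvno; if the
# two kvnos or enctype sets disagree, decryption fails with a checksum error.
klist -kte /etc/security/keytabs/service.keytab
kvno HTTP/host.example.com@EXAMPLE.COM
```

If the kvnos differ, re-export the keytab; if the keytab only holds aes256 keys, install the JCE unlimited-strength policy or permit a weaker enctype in krb5.conf.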

CDH 5.12.1 cluster installation and configuration

Submitted by ぐ巨炮叔叔 on 2020-01-16 08:46:49
CDH 5.12.1 & Kerberos installation and configuration. Environment: OS CentOS 7; JDK 1.8.144. Required packages and versions (for CentOS 7, download the following): from http://archive.cloudera.com/cm5/cm/5/ — cloudera-manager-centos7-cm5.12.1_x86_64.tar.gz; from http://archive.cloudera.com/cdh5/parcels/5.12.1/ — CDH-5.12.1-1.cdh5.12.1.p0.3-el7.parcel, CDH-5.12.1-1.cdh5.12.1.p0.3-el7.parcel.sha1, manifest.json. Host roles: 192.168.1.25 node5, Master: jdk, cloudera-manager, MySql, krb5kdc, kadmin; 192.168.1.21 node1, node: jdk, cloudera-manager; 192.168.1.22 node2, node: jdk, cloudera-manager; 192.168.1.23 node3, node: jdk, cloudera-manager; 192.168.1.24 node4, node: jdk, cloudera…
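
Before installing Cloudera Manager, the host inventory above must resolve by name on every node. A minimal sketch using the IPs and hostnames from the role table (the scp loop assumes passwordless root SSH, which is not stated in the excerpt):

```shell
# Append the cluster inventory to /etc/hosts, then copy it to every node.
cat >> /etc/hosts <<'EOF'
192.168.1.25 node5
192.168.1.21 node1
192.168.1.22 node2
192.168.1.23 node3
192.168.1.24 node4
EOF
for h in node1 node2 node3 node4; do scp /etc/hosts "$h":/etc/hosts; done
```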

Malformed PAC logon info on new KerberosToken

Submitted by 寵の児 on 2020-01-16 06:06:43
Question: I'm using the code here to get authentication information from a Kerberos token. In there I've configured the domainUsername and domainUserPassword and ran it as specified in the readme.md. Then, from a browser that is in the AD domain, I connect to http://server:8080/spnego and I see on the opened page my username@domain. The page should also contain the SIDs of the AD groups to which my user belongs. Looking at the server logs, I see: org.jaaslounge.decoding.DecodingException: Malformed…

Installing Kerberos on Phytium (飞腾)

Submitted by 五迷三道 on 2020-01-16 05:57:37
Prepare the packages: krb5-admin-server_1.13.2+dfsg-5ubuntu2.1_arm64.deb and krb5-kdc_1.13.2+dfsg-5ubuntu2.1_arm64.deb. Build the local repository: dpkg-scanpackages -t deb . | gzip -9c > Packages.gz ; apt update. Install: apt install krb5-kdc krb5-admin-server. Configure: vim /etc/krb5.conf ; vim /etc/krb5kdc/kdc.conf ; vim /etc/krb5kdc/kadm5.acl. Create the database: kdb5_util create -s -r BIGDATA. Create the admin account: kadmin.local -q "addprinc -pw admin admin/admin". Start the services: service krb5-kdc start ; service krb5-admin-server start. Source: CSDN; author: 建康; link: https://blog.csdn.net/qq_29989725/article/details/103992929
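
The steps above can be verified end to end by obtaining a TGT as the admin principal just created. A minimal sketch, using the BIGDATA realm and the `admin/admin` principal from the post:

```shell
# Authenticate against the new KDC; the password is "admin" as set above
# with addprinc -pw, then inspect the resulting ticket cache.
kinit admin/admin@BIGDATA
klist  # should show a krbtgt/BIGDATA@BIGDATA ticket if the KDC is healthy
```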

Accessing Kerberos-authenticated HDFS files from Java

Submitted by 喜夏-厌秋 on 2020-01-15 21:07:59
Kerberos is a computer-network authentication protocol that provides secure identity authentication over insecure networks. For the purpose and inner workings of Kerberos in Hadoop's HDFS access, please consult the relevant documentation. The first time I used Kerberos to access HDFS on a project I knew nothing about it; I spent a long time digging through material and fell into quite a few pits, so I'm sharing the result here to save others the trouble. The code below has been tested in a real project and works. The code is as follows: package zqmKerberos; import org.apache.hadoop.conf.Configuration; import org.apache.hadoop.fs.FileStatus; import org.apache.hadoop.fs.FileSystem; import org.apache.hadoop.security.UserGroupInformation; import java.io.BufferedReader; import java.io.IOException; import java.io.InputStream; import java.io.InputStreamReader; import java.text.SimpleDateFormat; import java.util.Date; import java.util.HashMap; import java.util.UUID;
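
Before debugging the Java client, it is worth confirming that the same keytab and principal work from the command line; if this fails, the Java `UserGroupInformation` login will fail too. A minimal sketch with placeholder keytab path, principal, and HDFS path:

```shell
# Authenticate with the keytab the Java code will use, then read HDFS
# as that principal to prove the credentials and HDFS ACLs are correct.
kinit -kt /path/to/user.keytab user@EXAMPLE.COM
hdfs dfs -ls /user/user
```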