Cloudera

While adding the HDFS service in Cloudera Manager, I am getting the error message "failed to create tmp directory"

Submitted by 為{幸葍}努か on 2020-01-25 09:09:06
Question: I installed Cloudera Manager 5.12.0 successfully, but when I try to add the HDFS service I get the following error message: failed to create tmp directory. The service was nonetheless installed, so I then tried to restart the cluster, but it failed to start again, saying that no DataNode roles are associated with HDFS and at least one is required. Could anyone please tell me why I am getting this error and how to solve it? I am using CentOS 7. Thanks. Source: https://stackoverflow.com/questions/51485563

Transpose dataset in Hive

Submitted by 我的未来我决定 on 2020-01-25 07:58:48
Question: I'm trying to transpose a variable in Hive, turning

Id1  Id2  Event
1    1    7
2    2    3
2    2    7

into

Id1  Id2  Event_7  Event_3
1    1    1
2    2    1        1

Following is what I have so far:

create temporary table event_trans as
select Id1, Id2, Event kv['3'] as Event_3, kv['7'] as Event_7
from (
  select Id1, Id2, collect(Event, '1') as kv
  from event1
  group by Id1, Id2
) t

Error:

Error while compiling statement: FAILED: ParseException line 1:84 missing EOF at '[' near 'kv'

I'm also interested to know how to transpose a dataset
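The query above fails to parse because `collect(Event, '1')` is not a built-in Hive function and the map-index syntax cannot appear in that position. A common alternative is a conditional-aggregation pivot; a minimal sketch against the question's `event1` table, hardcoding the event values 3 and 7 seen in the sample data:

```sql
-- One output row per (Id1, Id2); each known event value becomes a flag column.
CREATE TEMPORARY TABLE event_trans AS
SELECT Id1,
       Id2,
       MAX(CASE WHEN Event = 7 THEN 1 END) AS Event_7,
       MAX(CASE WHEN Event = 3 THEN 1 END) AS Event_3
FROM event1
GROUP BY Id1, Id2;
```

A `CASE` with no `ELSE` yields NULL for non-matching rows, and `MAX` ignores NULLs, so each flag column ends up 1 when that event occurred for the (Id1, Id2) pair and NULL otherwise, matching the blank cell in the desired output. The event values must be known in advance; a fully dynamic pivot requires generating the query or using an external tool.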

Upgrading Hive 1.1.0 to Hive 1.2.1 on a CDH 5.14.2 Cluster

Submitted by 不羁的心 on 2020-01-22 08:00:48
Reference: "CDH 5.1.5 (parcels) cluster: complete steps for upgrading Hive 1.1.0 to Hive 1.2.1, including upgrading the Hive metastore database without data loss (personally verified)"

Steps:

Download hive-1.2.1-bin and extract it:

[root@node01 ~]# cd /opt/software/
[root@node01 software]# ls
apache-hive-1.2.1-bin.tar.gz                cloudera-manager-centos7-cm5.14.2_x86_64.tar.gz  jdk-8u231-linux-x64.tar.gz  maxwell-1.22.1.tar.gz
apache-phoenix-4.14.0-cdh5.14.2-bin.tar.gz  flink-1.9.1-cdh-5.14.2.tar.gz                    kafka-manager-1.3.1.6.zip   mysql-connector-java.jar
[root@node01 software]# tar -zxf apache-hive-1.2.1-bin.tar.gz -C /opt/module/
[root@node01 software]# cd /opt/module/
[root@node01 module]# mv apache-hive-1.2.1-bin hive-1

Building Spark from source with hadoop.version=2.6.0-cdh5.16.2

Submitted by 不问归期 on 2020-01-22 01:38:11
1. Download the Spark source code from the official website.
2. Enter the source directory.
3. Edit pom.xml in that directory and add the following repository:

<repository>
  <id>cloudera</id>
  <name>cloudera Repository</name>
  <url>https://repository.cloudera.com/artifactory/cloudera-repos</url>
</repository>

4. Build:

./dev/make-distribution.sh --name 2.6.0-cdh5.16.2 --tgz -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.16.2 -Phive -Phive-thriftserver -Pmesos -Pyarn -Pkubernetes

5. Wait a while; the first build is slow because the required artifacts are downloaded from the network.
6. Extract the resulting tarball.

Source: CSDN. Author: 应龙与巨蜥. Link: https://blog.csdn.net/weixin_42209440/article/details/104060684

CDH 5.12.1 cluster installation and configuration

Submitted by ぐ巨炮叔叔 on 2020-01-16 08:46:49
CDH 5.12.1 & Kerberos installation and configuration

Environment:
Operating system: CentOS 7
JDK version: 1.8.144

Required packages (our operating system is CentOS 7, so download the following):

From http://archive.cloudera.com/cm5/cm/5/ :
cloudera-manager-centos7-cm5.12.1_x86_64.tar.gz

From http://archive.cloudera.com/cdh5/parcels/5.12.1/ :
CDH-5.12.1-1.cdh5.12.1.p0.3-el7.parcel
CDH-5.12.1-1.cdh5.12.1.p0.3-el7.parcel.sha1
manifest.json

IP address     Hostname  Role    Deployed software
192.168.1.25   node5     Master  jdk, cloudera-manager, MySql, krb5kdc, kadmin
192.168.1.21   node1     node    jdk, cloudera-manager
192.168.1.22   node2     node    jdk, cloudera-manager
192.168.1.23   node3     node    jdk, cloudera-manager
192.168.1.24   node4     node    jdk, cloudera

how to use external jars in Cloudera hadoop?

Submitted by Deadly on 2020-01-15 11:09:32
Question: I have Cloudera Hadoop version 4 installed on my cluster. It comes packaged with the Google protobuf jar, version 2.4. In my application code I use protobuf classes compiled with protobuf version 2.5, which causes unresolved compilation problems at run time. Is there a way to run the MapReduce jobs with an external jar, or am I stuck until Cloudera upgrades their service? Thanks.

Answer 1: Yes, you can run MR jobs with external jars. Be sure to add any dependencies to both the HADOOP_CLASSPATH
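The truncated answer points at HADOOP_CLASSPATH (which affects the client JVM that submits the job) and, typically, -libjars (which ships the jar to the task nodes). A minimal sketch, with a hypothetical jar name standing in for the real protobuf 2.5 jar:

```shell
# Hypothetical jar path; substitute the real location of your protobuf 2.5 jar.
EXTRA_JAR="protobuf-java-2.5.0.jar"

# 1) Make the jar visible to the client-side JVM, prepending it so it is
#    found before the bundled 2.4 copy on the classpath.
export HADOOP_CLASSPATH="$EXTRA_JAR${HADOOP_CLASSPATH:+:$HADOOP_CLASSPATH}"
echo "HADOOP_CLASSPATH=$HADOOP_CLASSPATH"

# 2) Ship the same jar to the task JVMs when submitting (not run here;
#    job jar, class, and paths are placeholders):
#    hadoop jar myjob.jar com.example.MyJob -libjars "$EXTRA_JAR" in out
```

Depending on the MapReduce version, you may also need to set mapreduce.job.user.classpath.first=true (or the older mapreduce.user.classpath.first) so the task JVMs prefer your jar over the one bundled with the distribution.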

Cloudera Hive: Where to add json-serde-1.3.7 jar file

Submitted by 半世苍凉 on 2020-01-15 09:27:50
Question: I am using Cloudera 5.8.0. First I run this command:

hive> ADD JAR /usr/lib/hive/lib/hive-serdes-1.0-SNAPSHOT.jar;
Added [/usr/lib/hive/lib/hive-serdes-1.0-SNAPSHOT.jar] to class path
Added resources: [/usr/lib/hive/lib/hive-serdes-1.0-SNAPSHOT.jar]

Then I add the json-serde-1.3.7 jar file:

hive> ADD JAR /usr/lib/hive/lib/json-serde-1.3.7-jar-with-dependencies.jar;
Added [/usr/lib/hive/lib/json-serde-1.3.7-jar-with-dependencies.jar] to class path
Added resources: [/usr/lib/hive/lib/json
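Once the SerDe jar is added for the session, tables can reference it in their ROW FORMAT clause. A minimal sketch, assuming a hypothetical tweets table and the SerDe class shipped in json-serde-1.3.7 (org.openx.data.jsonserde.JsonSerDe); the column names are illustrative:

```sql
-- Each line of the backing files is one JSON object; the SerDe maps
-- top-level JSON keys to the declared columns.
ADD JAR /usr/lib/hive/lib/json-serde-1.3.7-jar-with-dependencies.jar;

CREATE TABLE tweets (
  id          BIGINT,
  created_at  STRING,
  text        STRING
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
STORED AS TEXTFILE;
```

Note that ADD JAR only affects the current session; to make the jar permanently available to all Hive sessions, it is commonly registered via Hive's auxiliary jars path (hive.aux.jars.path), which in a Cloudera cluster is usually configured through Cloudera Manager rather than by copying files by hand.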

Maven Project in Eclipse - Initial Setup

Submitted by £可爱£侵袭症+ on 2020-01-14 05:50:10
Question: I am trying to follow some existing Hadoop tutorials that I found online. I have installed Cloudera and am using it as the setup environment for all of the work. However, when I try to create a Maven project in Eclipse, I run into issues referencing the quickstart archetype. The error I get is the following:

'Creating maven-archetype-quickstart' has encountered a problem. Could not resolve archetype org.apache.maven.archetypes:maven-archetype-quickstart:RELEASE from any of the

0486 - Migrating a Kerberized CDH 5.16.1 Cluster from Oracle JDK 1.8 to OpenJDK 1.8

Submitted by 给你一囗甜甜゛ on 2020-01-12 15:57:05
Fayson's GitHub: https://github.com/fayson/cdhproject

1 Purpose

After Oracle announced that its commercial JDK builds would no longer be free as of January 2019, Cloudera began developing its Hadoop platform on OpenJDK; see Fayson's earlier article "Java Now Costs Money: What About Hadoop?". On November 29 of this year, Cloudera's recently released CDH 5.16.1 officially added OpenJDK support; see Fayson's earlier article "0466 - New Features in CDH 5.16.1 and CM 5.16.1". This article describes how to migrate CDH from the Oracle JDK to OpenJDK. The JDK migration requires restarting the entire cluster, so you need to plan downtime for restarting every host. If your cluster has HDFS HA enabled, you can use a rolling restart instead and avoid planned downtime.

Contents:
1. JDK support in each CDH version
2. Migrating the JDK
3. Checking which JDK version is in use
4. Verifying component functionality
5. Summary

Test environment:
1. CM and CDH version 5.16.1
2. All operations performed as the root user
3. RedHat 7.4

2 JDK support in each CDH version

Cloudera Manager and CDH require that every node has a supported Java