hortonworks-data-platform

Error “No such container sandbox-hdp” when trying to install docker image on RHEL7

魔方 西西 submitted on 2021-01-29 05:12:27
Question: I am trying to get the HDP sandbox running on RHEL7. However, I am getting a "no such container sandbox-hdp" error message when I run docker-deploy-hdp30.sh.

sudo sh docker-deploy-hdp30.sh
+ registry=hortonworks
+ name=sandbox-hdp
+ version=3.0.1
+ proxyName=sandbox-proxy
+ proxyVersion=1.0
+ flavor=hdp
+ echo hdp
+ mkdir -p sandbox/proxy/conf.d
+ mkdir -p sandbox/proxy/conf.stream.d
+ docker pull hortonworks/sandbox-hdp:3.0.1
3.0.1: Pulling from hortonworks/sandbox-hdp
70799bbf2226: Pull
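
The trace stops during the image pull, which suggests the script never gets as far as creating the container, so its later docker exec/stop steps fail with "no such container". A hedged first diagnostic, assuming the script's default container name sandbox-hdp:

docker images | grep sandbox-hdp   # did the image finish pulling?
docker ps -a | grep sandbox-hdp    # was the container ever created?
docker logs sandbox-hdp            # if it was created, why did it exit?

If the container was never created, re-running the pull by hand usually surfaces the real error; the HDP 3.0 sandbox also needs a large amount of memory available to Docker (commonly cited as around 10 GB), which is a frequent silent failure on default-sized VMs.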

Hive: Sum over a specified group (HiveQL)

五迷三道 submitted on 2020-12-28 07:45:40
Question: I have a table:

key  product_code  cost
1    UK            20
1    US            10
1    EU            5
2    UK            3
2    EU            6

I would like to find the sum over all products for each group of "key" and append it to each row. For example, for key = 1, find the sum of the costs of all products (20+10+5=35) and then append the result to every row with key = 1. The end result:

key  product_code  cost  total_costs
1    UK            20    35
1    US            10    35
1    EU            5     35
2    UK            3     9
2    EU            6     9

I would prefer to do this without using a sub-join, as this would be inefficient. My
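
A windowed aggregate gives the per-key total without any self-join. A minimal HiveQL sketch, assuming the table is named my_table (a hypothetical name):

SELECT `key`, product_code, cost,
       SUM(cost) OVER (PARTITION BY `key`) AS total_costs
FROM my_table;

SUM(...) OVER (PARTITION BY `key`) computes each group's total once and repeats it on every row of that group, which matches the desired output above.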

Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)

陌路散爱 submitted on 2020-07-09 14:52:35
Question: I have installed Kafka and am doing some basic testing. I am able to create topics using the scripts provided under the Kafka-broker/bin folder, but when I try to produce messages I get the WARN below every time, and no message is produced. Please advise.

[root@node2 bin]# ./kafka-console-producer.sh --broker-list localhost:9092 --topic test_master
>testmsg1
[2019-05-15 06:25:19,092] WARN [Producer clientId=console-producer] Connection to node -1 could not be established.
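
This WARN usually means nothing is accepting connections at the address the client uses, so checking the broker side first tends to be fastest. A hedged sketch, assuming the broker config lives in config/server.properties and the machine is named node2 (both assumptions):

ss -tlnp | grep 9092   # is any process actually listening on 9092?

# In config/server.properties, the listener must match how clients connect:
listeners=PLAINTEXT://0.0.0.0:9092
advertised.listeners=PLAINTEXT://node2:9092

If the broker process is down, or advertised.listeners names a host the client cannot resolve or reach, the producer fails against "node -1" exactly like this.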

Kafka, new storage

断了今生、忘了曾经 submitted on 2020-01-16 06:46:10
Question: I'm trying to add new storage to Kafka. Here is what I have already done:

1. Added, prepared, and mounted the storage under the Linux OS
2. Added the new storage in the Kafka broker config: log.dirs: /data0/kafka-logs,/data1/kafka-logs
3. Restarted the Kafka brokers

New directories under /data1/kafka-logs have been created, but their size is:

du -csh /data1/kafka-logs/
156K /data1/kafka-logs/

And the size isn't growing; only the old /data0 is used. What am I missing? What more should I do to solve this problem? The storage is almost full, and
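
Kafka does not rebalance existing data across log dirs: a directory newly added to log.dirs only receives partitions created after the restart, so existing partitions keep filling /data0. Moving data onto /data1 requires a partition reassignment. A minimal sketch, assuming Kafka 2.2+ (for --bootstrap-server; the log_dirs field needs 1.1+), broker id 1, and a hypothetical topic mytopic:

# reassign.json -- pin the replica on broker 1 to the new log dir
{"version":1,"partitions":[
  {"topic":"mytopic","partition":0,"replicas":[1],"log_dirs":["/data1/kafka-logs"]}
]}

./kafka-reassign-partitions.sh --bootstrap-server localhost:9092 \
  --reassignment-json-file reassign.json --execute

Before KIP-113 (Kafka 1.1), the tooling could not move a partition between log dirs on the same broker at all.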

Dynamically create the version number within the Ambari's metainfo.xml file using maven build processes

限于喜欢 submitted on 2020-01-16 04:00:18
Question: I don't want to hardcode my service version into metainfo.xml. Can I avoid it?

<service>
  <name>DUMMY_APP</name>
  <displayName>My Dummy APP</displayName>
  <comment>This is a distributed app.</comment>
  <version>0.1</version>  <!-- This I don't want to hardcode; can I avoid it? -->
  <components>
    ...
  </components>
</service>

I am using Maven as my build tool.

Answer 1: This can be done by using Maven's resource filtering. Three steps are required: define a Maven property that will hold the version number
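
A minimal sketch of that filtering setup; the resource directory and the use of ${project.version} (rather than whatever custom property the answer defines) are assumptions about the project layout:

<!-- pom.xml: filter the service resources so ${...} placeholders get substituted -->
<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
      <filtering>true</filtering>
    </resource>
  </resources>
</build>

<!-- metainfo.xml: Maven replaces this at build time -->
<version>${project.version}</version>

On mvn package, Maven copies the filtered resources and substitutes ${project.version} with the version from the POM, so metainfo.xml never hardcodes it.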

Cannot connect to remote HDFS from Windows

北战南征 submitted on 2020-01-16 01:02:09
Question: I am trying to connect to a remote HDFS instance:

Configuration conf = new Configuration();
conf.set("fs.defaultFS", "hdfs://hostName:8020");
conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
FileSystem fs = FileSystem.get(conf);
RemoteIterator<LocatedFileStatus> ri = fs.listFiles(fs.getHomeDirectory(), false);
while (ri.hasNext()) {
    LocatedFileStatus lfs = ri.next();
    //log.debug(lfs.getPath().toString());
}
fs.close();

Here are my Maven dependencies:

<dependency>
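
For reference, a self-contained version of the same listing; hostName and port 8020 are the question's placeholders, and the first things to verify from Windows are that hostName resolves and that 8020 is reachable through any firewall:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocatedFileStatus;
import org.apache.hadoop.fs.RemoteIterator;

public class HdfsList {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://hostName:8020");
        conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
        // try-with-resources closes the FileSystem even if the listing fails
        try (FileSystem fs = FileSystem.get(conf)) {
            RemoteIterator<LocatedFileStatus> it = fs.listFiles(fs.getHomeDirectory(), false);
            while (it.hasNext()) {
                System.out.println(it.next().getPath());
            }
        }
    }
}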