HDFS as volume in cloudera quickstart docker

Submitted by 一曲冷凌霜 on 2019-12-05 16:54:06

You should run:

docker exec -it "YOUR CLOUDERA CONTAINER" chown -R hdfs:hadoop /var/lib/hadoop-hdfs/ 

I had the same problem and managed it by copying the entire /var/lib directory from the container to a local directory.

From a terminal, start the cloudera/quickstart container without starting all the Hadoop services:

docker run -ti cloudera/quickstart /bin/bash

In another terminal, copy the container directory to the local directory:

mkdir /local_var_lib
docker exec your_container_id tar Ccf $(dirname /var/lib) - $(basename /var/lib) | tar Cxf /local_var_lib -
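The `tar Ccf … | tar Cxf …` pipe is the usual trick for copying a directory tree through a stream while preserving its structure (`C` changes directory before packing/unpacking). A minimal local sketch of the same pattern, using throwaway example paths rather than the real container:

```shell
# Demonstrate the tar pipe used above on local paths (no docker needed).
# src/dst are hypothetical scratch directories, not from the original post.
src=$(mktemp -d)
dst=$(mktemp -d)
mkdir -p "$src/lib/hadoop-hdfs"
echo data > "$src/lib/hadoop-hdfs/blk_1"

# Pack 'lib' relative to its parent directory, unpack it into $dst:
tar Ccf "$(dirname "$src/lib")" - "$(basename "$src/lib")" | tar Cxf "$dst" -

ls "$dst/lib/hadoop-hdfs"   # -> blk_1
```

The same stream works across the `docker exec` boundary because the first `tar` writes to stdout inside the container and the second reads stdin on the host.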

Once all files are copied from the container to the local directory, stop the container and mount /local_var_lib over /var/lib. Make sure the /local_var_lib directory contains the Hadoop directories (hbase, hadoop-hdfs, oozie, mysql, etc.).
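That sanity check can be sketched as a small shell function; the directory list below is an assumption based on the services named above, so adjust it to what your image actually contains:

```shell
# Hypothetical helper: verify the copied tree contains the expected
# service directories before mounting it over /var/lib.
check_var_lib() {
  target="$1"
  missing=0
  for d in hadoop-hdfs hbase oozie mysql; do
    if [ ! -d "$target/$d" ]; then
      echo "missing: $target/$d"
      missing=1
    fi
  done
  return $missing
}

# Usage: check_var_lib /local_var_lib && echo "looks complete"
```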

Start the container:

docker run --name cloudera \
  --hostname=quickstart.cloudera \
  --privileged=true \
  -td \
  -p 2181:2181 \
  -p 8888:8888 \
  -p 7180:7180 \
  -p 6680:80 \
  -p 7187:7187 \
  -p 8079:8079 \
  -p 8080:8080 \
  -p 8085:8085 \
  -p 8400:8400 \
  -p 8161:8161 \
  -p 9090:9090 \
  -p 9095:9095 \
  -p 60000:60000 \
  -p 60010:60010 \
  -p 60020:60020 \
  -p 60030:60030 \
  -v /local_var_lib:/var/lib \
  cloudera/quickstart /usr/bin/docker-quickstart
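The long run of `-p` flags can also be generated from a port list instead of typed out; a small sketch (every mapping above is identical on both sides except 6680:80):

```shell
# Build the -p arguments for the docker run command above from a port list.
ports="2181 8888 7180 7187 8079 8080 8085 8400 8161 9090 9095 60000 60010 60020 60030"
port_args=""
for p in $ports; do
  port_args="$port_args -p $p:$p"
done
port_args="$port_args -p 6680:80"   # the one asymmetric mapping
echo "$port_args"
# Then: docker run ... $port_args -v /local_var_lib:/var/lib cloudera/quickstart ...
```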