Hadoop: requires root's password after entering “start-all.sh”

Backend · unresolved · 6 answers · 1319 views
执念已碎 asked on 2020-12-14 19:08

I have installed Hadoop and SSH on my laptop. "ssh localhost" works fine. After formatting HDFS, I tried to start Hadoop.

munichong@GrindPad:~$ sudo /usr/         


        
6 Answers
  • 2020-12-14 19:45

    It seems you have logged in as root and are invoking start-all.sh.

    Instead, log in as the owner of the $SPARK_HOME directory and invoke Spark's start-all.sh.

    (or)

    If user hadoop owns the $SPARK_HOME directory and you are currently logged in as root, the command would be as follows:

    sudo -u hadoop start-all.sh
    

    Assumptions:
    a) PATH includes the directory $SPARK_HOME/bin
    b) Certificate-based authentication is configured for user hadoop
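
    A minimal sketch of that route, assuming $SPARK_HOME is set and user hadoop owns the installation (both assumptions, not stated in the question):

        ls -ld "$SPARK_HOME"            # confirm that hadoop owns the directory
        sudo -u hadoop start-all.sh     # run the script as that user (relies on assumption a)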

  • 2020-12-14 19:47

    I ran into the same problem. As Amar said, if you run it with sudo, Hadoop will ask for the root password. If you don't have a root password, you can set one up using

     sudo passwd
    

    The URL below gives more detail about user management.

    https://help.ubuntu.com/12.04/serverguide/user-management.html
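
    If you would rather run Hadoop under a dedicated account instead of root, the standard Ubuntu tools covered in that guide look like this (the hduser name is only an example, not from the question):

        sudo adduser hduser              # create a dedicated user for Hadoop
        sudo usermod -aG sudo hduser     # optionally grant it sudo rights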

  • 2020-12-14 19:56

    Create and set up SSH certificates

    Hadoop requires SSH access to manage its nodes, i.e., remote machines plus our local machine. For our single-node setup of Hadoop, we therefore need to configure SSH access to localhost.

    So we need to have SSH up and running on our machine, configured to allow SSH public-key authentication.

    Hadoop uses SSH to access its nodes, which would normally require the user to enter a password. This requirement can be eliminated by creating and setting up SSH certificates with the commands sketched below. If asked for a filename, just leave it blank and press Enter to continue.
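
    A sketch of the standard sequence being described, assuming the default key location under ~/.ssh:

        ssh-keygen -t rsa -P ""                            # generate a key pair with an empty passphrase
        cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys    # authorize the key for logins to localhost
        ssh localhost                                      # should now log in without a password prompt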

    check this site

  • 2020-12-14 20:01

    Solution:

    1) Generate an SSH key without a password:

    $ ssh-keygen -t rsa -P ""
    

    2) Append id_rsa.pub to authorized_keys (see the permissions note after step 5 if SSH still prompts for a password):

    $  cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
    

    3) Test SSH to localhost:

    $ ssh localhost
    

    4) Now go to the Hadoop sbin directory and start Hadoop:

    $ ./start-all.sh
    This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
    Starting namenodes on [localhost]
    localhost: starting namenode, logging to /home/amtex/Documents/installed/hadoop/logs/hadoop-amtex-namenode-amtex-desktop.out
    localhost: starting datanode, logging to /home/amtex/Documents/installed/hadoop/logs/hadoop-amtex-datanode-amtex-desktop.out
    Starting secondary namenodes [0.0.0.0]
    0.0.0.0: starting secondarynamenode, logging to /home/amtex/Documents/installed/hadoop/logs/hadoop-amtex-secondarynamenode-amtex-desktop.out
    starting yarn daemons
    starting resourcemanager, logging to /home/amtex/Documents/installed/hadoop/logs/yarn-amtex-resourcemanager-amtex-desktop.out
    localhost: starting nodemanager, logging to /home/amtex/Documents/installed/hadoop/logs/yarn-amtex-nodemanager-amtex-desktop.out
    

    5) No password is asked for:

    $ jps 
    12373 Jps
    11823 SecondaryNameNode
    11643 DataNode
    12278 NodeManager
    11974 ResourceManager
    11499 NameNode
    
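    If SSH still prompts for a password after step 2, file permissions are the usual culprit; these are standard OpenSSH requirements (an addition, not part of the original answer):

        chmod 700 ~/.ssh                    # sshd ignores keys if the directory is too open
        chmod 600 ~/.ssh/authorized_keys    # the key file must not be group- or world-writable

    To double-check that the prompt is really gone, BatchMode makes ssh fail instead of asking (also an addition):

        ssh -o BatchMode=yes localhost true && echo "passwordless SSH OK"
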
  • 2020-12-14 20:06

    As in the case above, munichong is the user (munichong@GrindPad).

    1. In my case: log in as hduser

    2. First, remove the directory: sudo rm -rf ~/.ssh

    3. Re-generate the ~/.ssh directory with the default settings:

      [hduser@localhost ~]$ ssh-keygen
      
    4. Copy the content of id_rsa.pub into the authorized_keys file created by the command above:

      [hduser@localhost ~]$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
      
    5. [hduser@localhost ~]$ chmod 750 ~/.ssh/authorized_keys

    6. [hduser@localhost ~]$ ssh localhost

      The authenticity of host 'localhost (127.0.0.1)' can't be established. RSA key fingerprint is 04:e8:80:64:dc:71:b5:2f:c0:d9:28:86:1f:61:60:8a. Are you sure you want to continue connecting (yes/no)? yes

      Warning: Permanently added 'localhost' (RSA) to the list of known hosts. Last login: Mon Jan 4 14:31:05 2016 from localhost.localdomain

    7. [hduser@localhost ~]$ jps
      18531 Jps

    8. [hduser@localhost ~]$ start-all.sh

    9. All daemons start.

    Note: Sometimes other problems occur because of the log files; in that case, remove only the .out files from /usr/local/hadoop/logs/.
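
    For example, following the path in the note (the daemons recreate these files on the next start):

      rm /usr/local/hadoop/logs/*.out    # clear stale .out files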

  • 2020-12-14 20:08

    Log in as the superuser (root):

    :~ su
    
    Password:
    

    Give ownership to your login user:

    :~ sudo chown -R <login user> /usr/local/hadoop/
    

    For your example, the login user is munichong:

    HADOOP_HOME = /usr/local/hadoop/
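
    Concretely, substituting the questioner's username and path into the chown command above:

        sudo chown -R munichong /usr/local/hadoop/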

    0 讨论(0)