[root@master ~]# cd /usr/hadoop/hadoop-2.9.2
[root@master hadoop-2.9.2]# sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [master]
The authenticity of host 'master (192.168.59.11)' can't be established.
ECDSA key fingerprint is SHA256:vSl5FOcIM4OX19urpL/iIO06fD1xvseqMHRjDNv3RS8.
ECDSA key fingerprint is MD5:51:a0:8f:08:3b:85:60:62:91:59:dc:d5:7e:9c:b8:3c.
Are you sure you want to continue connecting (yes/no)? ^Cmaster: Host key verification failed.
bash: /usr/hadoop/hadoop-2.9.2/sbin/hadoop-daemon.sh: No such file or directory
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is SHA256:vSl5FOcIM4OX19urpL/iIO06fD1xvseqMHRjDNv3RS8.
ECDSA key fingerprint is MD5:51:a0:8f:08:3b:85:60:62:91:59:dc:d5:7e:9c:b8:3c.
Are you sure you want to continue connecting (yes/no)? yes
0.0.0.0: Warning: Permanently added '0.0.0.0' (ECDSA) to the list of known hosts.
root@0.0.0.0's password:
0.0.0.0: starting secondarynamenode, logging to /usr/hadoop/hadoop-2.9.2/logs/hadoop-root-secondarynamenode-master.out
starting yarn daemons
starting resourcemanager, logging to /usr/hadoop/hadoop-2.9.2/logs/yarn-root-resourcemanager-master.out
bash: /usr/hadoop/hadoop-2.9.2/sbin/yarn-daemon.sh: No such file or directory
[root@master hadoop-2.9.2]# sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [master]
The authenticity of host 'master (192.168.59.11)' can't be established.
ECDSA key fingerprint is SHA256:vSl5FOcIM4OX19urpL/iIO06fD1xvseqMHRjDNv3RS8.
ECDSA key fingerprint is MD5:51:a0:8f:08:3b:85:60:62:91:59:dc:d5:7e:9c:b8:3c.
Are you sure you want to continue connecting (yes/no)? yes
master: Warning: Permanently added 'master,192.168.59.11' (ECDSA) to the list of known hosts.
root@master's password:
master: starting namenode, logging to /usr/hadoop/hadoop-2.9.2/logs/hadoop-root-namenode-master.out
bash: /usr/hadoop/hadoop-2.9.2/sbin/hadoop-daemon.sh: No such file or directory
Starting secondary namenodes [0.0.0.0]
root@0.0.0.0's password:
0.0.0.0: Authentication failed.
starting yarn daemons
resourcemanager running as process 11366. Stop it first.
bash: /usr/hadoop/hadoop-2.9.2/sbin/yarn-daemon.sh: No such file or directory
CodePudding user response:
If you have formatted the NameNode multiple times, you need to delete the folders where Hadoop stores its metadata; after deleting them it is fine to reformat.
CodePudding user response:
"This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh"
1. start-all.sh is out of date; use sbin/start-dfs.sh and sbin/start-yarn.sh instead.
2. As for the formatting problem: delete the three directories -- data, hdfs, and tmp -- then format again. Remember to finish deleting before you format. The paths of these directories are set in your configuration files.
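The clean-and-reformat sequence described above can be sketched as follows. The metadata paths here (/usr/hadoop/tmp, /usr/hadoop/hdfs/name, /usr/hadoop/hdfs/data) are assumptions taken from common tutorials, not from this cluster; check hadoop.tmp.dir in core-site.xml and dfs.namenode.name.dir / dfs.datanode.data.dir in hdfs-site.xml for the real locations.

```shell
HADOOP_HOME=/usr/hadoop/hadoop-2.9.2   # install path from the log above

"$HADOOP_HOME"/sbin/stop-dfs.sh        # stop HDFS daemons before touching metadata
"$HADOOP_HOME"/sbin/stop-yarn.sh       # stop YARN daemons too

# Delete the old metadata directories FIRST (assumed paths -- verify in your configs)...
rm -rf /usr/hadoop/tmp /usr/hadoop/hdfs/name /usr/hadoop/hdfs/data

# ...THEN reformat the NameNode.
"$HADOOP_HOME"/bin/hdfs namenode -format

# Start with the non-deprecated entry points instead of start-all.sh:
"$HADOOP_HOME"/sbin/start-dfs.sh
"$HADOOP_HOME"/sbin/start-yarn.sh
```

After starting, `jps` on the master should list NameNode, SecondaryNameNode, and ResourceManager; if any are missing, check the corresponding .log file under $HADOOP_HOME/logs.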
CodePudding user response:
Try the approach described in https://blog.csdn.net/weixin_46028577/article/details/106292265
CodePudding user response:
The authenticity of host 'master (192.168.59.11)' can't be established.
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
These messages may point to a network or access problem between the machines; use ping to check connectivity.
If the configuration is wrong, you can refer to https://blog.csdn.net/dzh284616172/article/details/105980074
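The repeated password prompts and "authenticity of host" questions in the log also suggest that passwordless SSH is not set up, which the start scripts need in order to launch daemons on each node. A minimal sketch, run as root on the master (the hostnames come from the log above; add an ssh-copy-id line for every worker in your slaves file):

```shell
# Generate a key pair with an empty passphrase, if one doesn't exist yet.
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# Authorize the key on every node the scripts ssh into -- including the
# master itself and localhost (the secondary namenode is started via 0.0.0.0).
ssh-copy-id root@master
ssh-copy-id root@localhost

# Verify: this should log in and exit without asking for a password.
ssh root@master exit
```

This also records each host's key in ~/.ssh/known_hosts on first connect, so the "Are you sure you want to continue connecting" prompt stops interrupting the start scripts.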