I am trying to install Hadoop on macOS following a set of instructions. At the step sbin/start-dfs.sh I run into a problem.
My output:
Starting namenodes on [localhost]
Starting datanodes
localhost: datanode is running as process 26210. Stop it first.
Starting secondary namenodes [https://account.jetbrains.com:443]
sed: 1: "s/^/https://account.jet ...": bad flag in substitute command: '/'
What can be done about it? Any help would be much appreciated!
My hadoop-env.sh:
#export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk-13.0.1.jdk/Contents/Home
I generated an SSH key:
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys
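(As a sanity check, not part of the original steps: passwordless SSH to localhost, which start-dfs.sh relies on, can be tested like this.)
ssh localhost
# should log in without asking for a password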
/etc/hosts:
127.0.0.1 localhost
255.255.255.255 broadcasthost
::1 localhost
0.0.0.0 https://account.jetbrains.com:443
1.2.3.4 account.jetbrains.com
1.2.3.4 http://www.jetbrains.com
1.2.3.4 www-weighted.jetbrains.com
0.0.0.0 account.jetbrains.com
Maybe there is an error in my /etc/hosts?
CodePudding user response:
By default, Hadoop's start scripts use 0.0.0.0 as the secondary namenode address and expect it to resolve to a local connection. Your /etc/hosts maps 0.0.0.0 to the string https://account.jetbrains.com:443, so start-dfs.sh picks that up as a hostname and splices it into a sed s/^/.../ substitution; the slashes inside the URL are what trigger the "bad flag in substitute command: '/'" error.
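You can reproduce the sed failure on its own; any "hostname" containing slashes breaks the substitution (this is a standalone demonstration, not the exact line from the Hadoop script):
echo test | sed "s/^/https://account.jetbrains.com:443: /"
# fails with: bad flag in substitute command: '/'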
You appear to have modified your hosts file to bypass JetBrains account checks. Please revert it to:
127.0.0.1 localhost
255.255.255.255 broadcasthost
::1 localhost
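After reverting, also deal with the "datanode is running as process 26210. Stop it first" message: stop the leftover daemons from the earlier attempt, then start fresh and verify (sbin/stop-dfs.sh and jps are standard Hadoop/JDK commands):
sbin/stop-dfs.sh
sbin/start-dfs.sh
jps
# jps should now list NameNode, DataNode and SecondaryNameNode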