I know that 'slave' was renamed to 'worker' in the Spark configuration.
But when I downloaded Spark 3.2.0, I saw that both start-slave.sh and start-worker.sh exist under the 'sbin' folder.
Do both scripts do the same thing?
CodePudding user response:
These scripts do the same thing: slave has been renamed to worker, see
https://issues.apache.org/jira/browse/SPARK-32004
I assume the old name was kept for backward-compatibility purposes.
Compare 3.2.0 - https://spark.apache.org/docs/3.2.0/spark-standalone.html#starting-a-cluster-manually with 3.0.1 - https://spark.apache.org/docs/3.0.1/spark-standalone.html#starting-a-cluster-manually
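The usual way such a rename is kept backward-compatible is to turn the old script into a thin wrapper that prints a deprecation warning and forwards all arguments to the new script. Below is a self-contained sketch of that pattern (an illustration of the general technique, not the actual Spark source); the file names mirror the two sbin scripts, and the spark://master:7077 argument is just a placeholder:

```shell
#!/usr/bin/env bash
# Demo of the deprecated-wrapper pattern: the old script name warns,
# then delegates to the renamed script with the same arguments.
set -euo pipefail
dir="$(mktemp -d)"

# Stand-in for the renamed script (sbin/start-worker.sh).
cat > "$dir/start-worker.sh" <<'EOF'
#!/usr/bin/env bash
echo "worker starting with args: $*"
EOF

# Stand-in for the backward-compatible wrapper (sbin/start-slave.sh).
cat > "$dir/start-slave.sh" <<'EOF'
#!/usr/bin/env bash
echo "WARNING: start-slave.sh is deprecated, use start-worker.sh instead" >&2
exec "$(dirname "$0")/start-worker.sh" "$@"
EOF

chmod +x "$dir/start-worker.sh" "$dir/start-slave.sh"

# Calling the old name warns on stderr and runs the new script.
"$dir/start-slave.sh" spark://master:7077
```

Because the wrapper uses exec "$@", the old name accepts exactly the same options as the new one, which is why both scripts behave identically.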