I created the project with Maven and imported only the scala-sdk-2.11.8 library; I did not import anything else, not even the spark-assembly-*-hadoop*.jar packages that online guides mention. With setMaster("local") it runs fine, but when I change the argument to setMaster("spark://172.17.0.2:7077") I get the following error:
18/10/13 09:15:54 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 172.17.0.2:7077
org.apache.spark.SparkException: Exception thrown in awaitResult
My Spark actually runs in a Docker environment, and its address there is 172.17.0.2.
Can anyone tell me exactly what went wrong?
CodePudding user response:
Normally, shouldn't you build the jar package first and then run it on the cluster through a submit script?
CodePudding user response:
Did you actually start your Spark cluster?
CodePudding user response:
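A minimal sketch of what the driver side of such a setup could look like. This is not the asker's code: the object name, the driver-host IP, and the assumption that the host can route to the container are all hypothetical. The key point is that with a standalone master the driver must also be reachable *from* the executors, which commonly fails when the master runs inside Docker:

```scala
// Hypothetical minimal driver for a standalone master inside Docker.
// Assumes 172.17.0.2:7077 is the master (from the log above) and that
// 172.17.0.1 is an address the containers can route back to (assumption).
import org.apache.spark.{SparkConf, SparkContext}

object ConnectTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("ConnectTest")
      .setMaster("spark://172.17.0.2:7077")
      // Executors connect back to the driver; advertise an address
      // that is reachable from inside the containers.
      .set("spark.driver.host", "172.17.0.1") // hypothetical host-side IP

    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 10).sum())
    sc.stop()
  }
}
```

If the driver runs outside Docker, port 7077 (and the driver/executor callback ports) must be published or the driver must join the same Docker network; otherwise "Failed to connect to master" is exactly the symptom you would expect.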