Using IDEA to develop a Spark program: setMaster("local") works fine, but connecting to a remote master fails

Time:10-12

Problem description:
I created the project with Maven and imported only the scala-sdk-2.11.8 library; I did not import anything else, not even the spark-assembly-*-hadoop*.jar packages mentioned online. With setMaster("local") the program runs without problems, but when I change the argument to setMaster("spark://172.17.0.2:7077"), the following error appears:

18/10/13 09:15:54 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 172.17.0.2:7077
org.apache.spark.SparkException: Exception thrown in awaitResult

My Spark really does run in a Docker environment, and its address is 172.17.0.2.
Could someone please tell me what exactly went wrong?
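For context, here is a minimal sketch of the two configurations being compared (the object name, app name, jar path, and driver host are placeholders, not from the thread; the IP is the Docker bridge address from the log). When the driver runs inside IDEA on the host, the executors at 172.17.0.2 must also be able to connect back to the driver, which is a common cause of this kind of failure:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object RemoteMasterSketch {
  def main(args: Array[String]): Unit = {
    // Works: driver and executors all run inside the IDEA JVM.
    // val conf = new SparkConf().setAppName("demo").setMaster("local[*]")

    // Fails from the IDE unless the cluster can reach the driver:
    val conf = new SparkConf()
      .setAppName("demo")
      .setMaster("spark://172.17.0.2:7077")
      // Hypothetical workarounds often needed when the master runs in Docker:
      .set("spark.driver.host", "HOST_IP_REACHABLE_FROM_CONTAINER")
      .setJars(Seq("target/demo.jar")) // ship the application classes to the executors

    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 10).sum())
    sc.stop()
  }
}
```

This is only a sketch under those assumptions; as the replies below note, the supported route is usually to package the jar and submit it rather than pointing SparkContext at the cluster from the IDE.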

CodePudding user response:

By rights, shouldn't you build the jar package first and then run it on the cluster through the submit script?

CodePudding user response:

Have you started your Spark cluster?

CodePudding user response:

reference 1st floor qq_41273754 response:
By rights, shouldn't you build the jar package first and then run it on the cluster through the submit script?
If I package it into a jar and put it on the cluster, there is no problem, but running directly from IDEA is more convenient, ha ha, packaging every time is a hassle.

CodePudding user response:

refer to the second floor piduzi response:
Have you started your Spark cluster?
Yes, the Spark cluster has been started.

CodePudding user response:

I hit the same thing, though not in Eclipse. The error roughly means: "you cannot point straight at the cluster through SparkContext settings this way; use spark-submit." It is a polite way of saying that apart from "local", setting the master like this is not supported; for the other modes you should package a jar and submit it with spark-submit.
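As that reply suggests, the supported way to run against a standalone master is to package the application and hand it to spark-submit. A hedged sketch of that workflow (the main class and jar names are placeholders; the master URL is the one from the question):

```shell
# Build the application jar with Maven.
mvn package

# Submit it to the standalone master instead of calling setMaster in code.
spark-submit \
  --class com.example.Demo \
  --master spark://172.17.0.2:7077 \
  target/demo-1.0.jar
```

With this approach the code can leave the master unset (or read it from the environment), so the same jar runs locally and on the cluster without edits.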