How to submit a Java application to a Spark standalone cluster

Time:09-24

How is an application usually submitted to a Spark cluster?

CodePudding user response:

${SPARK_HOME}/bin/spark-submit --class <main class name> --deploy-mode client|cluster --master spark://<masterIP:port> <jar path> [application args]
See the official documentation.

CodePudding user response:

Submitting this way is not very convenient. How can I submit from inside an application?

CodePudding user response:

Like this:

String tmp = "/home/hadoop/spark-spring-0.0.1-SNAPSHOT.jar";
String[] args = new String[] {
    "--master", "spark://master:7077",
    "--deploy-mode", "cluster",
    "--name", "test Java submit jobs to spark",
    "--class", "br.com.spark.JavaWordCount",
    "--executor-memory", "512m",
    tmp
};
SparkSubmit.main(args);

But it doesn't run for me; this is how I submit from my local development environment.

CodePudding user response:


Sorry, this is beyond my ability; I can't help you.




CodePudding user response:

https://github.com/spark-jobserver/spark-jobserver

Submit your Java Spark job through its REST API.
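As a sketch of the REST approach: spark-jobserver exposes endpoints to upload a jar and then start a job (per its README, POST /jars/<appName> to upload, then POST /jobs?appName=...&classPath=...). The server address, app name, and class below are placeholder assumptions for illustration; the actual request is left commented out since it needs a running jobserver.

```java
import java.io.IOException;

public class JobServerSubmit {
    // Hypothetical jobserver location; adjust to your deployment.
    static final String JOBSERVER = "http://localhost:8090";

    // Compose the job-start URL (POST /jobs?appName=...&classPath=...).
    static String jobUrl(String appName, String classPath) {
        return JOBSERVER + "/jobs?appName=" + appName + "&classPath=" + classPath;
    }

    public static void main(String[] args) throws IOException {
        String url = jobUrl("wordcount", "br.com.spark.JavaWordCount");
        System.out.println("Would POST to: " + url);
        // With a live jobserver (jar already uploaded via POST /jars/wordcount),
        // the submission itself would be a plain HTTP POST, e.g.:
        // java.net.HttpURLConnection c =
        //     (java.net.HttpURLConnection) new java.net.URL(url).openConnection();
        // c.setRequestMethod("POST");
        // c.setDoOutput(true);
        // c.getOutputStream().close();
        // System.out.println("HTTP " + c.getResponseCode());
    }
}
```

Note that jobserver jobs are expected to implement its job interface rather than being arbitrary spark-submit jars; check the project README for the exact contract.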

CodePudding user response:

To submit from inside an application, you can use the org.apache.spark.launcher.SparkLauncher class: package your application into a jar, then launch the submission to Spark from Java code. You can search for the specifics.
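A minimal sketch of the SparkLauncher approach, using the jar path, class name, and master URL from the earlier post (the Spark home path is a placeholder; the spark-launcher artifact must be on the classpath, and SparkLauncher spawns spark-submit as a child process, so a local Spark installation is required):

```java
import org.apache.spark.launcher.SparkLauncher;

public class LauncherExample {
    public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
                .setSparkHome("/opt/spark")  // placeholder: local Spark install
                .setAppResource("/home/hadoop/spark-spring-0.0.1-SNAPSHOT.jar")
                .setMainClass("br.com.spark.JavaWordCount")
                .setMaster("spark://master:7077")
                .setDeployMode("cluster")
                .setConf(SparkLauncher.EXECUTOR_MEMORY, "512m")
                .launch();
        // Wait for the spawned spark-submit process to finish.
        int exit = spark.waitFor();
        System.out.println("spark-submit exited with " + exit);
    }
}
```

Unlike calling SparkSubmit.main directly, SparkLauncher is a documented public API, which makes it the safer choice for in-application submission.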