Fail Spark job if not all resources are allocated


Does Spark or YARN have any flag to fail a job fast if not all requested resources can be allocated?

For example, if I run

spark-submit --class org.apache.spark.examples.SparkPi \
    --master yarn --deploy-mode client \
    --num-executors 7 \
    --driver-memory 512m \
    --executor-memory 4g \
    --executor-cores 1 \
    /usr/hdp/current/spark2-client/examples/jars/spark-examples_*.jar 1000

Right now, if Spark can allocate only 5 executors, the job just runs with 5. Can we make it run only with all 7, and fail otherwise?

CodePudding user response:

You can set the spark.dynamicAllocation.minExecutors configuration in your job. For it to take effect you also need to set spark.dynamicAllocation.enabled=true; both settings are detailed in the Spark dynamic allocation documentation.
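A minimal sketch of how that could look for the job above, assuming Spark 2 on YARN (dynamic allocation on YARN also requires the external shuffle service, hence the extra spark.shuffle.service.enabled setting):

spark-submit --class org.apache.spark.examples.SparkPi \
    --master yarn --deploy-mode client \
    --conf spark.dynamicAllocation.enabled=true \
    --conf spark.dynamicAllocation.minExecutors=7 \
    --conf spark.shuffle.service.enabled=true \
    --driver-memory 512m \
    --executor-memory 4g \
    --executor-cores 1 \
    /usr/hdp/current/spark2-client/examples/jars/spark-examples_*.jar 1000

--num-executors is omitted here because, with dynamic allocation enabled, Spark manages the executor count itself; spark.dynamicAllocation.maxExecutors can cap it if needed. Note that minExecutors is the floor Spark tries to maintain rather than a hard fail-fast check, so the job may still start while executors are being acquired.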
