Spark-operator on EKS Apache spark failed to create temp directory


I am trying to deploy a simple spark-pi.yaml onto AWS EKS using the spark-operator. The spark-operator itself deployed successfully.

The deployment YAML I am using is based on the spark-operator example (spark-pi.yaml).
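For reference, the volume section in the upstream spark-pi.yaml example looks roughly like this (copied from that example; details may vary by operator version):

# hostPath mounts a directory from the node's filesystem into the pod
volumes:
  - name: "test-volume"
    hostPath:
      path: "/tmp"
      type: Directory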

I am getting the following error when I run helm install:

Events:
  Type     Reason                            Age   From            Message
  ----     ------                            ----  ----            -------
  Normal   SparkApplicationAdded             8s    spark-operator  SparkApplication spark-pi was added, enqueuing it for submission
  Warning  SparkApplicationSubmissionFailed  5s    spark-operator  failed to submit SparkApplication spark-pi: failed to run spark-submit for SparkApplication spark-operator/spark-pi: WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/opt/spark/jars/spark-unsafe_2.12-3.1.1.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Exception in thread "main" java.io.IOException: Failed to create a temp directory (under /tmp) after 10 attempts!
  at org.apache.spark.util.Utils$.createDirectory(Utils.scala:305)
  at org.apache.spark.util.Utils$.createTempDir(Utils.scala:325)
  at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:343)
  at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
  at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
  at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
  at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
  at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1030)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1039)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

How can I resolve this issue?

CodePudding user response:

Try updating the spark app spec (spark-pi.yaml, line 35) and re-submitting:

volumes:
- name: "test-volume"
  emptyDir: {}
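That swaps the example's hostPath volume for a pod-local emptyDir, which Kubernetes provisions fresh for every pod. For context, here is a fuller sketch of how the volume is wired into the driver and executor specs (field names follow the SparkApplication CRD; the volume name and mountPath are taken from the spark-pi example and may differ in your manifest):

volumes:
  - name: "test-volume"
    # emptyDir is allocated per pod, so Spark can always create its
    # temp directory regardless of the node's host filesystem
    emptyDir: {}
driver:
  volumeMounts:
    - name: "test-volume"
      mountPath: "/tmp"
executor:
  volumeMounts:
    - name: "test-volume"
      mountPath: "/tmp"

Then delete the failed application and re-submit it, e.g. kubectl delete -f spark-pi.yaml && kubectl apply -f spark-pi.yaml.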

CodePudding user response:

This is going to be hard to debug, but based on my experience, a couple of things could be going on here:

  1. I see your executor doesn't have its service account defined. You may want to define it explicitly (see the sketch below).
  2. There may not be enough space in your volume to create the /tmp directory. You may want to double-check your volume size.
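A minimal sketch covering both points (the "spark" service-account name and the 2Gi limit are placeholders — substitute whatever your cluster actually uses):

driver:
  serviceAccount: spark        # assumed name of the RBAC-enabled service account
executor:
  serviceAccount: spark        # explicitly set on the executor too (point 1)
volumes:
  - name: "test-volume"
    emptyDir:
      sizeLimit: "2Gi"         # hypothetical cap; make sure it leaves room for /tmp (point 2)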