Why are task and stage numbers decimal numbers - Apache Spark


I'm trying to understand an error I get while using Spark (on EMR). In the step's stderr there is:

TaskSetManager: Lost task 1435.6 in stage 152.0 (TID 102906) on ip-172-11-47-9.ec2.internal, executor 203: org.apache.hadoop.fs.FileAlreadyExistsException (File already exists:s3://<path>) [duplicate 5]

You can see that the task and stage numbers are decimal numbers, and the digit to the right of the dot isn't always zero. I see this a lot; the line above is just one example. What does the number to the right of the dot indicate?

CodePudding user response:

The number after the dot is the attempt number, starting from 0. In `task 1435.6 in stage 152.0`, 1435 is the task's index within the stage and 6 is that task's attempt number (it has already failed and been retried), while 152 is the stage ID and 0 is the stage attempt number. The TID in parentheses (102906) is an ID that is unique to that specific task attempt within the application.
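You can see where these numbers come from at runtime via `TaskContext`. Here is a minimal sketch (assuming an existing `SparkSession` named `spark`) that prints, from inside each task, the same identifiers that appear in the log line:

```scala
import org.apache.spark.TaskContext

// Each task prints "stage <stageId>.<stageAttempt>, task <index>.<attempt> (TID <id>)",
// mirroring the format used by TaskSetManager in the driver logs.
spark.sparkContext.parallelize(1 to 4, numSlices = 4).foreachPartition { _ =>
  val tc = TaskContext.get()
  println(
    s"stage ${tc.stageId()}.${tc.stageAttemptNumber()}, " +
    s"task ${tc.partitionId()}.${tc.attemptNumber()} (TID ${tc.taskAttemptId()})"
  )
}
```

On a healthy first run every attempt number is 0; you only see nonzero values after task failures (retries of a single task) or stage failures (re-execution of a whole stage, e.g. after fetch failures).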
