Spark job fails on exit: can't delete files under C:\Users\<user>\AppData\Local\Temp\

Time:09-18

17/02/17 09:04:50 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\tend\AppData\Local\Temp\spark-70484fc4-167d-48fa-a8f6-54db9752402e\userFiles-27a65cc7-817f-4476-a2a2-d7b6cc158967
java.io.IOException: Failed to delete: C:\Users\tend\AppData\Local\Temp\spark-70484fc4-167d-48fa-a8f6-54db9752402e\userFiles-27a65cc7-817f-4476-a2a2-d7b6cc158967
	at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1010)
	at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
	at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
	at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
	at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1951)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)

CodePudding user response:

Spark commonly hits this problem on Windows. If you don't want to see the message, set the log level for this logger in log4j.properties to FATAL.
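A minimal sketch of what that log4j.properties entry could look like. The logger name is taken from the class in the error above (org.apache.spark.util.ShutdownHookManager); the file location is an assumption and depends on how your project packages resources:

```properties
# Assumed location: src/main/resources/log4j.properties (or conf/log4j.properties)
# Suppress only the shutdown-hook deletion error; everything else logs as before.
log4j.logger.org.apache.spark.util.ShutdownHookManager=FATAL
```

This hides the message rather than fixing the underlying Windows file-lock behavior, but the leftover temp directories are harmless apart from disk usage.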

CodePudding user response:

Has the original poster solved this problem?

CodePudding user response:

This is unavoidable on Windows.

CodePudding user response:

The files at the path shown in the error were not deleted, so you need to delete them manually.
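The manual cleanup above can be scripted. This is a best-effort sketch, not part of Spark itself: the function name and the use of the TEMP environment variable are my assumptions, and it only targets the spark-* directories Spark leaves behind:

```python
import glob
import os
import shutil


def clean_spark_temp(temp_dir):
    """Remove leftover spark-* directories under temp_dir; return the paths removed."""
    removed = []
    for path in glob.glob(os.path.join(temp_dir, "spark-*")):
        if os.path.isdir(path):
            # ignore_errors mirrors Spark's own best-effort delete: files still
            # locked by a running JVM are simply skipped.
            shutil.rmtree(path, ignore_errors=True)
            removed.append(path)
    return removed


if __name__ == "__main__":
    # On Windows, TEMP points at C:\Users\<user>\AppData\Local\Temp.
    temp_dir = os.environ.get("TEMP", "/tmp")
    for path in clean_spark_temp(temp_dir):
        print("removed", path)
```

Run it after the Spark JVM has exited; while the job is still running, Windows keeps the jar/userFiles handles open and the deletes will silently skip those files.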

CodePudding user response:

You can try calling cache() on the data before accessing it.

CodePudding user response:

+1, supporting this.