spark-submit fails to run the jar package: cannot create the work directory. How to solve?

Time:10-07

The jar package runs fine locally, but submitting it to the cluster fails:
 
[hadoop@server1 sbin]$ ./start-master.sh
starting org.apache.spark.deploy.master.Master, logging to /home/hadoop/local/spark-3.0.0/spark-3.0.0-bin-hadoop3.2/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-server1.out
[hadoop@server1 sbin]$ ./start-slaves.sh spark://server1:7077
server2: starting org.apache.spark.deploy.worker.Worker, logging to /home/hadoop/local/spark-3.0.0/spark-3.0.0-bin-hadoop3.2/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-server2.out
[hadoop@server1 sbin]$ spark-submit --name testSpark --class org.UserData.Clean --master spark://server1:7077 --deploy-mode cluster /home/hadoop/test/test_spark-1.0-SNAPSHOT-jar-with-dependencies.jar
2020-07-25 15:34:14,828 ERROR deploy.ClientEndpoint: Exception from cluster was: java.io.IOException: Failed to create directory /home/hadoop/local/spark-3.0.0/spark-3.0.0-bin-hadoop3.2/work/driver-20200725153409-0000
java.io.IOException: Failed to create directory /home/hadoop/local/spark-3.0.0/spark-3.0.0-bin-hadoop3.2/work/driver-20200725153409-0000
	at org.apache.spark.deploy.worker.DriverRunner.createWorkingDirectory(DriverRunner.scala:145)
	at org.apache.spark.deploy.worker.DriverRunner.prepareAndRunDriver(DriverRunner.scala:176)
	at org.apache.spark.deploy.worker.DriverRunner$$anon$2.run(DriverRunner.scala:96)
[hadoop@server1 sbin]$


The work folder's permissions are 777, and the same user is used on both machines.

CodePudding user response:

Specify the path in the code.
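Since the driver runs on a worker node in cluster deploy mode, the directory must be creatable on the worker (server2), not on the machine where spark-submit is run. A minimal sketch of the usual fix, assuming the paths shown in the log above (adjust SPARK_HOME to your install; the SPARK_WORKER_DIR location is a hypothetical example):

```shell
# Run this on the worker node (server2), as the user that starts the worker.
# SPARK_HOME mirrors the path in the error log; adjust for your setup.
SPARK_HOME=/home/hadoop/local/spark-3.0.0/spark-3.0.0-bin-hadoop3.2

# Pre-create the work directory and confirm this user can write to it:
mkdir -p "$SPARK_HOME/work"
ls -ld "$SPARK_HOME/work"

# Alternatively, point the worker at a directory that is definitely
# writable by setting SPARK_WORKER_DIR in conf/spark-env.sh on every
# worker, then restart the workers:
#   export SPARK_WORKER_DIR=/home/hadoop/spark-work
```

If the parent directories differ between machines (for example, Spark is installed under a different path on server2), the worker cannot create the driver directory even with 777 on the local copy, so check the path on each worker node.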