I'm hitting an error when starting Spark on Kubernetes in client mode from JupyterHub.
21/10/05 03:54:33 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-0,5,main]
java.lang.NoSuchMethodError: 'java.lang.String org.slf4j.helpers.Util.safeGetSystemProperty(java.lang.String)'
at org.slf4j.impl.VersionUtil.getJavaMajorVersion(VersionUtil.java:11)
at org.slf4j.impl.Log4jMDCAdapter.<clinit>(Log4jMDCAdapter.java:37)
at org.slf4j.impl.StaticMDCBinder.getMDCA(StaticMDCBinder.java:59)
at org.slf4j.MDC.<clinit>(MDC.java:74)
at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$setMDCForTask(Executor.scala:740)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:432)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
21/10/05 03:54:33 INFO MemoryStore: MemoryStore cleared
21/10/05 03:54:33 INFO BlockManager: BlockManager stopped
21/10/05 03:54:33 INFO ShutdownHookManager: Shutdown hook called
I've also confirmed that the slf4j versions and jars are identical in the driver and the executors, since some articles suggest this error can be caused by a package version mismatch.
The jars installed are:
jcl-over-slf4j-1.7.30.jar
jul-to-slf4j-1.7.30.jar
slf4j-api-1.7.30.jar
slf4j-log4j12-1.7.30.jar
in both the executor and driver pods. Would you know if there's something I missed? Thank you.
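For anyone wanting to verify the same thing, one way to compare the slf4j jars across pods is a quick diff of the jar directories. The pod names (`spark-driver`, `spark-exec-1`) and the jar path (`/opt/spark/jars`) below are placeholders for this setup; adjust them to your deployment:

```shell
# List the slf4j-related jars in each pod and diff the results.
# Pod names and the jar directory are assumptions; substitute your own.
kubectl exec spark-driver -- ls /opt/spark/jars | grep slf4j > driver-jars.txt
kubectl exec spark-exec-1 -- ls /opt/spark/jars | grep slf4j > executor-jars.txt
diff driver-jars.txt executor-jars.txt   # empty output => same jar versions
```

Note that identical filenames don't guarantee identical contents; comparing checksums (e.g. `sha256sum` inside each pod) is a stricter check.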
CodePudding user response:
I was able to make it work. I built the executor image with Spark's build tool and added the new packages on top of that. I also made sure the same jars exist in both the driver pod (the Jupyter Notebook) and the executor pods.
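For reference, a sketch of what that build could look like, assuming the build tool used was the `docker-image-tool.sh` script that ships in the Spark distribution's `bin/` directory (registry name and tag below are placeholders):

```shell
# Build the standard Spark images from the Spark distribution root.
# Registry and tag are placeholders for this sketch.
cd /opt/spark
./bin/docker-image-tool.sh -r my-registry.example.com -t v3.1.2-custom build

# Extra packages can then be layered in a derived image, e.g.:
#   FROM my-registry.example.com/spark:v3.1.2-custom
#   COPY extra-jars/*.jar /opt/spark/jars/

# Push the result so spark.kubernetes.container.image can reference it.
./bin/docker-image-tool.sh -r my-registry.example.com -t v3.1.2-custom push
```

Building from Spark's own tooling keeps the jar set in the executors consistent with the Spark version the driver uses, which is exactly what the `NoSuchMethodError` above indicates was out of sync.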