spark-shell run command: java.lang.OutOfMemoryError: Java heap space


The following is the standard output. How should I solve this?
14/12/03 10:46:53 ERROR executor.Executor: Exception in task ID 1
java.lang.OutOfMemoryError: Java heap space
    at java.io.ObjectInputStream$HandleTable.grow(ObjectInputStream.java:3437)
    at java.io.ObjectInputStream$HandleTable.assign(ObjectInputStream.java:3244)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1762)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1964)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1888)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:369)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:63)
    at org.apache.spark.util.collection.ExternalAppendOnlyMap$DiskMapIterator.readNextItem(ExternalAppendOnlyMap.scala:384)
    at org.apache.spark.util.collection.ExternalAppendOnlyMap$DiskMapIterator.hasNext(ExternalAppendOnlyMap.scala:402)
    at scala.collection.Iterator$$anon$1.hasNext(Iterator.scala:847)
    at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.org$apache$spark$util$collection$ExternalAppendOnlyMap$ExternalIterator$$getMorePairs(ExternalAppendOnlyMap.scala:256)
    at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator$$anonfun$next$1.apply(ExternalAppendOnlyMap.scala:312)
    at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator$$anonfun$next$1.apply(ExternalAppendOnlyMap.scala:310)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.next(ExternalAppendOnlyMap.scala:310)
    at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.next(ExternalAppendOnlyMap.scala:226)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.foreach(ExternalAppendOnlyMap.scala:226)
    at org.apache.spark.shuffle.hash.HashShuffleWriter.write(HashShuffleWriter.scala:57)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:147)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:97)
    at org.apache.spark.scheduler.Task.run(Task.scala:51)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:187)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
    at java.lang.Thread.run(Thread.java:722)

CodePudding user response:

Try modifying the value of the SPARK_DAEMON_JAVA_OPTS option in spark-env.sh, for example:
-Xmx16g -Xms16g -Xmn256m -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:ParallelGCThreads=10 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Xloggc:/home/hadoop/software/spark/spark-1.1.0-bin-hadoop2.4/logs
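
A minimal sketch of where that setting would go, assuming a Spark 1.1.x standalone setup and reusing the values and log path from the line above; the 16g/4g/8g sizes are only illustrative, and the --driver-memory / --executor-memory flags and spark.executor.memory property mentioned in the comments are the usual knobs for the executor heap that actually overflows in the trace above:

# conf/spark-env.sh -- sketch based on the suggestion above, not tuned values.
# SPARK_DAEMON_JAVA_OPTS configures the standalone master/worker daemons.
# Note: -Xloggc: expects a file path, so a file name would normally be
# appended to the .../logs directory shown here.
export SPARK_DAEMON_JAVA_OPTS="-Xmx16g -Xms16g -Xmn256m -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:ParallelGCThreads=10 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Xloggc:/home/hadoop/software/spark/spark-1.1.0-bin-hadoop2.4/logs"

# The heap that overflows in the trace above belongs to the executor JVM
# (ERROR executor.Executor), so raising its memory usually matters more.
# With Spark 1.x this can be done when starting the shell:
#   ./bin/spark-shell --driver-memory 4g --executor-memory 8g
# or persistently in conf/spark-defaults.conf:
#   spark.executor.memory  8g

Since the error is thrown while ExternalAppendOnlyMap reads spilled shuffle data back in, increasing spark.default.parallelism (more, and therefore smaller, shuffle partitions) is another common way to keep each task's working set inside the heap.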