Dataset<Row> dataFrame = sparkSession.sql("select * from product");
dataFrame.toJavaRDD().foreachPartition(new VoidFunction<Iterator<Row>>() {
    private static final long serialVersionUID = 1L;

    @Override
    public void call(Iterator<Row> rows) throws Exception {
        try {
            System.out.print("hello world");
        } catch (Exception e) {
            // ignored
        }
    }
});
The error thrown at runtime:
18/11/13 21:31:18 ERROR yarn.ApplicationMaster: User class threw exception: org.apache.spark.SparkException: Task not serializable
org.apache.spark.SparkException: Task not serializable
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:298)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:288)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:108)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2101)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:925)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:924)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
    at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:924)
    at org.apache.spark.api.java.JavaRDDLike$class.foreachPartition(JavaRDDLike.scala:219)
    at org.apache.spark.api.java.AbstractJavaRDDLike.foreachPartition(JavaRDDLike.scala:45)
According to other posts, this error is usually caused by referencing external variables inside the function, but I am not referencing any external variables here. Why does this error still occur? It's a mystery to me. Could someone please point me in the right direction?
CodePudding user response:
Have a look at this article yourself: https://blog.csdn.net/achilles12345/article/details/77778431
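For quick reference (the linked article presumably covers the details): even though the closure references no external variables, an anonymous inner class keeps an implicit reference to its enclosing instance, so Spark tries to serialize the outer class as well, and if that outer class is not Serializable the job fails with "Task not serializable". One common workaround is to use a lambda (or a static nested class), since a lambda that touches no outer fields does not capture the enclosing instance. Below is a minimal sketch of that approach; the class name ProductJob, the appName, and the surrounding setup are illustrative and not from the original post:

import java.util.Iterator;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class ProductJob {  // hypothetical driver class, for illustration only

    public static void main(String[] args) {
        // Illustrative session setup; add enableHiveSupport() etc. as your environment requires.
        SparkSession sparkSession = SparkSession.builder()
                .appName("product-job")
                .getOrCreate();

        Dataset<Row> dataFrame = sparkSession.sql("select * from product");

        // A lambda that uses no fields of the enclosing class is compiled as a
        // standalone serializable closure, so the outer class is never pulled
        // into serialization and the closure cleaner is satisfied.
        dataFrame.toJavaRDD().foreachPartition((Iterator<Row> rows) -> {
            while (rows.hasNext()) {
                Row row = rows.next();
                System.out.println("hello world: " + row);
            }
        });

        sparkSession.stop();
    }
}

If you must keep the anonymous-class style, the same effect can be achieved by moving it into a static nested class, or by making the enclosing class implement java.io.Serializable.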