Using the collect function throws java.lang.ClassCastException: cannot assign instance of

Time:09-24

My input is a CSV file; each line looks like this:
HX332780 14/7/5, OTHER OFFENSE, PROBATION VIOLATION, PARKING LOT/GARAGE(NON.RESID.), Y, N, 1113
HX332854 14/7/5, OTHER OFFENSE, HARASSMENT BY TELEPHONE, APARTMENT, N, N, 1533
HX332743 14/7/5, CRIMINAL DAMAGE, TO VEHICLE, STREET, N, N, 1021
HX332735 14/7/5, THEFT, AND UNDER $500, RESTAURANT, N, N, 1014
...
The following is the simple processing code:
import org.apache.spark.{SparkConf, SparkContext}

object SparkPi {
  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setAppName("Spark Pi")
      .setMaster("spark://Master:7077")
      .setJars(List("/home/hadoop/Downloads/JetBrains IntelliJ.Xdowns/idea-IU-139.1117.1/spark-examples-1.5.2-hadoop2.6.0.jar"))
    val sc = new SparkContext(conf)
    val rawData = sc.textFile("/home/hadoop/123.csv")
    // take the fourth column from the end of each line
    val secondData = rawData.map(_.split(",").takeRight(4).head)
    // count occurrences of each distinct value
    val thirdData = secondData.map(n => (n, 1)).reduceByKey(_ + _).collect()
    sc.stop()
  }
}
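For reference, here is a minimal sketch of the same count running in local mode (the file path and column layout are taken from the post above; the object name and local[*] master are assumptions for testing). Because local mode keeps everything in one JVM, no jars are shipped to executors, which helps rule out the setJars/classpath side of the problem:

import org.apache.spark.{SparkConf, SparkContext}

object ColumnCountLocal {
  def main(args: Array[String]) {
    // local[*] keeps driver and tasks in one JVM: no jar shipping,
    // so driver/executor classpath mismatches cannot occur
    val conf = new SparkConf().setAppName("ColumnCountLocal").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val counts = sc.textFile("/home/hadoop/123.csv")   // path from the post
      .map(_.split(",").takeRight(4).head.trim)        // 4th column from the end
      .map(word => (word, 1))
      .reduceByKey(_ + _)
      .collect()
    counts.foreach(println)
    sc.stop()
  }
}

If this runs cleanly in local mode but fails on the cluster, the problem is most likely the jar listed in setJars (a stale build or a mismatched Spark/Scala version) rather than the counting logic.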
When I run it on the cluster, I get the following error:
15/12/09 22:11:09 WARN TaskSetManager: Lost task 0.0 in stage 1.0 (TID 1, 219.216.65.129): java.lang.ClassCastException: cannot assign instance of org.apache.spark.examples.SparkPi$$anonfun$2 to field org.apache.spark.rdd.RDD$$anonfun$flatMap$1$$anonfun$apply$4.cleanF$2 of type scala.Function1 in instance of org.apache.spark.rdd.RDD$$anonfun$flatMap$1$$anonfun$apply$4
...

Can anyone tell me where this goes wrong? If I remove the collect there is no error. I just want to count the frequency of the different values in the fourth column from the end of each row...
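One note on why removing collect() makes the error disappear (my reading, based on how Spark works, not stated in the thread): RDD transformations are lazy, so without an action no job is ever submitted and the closures are never serialized and shipped to executors. The exception only surfaces when a job actually runs:

// Transformations alone only build a lineage; nothing is executed or shipped yet
val counts = secondData.map(n => (n, 1)).reduceByKey(_ + _)

// An action (collect, count, saveAsTextFile, ...) submits a job; only now are
// the closures serialized and deserialized on the executors, which is where a
// driver/executor class mismatch shows up as ClassCastException
counts.collect()

So collect() is not the bug itself; it merely triggers execution, which exposes a mismatch between the classes compiled into the driver program and the classes in the jar the executors load.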

CodePudding user response:

Friend, did you solve this problem? Can you share the solution?

CodePudding user response:

It is said to be a compatibility problem between version 2.0.1 and the Scala version; with 2.0.0 there is no problem...
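If it really is a version mismatch, the usual fix is to compile against exactly the Spark and Scala versions the cluster runs. A hypothetical build.sbt sketch for the Spark 1.5.2 cluster from the question (the prebuilt Spark 1.5.x distributions target Scala 2.10; the exact versions below are assumptions and must match the cluster):

// build.sbt -- versions must match what the cluster actually runs
scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" % "provided"

After rebuilding, make sure setJars (or spark-submit) points at the freshly built application jar, not an old copy.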