Spark save and read Array[Byte] type

Time:09-23

I have serialized an object to the Array[Byte] type and saved it to a parquet file as StructField("byteArrayObject", ArrayType(ByteType), nullable = true). When I try to read it back using row.getAs[Array[Byte]]("byteArrayObject") I get the following error:

scala.collection.mutable.WrappedArray$ofRef cannot be cast to [B

Does anyone know what the problem is?

CodePudding user response:

Spark deserializes array columns as WrappedArray with boxed elements, not as a primitive Array[Byte], so the cast fails. Try the following:

import scala.collection.mutable.WrappedArray
import java.{lang => jl}

// Read the column as a WrappedArray of boxed java.lang.Byte,
// unbox each element, then take the underlying primitive array
row
  .getAs[WrappedArray[jl.Byte]]("byteArrayObject")
  .map(_.byteValue)
  .array
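For context, the cast fails because Spark hands the column back as a WrappedArray of boxed java.lang.Byte values rather than a primitive Array[Byte] (the "[B" in the error message is the JVM name for Array[Byte]). A minimal sketch of the same unboxing step outside Spark, where the WrappedArray is built by hand to stand in for what row.getAs would return:

```scala
import scala.collection.mutable.WrappedArray
import java.{lang => jl}

object UnboxSketch {
  def main(args: Array[String]): Unit = {
    // Stand-in for row.getAs[WrappedArray[jl.Byte]]("byteArrayObject"):
    // boxed bytes behind a WrappedArray, as Spark returns them
    val fromSpark: WrappedArray[jl.Byte] =
      WrappedArray.make(Array[AnyRef](jl.Byte.valueOf(1: Byte), jl.Byte.valueOf(2: Byte)))

    // Unbox each element, then take the underlying primitive array
    val bytes: Array[Byte] = fromSpark.map(_.byteValue).array

    println(bytes.mkString(","))
  }
}
```

As a design note, if the schema declared the column as BinaryType instead of ArrayType(ByteType), Spark would store and return it directly as Array[Byte], and no conversion would be needed.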