Convert column containing values as List to Array

I have a Spark DataFrame as below:

+------------------------------------------------------------------------+
|                              domains                                   |
+------------------------------------------------------------------------+
|["0b3642ab5be98c852890aff03b3f83d8","4d7a5a24426749f3f17dee69e13194a9", |
| "9d0f74269019ad82ae82cc7a7f2b5d1b","0b113db8e20b2985d879a7aaa43cecf6", |
| "d095db19bd909c1deb26e0a902d5ad92","f038deb6ade0f800dfcd3138d82ae9a9", |
| "ab192f73b9db26ec2aca2b776c4398d2","ff9cf0599ae553d227e3f1078957a5d3", |
| "aa717380213450746a656fe4ff4e4072","f3346928db1c6be0682eb9307e2edf38", |
| "806a006b5e0d220c2cf714789828ecf7","9f6f8502e71c325f2a6f332a76d4bebf", |
| "c0cb38016fb603e89b160e921eced896","56ad547c6292c92773963d6e6e7d5e39"] |
+------------------------------------------------------------------------+

The column contains its values as a list. I want to convert it into an Array[String], e.g.:

Array("0b3642ab5be98c852890aff03b3f83d8","4d7a5a24426749f3f17dee69e13194a9", "9d0f74269019ad82ae82cc7a7f2b5d1b","0b113db8e20b2985d879a7aaa43cecf6", "d095db19bd909c1deb26e0a902d5ad92","f038deb6ade0f800dfcd3138d82ae9a9", 
"ab192f73b9db26ec2aca2b776c4398d2","ff9cf0599ae553d227e3f1078957a5d3",
"aa717380213450746a656fe4ff4e4072","f3346928db1c6be0682eb9307e2edf38",
"806a006b5e0d220c2cf714789828ecf7","9f6f8502e71c325f2a6f332a76d4bebf",
"c0cb38016fb603e89b160e921eced896","56ad547c6292c92773963d6e6e7d5e39")

I tried the following code but I am not getting the intended results:

DF.select("domains").as[String].collect()

Instead I get this:

[Ljava.lang.String;@7535f28 ...

Any ideas how I can achieve this?

CodePudding user response:

You can first explode your domains column before collecting it, as follows:

import org.apache.spark.sql.functions.{col, explode}
import spark.implicits._ // provides the Encoder required by .as[String]

val result: Array[String] = DF.select(explode(col("domains"))).as[String].collect()

You can then print your result array using the mkString method:

println(result.mkString("[", ", ", "]"))
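
If your DataFrame holds just a single row, as in the question, another option is to read the array column directly from that row. A minimal sketch, assuming the domains column has type array<string>:

val singleRow: Array[String] =
  DF.select("domains")
    .first()                        // take the only row
    .getAs[Seq[String]]("domains")  // read the array column as a Seq[String]
    .toArray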

CodePudding user response:

Here you are actually already getting the Array[String] as expected. [Ljava.lang.String;@7535f28 is the JVM's internal type descriptor for that array object: [ represents an array, Ljava.lang.String represents the class java.lang.String, and the part after @ is the object's hash code in hex. Printing an array directly only shows this reference, not its contents.

If you want to print the array values as a string, you can use the .mkString() function.

import spark.implicits._

val data = Seq(Seq(
  "0b3642ab5be98c852890aff03b3f83d8", "4d7a5a24426749f3f17dee69e13194a9",
  "9d0f74269019ad82ae82cc7a7f2b5d1b", "0b113db8e20b2985d879a7aaa43cecf6",
  "d095db19bd909c1deb26e0a902d5ad92", "f038deb6ade0f800dfcd3138d82ae9a9"))

val df = spark.sparkContext.parallelize(data).toDF("domains")
// df: org.apache.spark.sql.DataFrame = [domains: array<string>]

val array_values = df.select("domains").as[String].collect()
// array_values: Array[String] = Array([0b3642ab5be98c852890aff03b3f83d8, 4d7a5a24426749f3f17dee69e13194a9, 9d0f74269019ad82ae82cc7a7f2b5d1b, 0b113db8e20b2985d879a7aaa43cecf6, d095db19bd909c1deb26e0a902d5ad92, f038deb6ade0f800dfcd3138d82ae9a9])

val string_value = array_values.mkString(",")

print(string_value)
// [0b3642ab5be98c852890aff03b3f83d8, 4d7a5a24426749f3f17dee69e13194a9, 9d0f74269019ad82ae82cc7a7f2b5d1b, 0b113db8e20b2985d879a7aaa43cecf6, d095db19bd909c1deb26e0a902d5ad92, f038deb6ade0f800dfcd3138d82ae9a9]

You can see the same behavior if you create a normal array as well:

scala> val array_values : Array[String] = Array("value1", "value2")
array_values: Array[String] = Array(value1, value2)

scala> print(array_values)
[Ljava.lang.String;@70bf2681

scala> array_values.foreach(println)
value1
value2
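
If you want the exact Array("...", "...") rendering shown in the question, the three-argument form of mkString (prefix, separator, suffix) can produce it. A small sketch reusing the array_values from the REPL session above:

scala> println(array_values.mkString("Array(\"", "\", \"", "\")"))
Array("value1", "value2")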