Scala Spark: How to convert an Array[(String,String,String)] to Map[String,Map[String,String]]


I would like to convert an Array[(String, String, String)] to a Map[String, Map[String, String]].

Example:

Input

val inputArray: Array[(String,String,String)] = Array(("elem1","elem2","elem3"),("elem4","elem5","elem6"))

Desired output

val outputMap: Map[String, Map[String, String]] = Map(
 ("elem1", Map("elem2" -> "elem3")),
 ("elem4", Map("elem5" -> "elem6"))
)

CodePudding user response:

First of all, a single-entry Map("elem2" -> "elem3") is arguably not the best fit here; if each inner map only ever holds one key-value pair, as in your example, a plain tuple would do just as well. In any case, you need to iterate over the array and accumulate a state of type Map[String, Map[String, String]], like this:

inputArray.foldLeft(Map.empty[String, Map[String, String]]) {
  case (updatingMap, (elem, key, value)) =>
    updatingMap.updated(elem, Map(key -> value))
}
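
If every outer key appears at most once, as in your example, a shorter alternative (just a sketch; the names outerKey/innerKey/value are only illustrative) is to map each tuple to a pair and call toMap. Like the foldLeft above, it keeps only the last entry if an outer key repeats:

val outputMap: Map[String, Map[String, String]] =
  inputArray.map { case (outerKey, innerKey, value) =>
    // Turn each triple into (outer key, single-entry inner map)
    outerKey -> Map(innerKey -> value)
  }.toMap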

Also, as it stands the question is not really about Spark: it is plain Scala collection manipulation. It only becomes a Spark question if you explain where Spark comes into it, or if you actually want to solve it with the Spark APIs over a DataFrame (a rough sketch of that is below).
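
If your data does live in a DataFrame, such a sketch could look like the following (assuming a local SparkSession, illustrative column names, and data small enough to collect back to the driver):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// Build a DataFrame from the tuples; column names are illustrative.
val df = inputArray.toSeq.toDF("outerKey", "innerKey", "value")

// Collect to the driver and assemble the nested map there.
val outputMap: Map[String, Map[String, String]] =
  df.collect()
    .map(row => row.getString(0) -> Map(row.getString(1) -> row.getString(2)))
    .toMap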
