Pass json file into JAR or read from spark session


I have a Spark UDF written in Scala, and I'd like to use my function together with some additional files.

import scala.io.Source
import org.json4s.jackson.JsonMethods.parse
import org.json4s.DefaultFormats

object consts {
    implicit val formats = DefaultFormats

    // Source.fromFile reads from a local filesystem path, which exists when
    // running from the project directory but not inside a packaged JAR.
    val my_map = parse(Source.fromFile("src/main/resources/map.json").mkString)
        .extract[Map[String, Map[String, List[String]]]]
}

Now I want to use the my_map object inside a UDF, so I basically do this:

import mypackage.consts.my_map  // `mypackage` stands in for the actual package name

object myUDFs {
    // ... define UDFs here that use my_map ...
}
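For illustration, one way such a UDF could look is sketched below; the package name, the lookup logic, and the column names are assumptions made for the example, not taken from the original post:

import org.apache.spark.sql.functions.{col, udf}
import mypackage.consts.my_map

object myUDFs {
    // Hypothetical lookup UDF: returns the list stored under (outer, inner)
    // in my_map, or an empty list when either key is missing.
    val lookup = udf { (outer: String, inner: String) =>
        my_map.getOrElse(outer, Map.empty[String, List[String]]).getOrElse(inner, Nil)
    }
}

Applied to a DataFrame df with string columns "outer" and "inner", this would be called as df.withColumn("values", myUDFs.lookup(col("outer"), col("inner"))).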

I've already tested my function locally and it works well. Now I want to understand how to pack the JAR file so that the .json file stays inside it.

Thank you.

CodePudding user response:

If you manage your project with Maven, you can place your .json file(s) under src/main/resources, as that's the default place where Maven looks for your project's resources.

You can also define a custom path for your resources, as described here: https://maven.apache.org/plugins/maven-resources-plugin/examples/resource-directory.html
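Once the file is under a resources directory, a quick way to verify that it actually ends up on the classpath is a check like the following (a minimal sketch, assuming the file sits at the resource root as map.json):

// getResource returns null when map.json was not packaged onto the classpath.
val url = getClass.getClassLoader.getResource("map.json")
require(url != null, "map.json is missing from the classpath")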

CodePudding user response:

UPD: I managed to do so by creating a fat JAR and reading my resource file this way:

    parse(
      Source
        .fromInputStream(
          getClass.getClassLoader.getResourceAsStream("map.json")
        )
        .mkString
    ).extract[Map[String, Map[String, List[String]]]]
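Putting it together, the consts object from the question could be rewritten as below; this sketch only swaps the filesystem read for a classpath read and closes the Source afterwards:

import scala.io.Source
import org.json4s.jackson.JsonMethods.parse
import org.json4s.DefaultFormats

object consts {
    implicit val formats = DefaultFormats

    val my_map: Map[String, Map[String, List[String]]] = {
        // getResourceAsStream resolves against the classpath, so it finds
        // map.json both locally and when packed inside the fat JAR.
        val source = Source.fromInputStream(
            getClass.getClassLoader.getResourceAsStream("map.json")
        )
        try parse(source.mkString).extract[Map[String, Map[String, List[String]]]]
        finally source.close()
    }
}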