Building scala libraries and using them in databricks


I have fair knowledge of Scala, and I use it in Databricks for my data engineering needs. I want to create some custom libraries that I can use in all my other notebooks. Here is what I'm looking for:

  1. Create a Scala notebook, helperfunctions.scala, which will have functions like ParseUrl(), GetUrl(), etc.

  2. Deploy these libraries on a Databricks cluster.

  3. Call these libraries from another notebook using something like 'import from helperfunctions as fn' and use the functions.

Can you give me an idea of how to get started? What does Databricks offer?

CodePudding user response:

I'd suggest not using notebooks as imports.

You can compile and package your functions as a JAR from plain JVM code using your preferred tooling, then publish it to a repository such as JitPack or GitHub Packages. From there you can add your utilities to the cluster as a Maven dependency, just like any other Spark dependency.
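As a minimal sketch, assuming sbt as the build tool: the names below (com.example, HelperFunctions, and parseUrl/getUrl standing in for your ParseUrl()/GetUrl()) are hypothetical placeholders, not anything Databricks prescribes.

```scala
// src/main/scala/com/example/HelperFunctions.scala
package com.example

import java.net.URI
import scala.util.Try

// Plain JVM object: no notebook or Databricks dependency needed.
object HelperFunctions {

  // Returns the host portion of a URL, if it parses.
  def parseUrl(url: String): Option[String] =
    Try(new URI(url).getHost).toOption.flatMap(Option(_))

  // Returns the query string of a URL, if present.
  def getUrl(url: String): Option[String] =
    Try(new URI(url).getQuery).toOption.flatMap(Option(_))
}
```

with a build.sbt along these lines:

```scala
// build.sbt -- pick the Scala version that matches your cluster's runtime
ThisBuild / organization := "com.example"
ThisBuild / scalaVersion := "2.12.15"

lazy val root = (project in file("."))
  .settings(
    name    := "helper-functions",
    version := "0.1.0"
  )
```

Once the artifact is published, you can attach it to the cluster as a Maven library (Cluster > Libraries > Install new > Maven, with a coordinate like com.example:helper-functions_2.12:0.1.0) and then import it from any notebook attached to that cluster:

```scala
// Any notebook on a cluster with the library installed:
import com.example.HelperFunctions

HelperFunctions.parseUrl("https://databricks.com/docs?x=1") // Some(databricks.com)
```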
