Spark (Decimal precision 1 exceeds max precision 0)

Time:09-22

I'm trying to get the count from a JDBC database with the following

def getCount(url: String, user: String, password: String, driver: String, table: String): BigDecimal = {
  Context.getSparkSession().read.format("jdbc")
    .option("url", url)
    .option("user", user)
    .option("password", password)
    .option("driver", driver)
    .option("dbtable", s"(SELECT COUNT(*) FROM $table)")
    .load()
    .head.getDecimal(0)
}

However, at the line where I call getDecimal(0), I get the exception Decimal precision 1 exceeds max precision 0.

What I've tried to debug this:

  1. I've verified the credentials, URL, and table name, and I'm able to access the database and get the count with these parameters.

  2. I've verified it's not a database grant issue by printing the dataframe with

Context.getSparkSession().read.format("jdbc")
  .option("url", url)
  .option("user", user)
  .option("password", password)
  .option("dbtable", s"(SELECT * FROM $table)")
  .load()

and I'm able to view the entire database table with no issues. I've also tried calling .count() on the DataFrame and returning the count that way, but to no avail.

Does anybody know what this issue is, or has anyone seen something like this before? I'm stumped. Yes, I'm aware there are similar questions here, but I've seen no resolutions, and none where the max precision is listed as 0, which is why I'm asking.

CodePudding user response:

Figured it out. The JDBC URL started with jdbc:Oracle instead of jdbc:oracle; correcting the case fixed it. I'm still puzzled why no connection exception occurred, and why I could print the table but not get the count. Oh well, it's resolved.
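For reference, a corrected version of the helper might look like the sketch below. It assumes the same `Context.getSparkSession()` wrapper from the question and a reachable Oracle database; note the lowercase `jdbc:oracle` scheme and the `dbtable` option name. Returning the count via `DataFrame.count()`, which yields a `Long`, also sidesteps the Oracle `NUMBER` to Spark `DecimalType` precision mapping entirely:

```scala
// Sketch only: assumes the question's Context wrapper and a live Oracle database.
def getCount(url: String, user: String, password: String, driver: String, table: String): Long = {
  Context.getSparkSession().read.format("jdbc")
    .option("url", url)         // must start with lowercase "jdbc:oracle:..."
    .option("user", user)
    .option("password", password)
    .option("driver", driver)   // e.g. "oracle.jdbc.OracleDriver"
    .option("dbtable", table)   // plain table name; Spark wraps it in its own subquery
    .load()
    .count()                    // Spark-side count, returns Long
}
```

Using `count()` pushes the aggregation to Spark rather than the database, so there is no decimal column to decode, at the cost of reading row metadata over JDBC.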
