How to read Snowflake table from Spark connector when the table was created with quotes around it?


I know how to read from Snowflake table with Spark connector like below:

df = spark.read.format("snowflake") \
               .options(**sfParams) # This is a dict with all the SF creds stored \
               .option('dbtable', 'TABLE1').load()

This works perfectly fine. But if the table was created with a quoted identifier in Snowflake, like CREATE TABLE DB1.SCHEMA1."MY.TABLE2", Spark is not able to parse the table name. I tried:

df = spark.read.format("snowflake") \
               .options(**sfParams) # This is a dict with all the SF creds stored \
               .option('dbtable', '"MY.TABLE2"').load()

But it throws an error: invalid URL prefix found in: 'MY.TABLE2'

CodePudding user response:

When an object is created with a quoted identifier, the double quotes have to be escaped when the name is referenced in code, so:

'"MY.TABLE2"'

should be:

'\"MY.TABLE2\"'