Use Snowpark python to unload snowflake data to S3. How to provide storage integration option


I am trying to unload Snowflake data to S3, and I have a storage integration set up for this. I can unload using a SQL query, but I want to do it with Snowpark Python.

DataFrameWriter.copy_into_location - this Snowpark method does not have any parameter for storage_integration, which leaves me clueless about how to get this unload job done with Snowpark!

Any help on this would be highly appreciated!

I tried the existing copy_into_location method with storage_integration='SI_NAME', but the internal SQL query threw an error:

Invalid value ''SI_NAME'' for property 'STORAGE_INTEGRATION'. String literal identifier is unsupported for this property. Please use an unquoted or double-quoted identifier.
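For context, a minimal sketch of the call that produced this error, assuming an existing Snowpark session and a hypothetical table MY_TABLE (keyword arguments beyond the named parameters are forwarded as COPY options, so the string value is rendered as a quoted literal):

  # Sketch of the failing attempt; MY_TABLE is a hypothetical table name.
  df = session.table("MY_TABLE")
  df.write.copy_into_location(
      "s3://mybucket/encrypted_files/",
      file_format_type="csv",
      storage_integration="SI_NAME",  # forwarded as STORAGE_INTEGRATION = 'SI_NAME'
  )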

CodePudding user response:

You are right; DataFrameWriter.copy_into_location does not have a storage integration parameter.

Instead, you can create an external stage object that points to your S3 location and uses your storage integration:

  create stage my_stage_s3
  storage_integration = my_storage_int
  url = 's3://mybucket/encrypted_files/'
  file_format = my_format;
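The stage only needs to be created once. If you prefer to keep everything in Python, the same DDL can be run from Snowpark via session.sql; a minimal sketch, assuming an existing Snowpark session and the names from the example above:

  # One-time setup: create the stage from Snowpark (names match the SQL above).
  session.sql("""
      create stage if not exists my_stage_s3
      storage_integration = my_storage_int
      url = 's3://mybucket/encrypted_files/'
      file_format = my_format
  """).collect()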

Then, in your copy_into_location call, specify the location as "@my_stage_s3/".
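Putting it together, a minimal sketch of the unload itself, assuming an existing Snowpark session and a hypothetical source table MY_TABLE:

  # The stage carries the storage integration and file format, so the
  # writer itself needs neither; extra keyword arguments become COPY options.
  df = session.table("MY_TABLE")
  copy_result = df.write.copy_into_location(
      "@my_stage_s3/unload/",  # path under the stage; the subfolder is optional
      overwrite=True,          # COPY option, forwarded into the generated SQL
  )
  print(copy_result)           # list of Row objects describing the unloaded files

Because the storage integration is attached to the stage, copy_into_location never needs a storage_integration argument at all.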
