I am trying to COPY data (VARIANT data type, i.e. JSON) from Snowflake to S3, using the command below:
copy into @STG_SF_S3_DEV_JJC/sample_file1.json
from (select distinct RECORD_CONTENT from MYTABLE where LOAD_DT_TS >= '2022-02-09 00:00:00')
FILE_FORMAT = (TYPE=JSON,COMPRESSION = NONE)
SINGLE=true
header=true
OVERWRITE = TRUE;
The data is getting copied. I am using a STORAGE_INTEGRATION and creating a stage. My questions are:
- The above query returns 23 rows, and I want to load these 23 rows as 23 separate JSON files in S3. How can I achieve this?
- Each record has a different LOAD_DT_TS, so each of the 23 files needs a different name; the file name should end with LOAD_DT_TS. How can I achieve this?
- MYTABLE in the above COPY command has 4 columns. Is there any way to load all 4 columns into a single JSON file?
Please share your thoughts.
CodePudding user response:
Please refer to the example below (it covers points #1 and #2).
SNOWFLAKE1#COMPUTE_WH@TEST_DB.PUBLIC>select * from test_pivot;
+---------+-----------+-----------+-----------+
| COMPANY | BU        | EVAL_DESC | MEMBER_ID |
|---------+-----------+-----------+-----------|
| C1      | FINANCIAL | L1        | ID1       |
| C1      | FINANCIAL | L2        | ID2       |
| C1      | FINANCIAL | L3        | ID3       |
| C1      | HR        | L1        | ID4       |
| C1      | HR        | L2        | ID5       |
| C2      | FINANCIAL | L1        | ID6       |
| C2      | BUSINESS  | L1        | ID7       |
+---------+-----------+-----------+-----------+
7 Row(s) produced. Time Elapsed: 0.187s
SNOWFLAKE1#COMPUTE_WH@TEST_DB.PUBLIC>list @test_row_stage;
+------+------+-----+---------------+
| name | size | md5 | last_modified |
|------+------+-----+---------------|
+------+------+-----+---------------+
0 Row(s) produced. Time Elapsed: 0.177s
SNOWFLAKE1#COMPUTE_WH@TEST_DB.PUBLIC>EXECUTE IMMEDIATE $$
DECLARE
    company varchar2(30);
    BU varchar2(30);
    eval_desc varchar2(30);
    member_id varchar2(30);
    file_name varchar2(30);
    c1 CURSOR FOR SELECT * FROM test_pivot;
BEGIN
    -- the FOR loop opens and closes the cursor implicitly
    FOR record IN c1 DO
        company   := record.company;
        BU        := record.BU;
        eval_desc := record.eval_desc;
        member_id := record.member_id;
        file_name := 'load' || '_' || member_id || '.csv';
        -- one-row staging table, rebuilt on every iteration
        CREATE OR REPLACE TEMPORARY TABLE temp_test_pvt (company varchar2(30), BU varchar2(30), eval_desc varchar2(30), member_id varchar2(30));
        INSERT INTO temp_test_pvt VALUES (:company, :BU, :eval_desc, :member_id);
        EXECUTE IMMEDIATE 'copy into @test_row_stage/' || :file_name || ' from (select * from temp_test_pvt) overwrite=false';
    END FOR;
    RETURN 0;
END;
$$
;
+-----------------+
| anonymous block |
|-----------------|
|               0 |
+-----------------+
1 Row(s) produced. Time Elapsed: 9.803s
SNOWFLAKE1#COMPUTE_WH@TEST_DB.PUBLIC>list @test_row_stage;
+------------------------------------------+------+----------------------------------+-------------------------------+
| name                                     | size | md5                              | last_modified                 |
|------------------------------------------+------+----------------------------------+-------------------------------|
| test_row_stage/load_ID1.csv_0_0_0.csv.gz |   48 | df314a0d95a771d5d81fa9b1cfb3a28e | Thu, 17 Mar 2022 16:47:42 GMT |
| test_row_stage/load_ID2.csv_0_0_0.csv.gz |   48 | 6be39868046f583b1c63d616faa9e7f6 | Thu, 17 Mar 2022 16:47:43 GMT |
| test_row_stage/load_ID3.csv_0_0_0.csv.gz |   48 | ecf9dbcb4e45fa29d6bcfe268c6ccae3 | Thu, 17 Mar 2022 16:47:44 GMT |
| test_row_stage/load_ID4.csv_0_0_0.csv.gz |   48 | 0fd3ff7e0a453e04be3aca22147a7d32 | Thu, 17 Mar 2022 16:47:45 GMT |
| test_row_stage/load_ID5.csv_0_0_0.csv.gz |   48 | c77985f8312a540816d82b4bf4ec5249 | Thu, 17 Mar 2022 16:47:46 GMT |
| test_row_stage/load_ID6.csv_0_0_0.csv.gz |   48 | c9e9d6fd613a8bdb76413dd3e9464cc4 | Thu, 17 Mar 2022 16:47:47 GMT |
| test_row_stage/load_ID7.csv_0_0_0.csv.gz |   48 | 4e4b999ed56059b44ee6bd15f28cafb8 | Thu, 17 Mar 2022 16:47:48 GMT |
+------------------------------------------+------+----------------------------------+-------------------------------+
7 Row(s) produced. Time Elapsed: 0.176s
SNOWFLAKE1#COMPUTE_WH@TEST_DB.PUBLIC>select $1,$2,$3,$4 from @test_row_stage/load_ID1.csv_0_0_0.csv.gz;
+----+-----------+----+-----+
| $1 | $2        | $3 | $4  |
|----+-----------+----+-----|
| C1 | FINANCIAL | L1 | ID1 |
+----+-----------+----+-----+
1 Row(s) produced. Time Elapsed: 0.429s
SNOWFLAKE1#COMPUTE_WH@TEST_DB.PUBLIC>select $1,$2,$3,$4 from @test_row_stage/load_ID2.csv_0_0_0.csv.gz;
+----+-----------+----+-----+
| $1 | $2        | $3 | $4  |
|----+-----------+----+-----|
| C1 | FINANCIAL | L2 | ID2 |
+----+-----------+----+-----+
1 Row(s) produced. Time Elapsed: 0.374s
SNOWFLAKE1#COMPUTE_WH@TEST_DB.PUBLIC>select $1,$2,$3,$4 from @test_row_stage/load_ID3.csv_0_0_0.csv.gz;
+----+-----------+----+-----+
| $1 | $2        | $3 | $4  |
|----+-----------+----+-----|
| C1 | FINANCIAL | L3 | ID3 |
+----+-----------+----+-----+
1 Row(s) produced. Time Elapsed: 0.506s
SNOWFLAKE1#COMPUTE_WH@TEST_DB.PUBLIC>select $1,$2,$3,$4 from @test_row_stage/load_ID4.csv_0_0_0.csv.gz;
+----+----+----+-----+
| $1 | $2 | $3 | $4  |
|----+----+----+-----|
| C1 | HR | L1 | ID4 |
+----+----+----+-----+
1 Row(s) produced. Time Elapsed: 0.281s
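Point #2 (naming each file after its LOAD_DT_TS) can be folded into the same cursor-loop pattern. Below is an untested sketch adapted to the question: MYTABLE, RECORD_CONTENT, LOAD_DT_TS, and the stage name come from the question itself, while the TO_CHAR format strings and SINGLE=true are assumptions (SINGLE=true makes COPY use the given file name verbatim instead of appending a `_0_0_0` suffix). Note the equality predicate round-trips the timestamp through text, so if LOAD_DT_TS carries fractional seconds the format string would need to include them.

```sql
EXECUTE IMMEDIATE $$
DECLARE
    load_ts timestamp;
    file_name varchar;
    -- one cursor row per target file; each LOAD_DT_TS is distinct per the question
    c1 CURSOR FOR
        SELECT DISTINCT LOAD_DT_TS
        FROM MYTABLE
        WHERE LOAD_DT_TS >= '2022-02-09 00:00:00';
BEGIN
    FOR record IN c1 DO
        load_ts := record.LOAD_DT_TS;
        -- file name ends with the record's LOAD_DT_TS; format string is an assumption
        file_name := 'sample_file_' || TO_CHAR(load_ts, 'YYYYMMDDHH24MISS') || '.json';
        EXECUTE IMMEDIATE
            'copy into @STG_SF_S3_DEV_JJC/' || file_name ||
            ' from (select RECORD_CONTENT from MYTABLE' ||
            ' where LOAD_DT_TS = ''' || TO_CHAR(load_ts, 'YYYY-MM-DD HH24:MI:SS') || ''')' ||
            ' FILE_FORMAT = (TYPE=JSON, COMPRESSION = NONE) SINGLE=true OVERWRITE=TRUE';
    END FOR;
    RETURN 0;
END;
$$;
```

Selecting by LOAD_DT_TS avoids the per-iteration temporary table used above, since the timestamp alone identifies the row to unload.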
CodePudding user response:
This works for passing a predicate value into the dynamic SQL, with the quote placement handled:
SNOWFLAKE1#COMPUTE_WH@TEST_DB.PUBLIC>EXECUTE IMMEDIATE $$
DECLARE
    load_dt_tss timestamp;
    file_name varchar2(30);
BEGIN
    file_name := 'somefile' || '.csv';
    EXECUTE IMMEDIATE 'copy into @test_row_stage/' || :file_name || ' from (select LOAD_DT_TS from TEST_TS where LOAD_DT_TS >= ' || '''2022-02-09 00:00:00''' || ') FILE_FORMAT = (TYPE=CSV, COMPRESSION = NONE) overwrite=FALSE';
    RETURN 0;
END;
$$
;
+-----------------+
| anonymous block |
|-----------------|
|               0 |
+-----------------+
1 Row(s) produced. Time Elapsed: 0.584s
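For point #3 (all four columns of MYTABLE in one JSON file), a sketch: OBJECT_CONSTRUCT(*) packs every column of a row into a single JSON object keyed by column name, so the COPY from the question can simply wrap the select. The output file name here is illustrative.

```sql
-- OBJECT_CONSTRUCT(*) emits one JSON object per row;
-- SINGLE=true writes the whole result set into one file, named exactly as given
copy into @STG_SF_S3_DEV_JJC/all_columns.json
from (
    select OBJECT_CONSTRUCT(*)
    from MYTABLE
    where LOAD_DT_TS >= '2022-02-09 00:00:00'
)
FILE_FORMAT = (TYPE=JSON, COMPRESSION = NONE)
SINGLE=true
OVERWRITE = TRUE;
```

To control the key names or drop a column, list the pairs explicitly instead, e.g. OBJECT_CONSTRUCT('record', RECORD_CONTENT, 'load_dt_ts', LOAD_DT_TS).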