I have an array of records in my Snowflake table like:
select * from dw.public.arr_table;
['val1'\t'val2'\t'val3'\t'val4'\t'val5'\t, 'val6'\t'val7'\t'val8'\t'val9'\t'val10'\t ]
['val1'\t'val2'\t'val3'\t'val4'\t'val5'\t, 'val6'\t'val7'\t'val8'\t'val9'\t'val10'\t ]
['val1'\t'val2'\t'val3'\t'val4'\t'val5'\t, 'val6'\t'val7'\t'val8'\t'val9'\t'val10'\t ]
['val1'\t'val2'\t'val3'\t'val4'\t'val5'\t, 'val6'\t'val7'\t'val8'\t'val9'\t'val10'\t ]
Each record is an array. How can I iterate through each array and select val1 & val6? I expect the result to be:
col1 | col2
val1 | val6
val1 & val6 are under columns col1 and col2 respectively.
CodePudding user response:
The way you formatted your sample data doesn't make sense. I assume you want:
select split(['val1\tval2\tval3\tval4\tval5\t', 'val6\tval7\tval8\tval9\tval10\t'][0],'\t')[0]::varchar as col1,
split(['val1\tval2\tval3\tval4\tval5\t', 'val6\tval7\tval8\tval9\tval10\t'][1],'\t')[0]::varchar as col2
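Applied to your table rather than to a literal, that would look something like this (a sketch only, assuming the array sits in a single ARRAY/VARIANT column, here called arr_col, which is a guessed name; swap in your real column name):
-- Sketch: same SPLIT approach against the table, assuming a single array
-- column named arr_col (hypothetical name).
select split(arr_col[0]::varchar, '\t')[0]::varchar as col1,
       split(arr_col[1]::varchar, '\t')[0]::varchar as col2
from dw.public.arr_table;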
CodePudding user response:
So if your data is "a string-encoded array with tab-separated values", then something like this CTE recreates your data:
with cte(col) as (
select parse_json(column1) from values
('["val1\tval2\tval3\tval4\tval5\t", "val6\tval7\tval8\tval9\tval10\t"]')
)
This gives us an array with two entries. If we always have two entries, we can access them with hard-coded array indexes:
select
col
,col[0] as a0
,col[1] as a1
from cte
giving:
COL | A0 | A1 |
---|---|---|
[ "val1\tval2\tval3\tval4\tval5\t", "val6\tval7\tval8\tval9\tval10\t" ] | "val1\tval2\tval3\tval4\tval5\t" | "val6\tval7\tval8\tval9\tval10\t" |
We can then split those strings into their own arrays, but given you only wanted the first value, we can use STRTOK to just take the first token:
select
col
,col[0] as a0
,col[1] as a1
,strtok(a0, '\t', 1) as a00
,strtok(a1, '\t', 1) as a10
from cte
COL | A0 | A1 | A00 | A10 |
---|---|---|---|---|
[ "val1\tval2\tval3\tval4\tval5\t", "val6\tval7\tval8\tval9\tval10\t" ] | "val1\tval2\tval3\tval4\tval5\t" | "val6\tval7\tval8\tval9\tval10\t" | val1 | val6 |
This feels like what you are asking for; if not, feel free to comment on the gap in the problem/need.