What is the most efficient way to parse JSON fields in PL/SQL?

Time:04-30

Imagine the table my_table with the rows:

COLUMN_1       COLUMN_2                                       COLUMN_3
"test_1212"    {date: 1646240118, name: "John", age: null}    "test_2311"
"test_998"     null                                           "test_26351"
"test_56551"   {age: 20}                                      "test_3323"

Here, COLUMN_2 is either JSON or null. Each JSON field could have a value, be null, or be missing entirely.

I'm writing a query that unpacks the JSON fields if they exist and have a value. Below is what I have; I'm curious whether having multiple CASE expressions could affect performance, and whether there is a better way to do this.

Note that the above is just an example; the actual JSON could have up to 10 fields. Optionally, I can unpack the JSON on the code side instead of in SQL if that improves performance.

SELECT COLUMN_1,
       CASE WHEN json_value(COLUMN_2, '$.date') IS NOT NULL THEN json_value(COLUMN_2, '$.date')
           END AS "DATE",  -- DATE is a reserved word, so the alias must be quoted
       CASE WHEN json_value(COLUMN_2, '$.name') IS NOT NULL THEN json_value(COLUMN_2, '$.name')
           END AS name,
       CASE WHEN json_value(COLUMN_2, '$.age') IS NOT NULL THEN json_value(COLUMN_2, '$.age')
           END AS age,
       COLUMN_3
FROM my_table

I am using Oracle version 12.2.0.1.0 (12c).
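By "unpack on the code side" I mean roughly the following Python sketch (the row tuples mirror the sample data above, with the JSON keys quoted as Python's `json` module requires strict JSON; column and field names are just for illustration):

```python
import json

# Sample rows mirroring my_table; COLUMN_2 may be None or a JSON string.
rows = [
    ("test_1212", '{"date": 1646240118, "name": "John", "age": null}', "test_2311"),
    ("test_998", None, "test_26351"),
    ("test_56551", '{"age": 20}', "test_3323"),
]

FIELDS = ("date", "name", "age")

def unpack(row):
    col1, col2, col3 = row
    # Treat a NULL column and a missing or null JSON field the same way: None.
    doc = json.loads(col2) if col2 else {}
    return {"column_1": col1, "column_3": col3,
            **{f: doc.get(f) for f in FIELDS}}

result = [unpack(r) for r in rows]
```

Here `dict.get` returns None for both absent and null fields, which matches how the SQL query collapses "missing" and "null" into the same output.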

CodePudding user response:

You can use JSON_TABLE to parse the field values:

with data_cte (col1, col2, col3) as (
  select 'COLUMN_1', json_object('date' value 1646240118, 'name' value 'John', 'age' value null), 'COLUMN_3' from dual union all
  select 'COLUMN_1', null, 'COLUMN_3' from dual union all
  select 'COLUMN_1', json_object('date' value null, 'name' value 'Tom', 'age' value 20), 'COLUMN_3' from dual
)
select col1, t.*, col3
from data_cte,
     json_table(col2, '$'
       columns (
         date_1 varchar2(10) path '$.date',
         name   varchar2(21) path '$.name',
         age    varchar2(10) path '$.age'
       )
     ) t;

COL1      DATE_1       NAME   AGE   COL3
--------  -----------  -----  ----  --------
COLUMN_1  1646240118   John         COLUMN_3
COLUMN_1               Tom    20    COLUMN_3

(Note that the row whose JSON column is null produces no output with this comma join.)

CodePudding user response:

You can use a JSON_TABLE with OUTER APPLY:

SELECT column_1,
       column_3,
       j.*
FROM   my_table m
       OUTER APPLY JSON_TABLE(
         m.column_2,
         '$'
         COLUMNS (
           dt   NUMBER       PATH '$.date',
           name VARCHAR2(50) PATH '$.name',
           age  NUMBER       PATH '$.age'
         )
       ) j;

Or, if you prefer an ANSI-standard LATERAL join to the proprietary OUTER APPLY:

SELECT column_1,
       column_3,
       j.*
FROM   my_table m
       LEFT OUTER JOIN LATERAL (
         SELECT *
         FROM   JSON_TABLE(
                  m.column_2,
                  '$'
                  COLUMNS (
                    dt   NUMBER       PATH '$.date',
                    name VARCHAR2(50) PATH '$.name',
                    age  NUMBER       PATH '$.age'
                  )
                )
       ) j
       ON (1 = 1);

Which, for the sample data:

CREATE TABLE my_table (COLUMN_1, COLUMN_2, COLUMN_3) AS
SELECT 'test_1212',  '{date: 1646240118, name: "John", age: null}', 'test_2311' FROM DUAL UNION ALL
SELECT 'test_998',   null, 'test_26351' FROM DUAL UNION ALL
SELECT 'test_56551', '{age: 20}', 'test_3323' FROM DUAL;

Both output:

COLUMN_1     COLUMN_3     DT           NAME   AGE
-----------  -----------  -----------  -----  ----
test_1212    test_2311    1646240118   John   null
test_998     test_26351   null         null   null
test_56551   test_3323    null         null   20

