jOOQ DSL for batch insert of maps, arrays and so forth


I'm hoping to use the jOOQ DSL to do batch inserts into Postgres. I know it's possible, but I'm having issues getting the data formatted properly.

dslContext.loadInto(table).loadJSON(jsonData).fields(...).execute();

is where I'm starting from. The tricky part seems to be getting a Map<String, String> into a jsonb column.
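For reference, here is a minimal, self-contained sketch of how such a map could be flattened into the escaped JSON string that appears in the sample data further down. Jackson and the class/variable names here are assumptions for illustration only; jOOQ doesn't mandate any particular JSON library:

    import com.fasterxml.jackson.databind.ObjectMapper;
    import java.util.Map;

    public class ChangesToJson {
        public static void main(String[] args) throws Exception {
            // Serialize the map once, so it can be embedded as a single
            // (escaped) string value inside the loader's JSON payload.
            ObjectMapper mapper = new ObjectMapper();
            Map<String, String> changes = Map.of("key0", "val0");
            String changesJson = mapper.writeValueAsString(changes);
            System.out.println(changesJson); // {"key0":"val0"}
        }
    }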

I have the data formatted according to this description, and jOOQ seems to be OK with it... until the map/JSON-in-JSON shows up.

Another JSON-array column still needs to be dealt with, too.

Questions:

  1. Is this a reasonable approach?
  2. If not, what would you recommend instead?

Error(s) I'm seeing:

ERROR: column "changes_to" is of type jsonb but expression is of type character varying
Hint: You will need to rewrite or cast the expression.

Edit:

    try (DSLContext context = DSL.using(pgClient.getDataSource(), SQLDialect.POSTGRES_10)) {
        context.loadInto(table(RECORD_TABLE))
                .loadJSON(jsonData)
                .fields(field(name(RECORD_ID_COLUMN)),
                        field(name(OTHER_ID_COLUMN)),
                        field(name(CHANGES_TO_COLUMN)),
                        field(name(TYPE_COLUMN)),
                        IDS_FIELD)
                .execute();
    } catch (IOException e) {
        throw new RuntimeException(e);
    }

with this JSON data:

{"fields":[{"name":"rec_id","type":"VARCHAR"},{"name":"other_id","type":"VARCHAR"},{"name":"changes_to","type":"jsonb"},{"name":"en_type","type":"VARCHAR"},{"name":"ids","type":"BIGINT[]"}],"records":[["recid","crmid","{\"key0\":\"val0\"}","ent type",[10,11,12]],["recid2","crmid2","{\"key0\":\"val0\"}","ent type2",[10,11,12]]]}

The problem is how to format the values for the 'changes_to' and 'ids' columns.

CodePudding user response:

There's a certain price to pay if you're not using jOOQ's code generator (and you should!). jOOQ doesn't know what data type your columns are if you create a field(name("...")), so it won't be able to bind your values correctly. Granted, the Loader API could read the type information from the JSON header, but it currently doesn't.

Instead, why not just either:

  • Provide explicit type information to your column references, like field(name(CHANGES_TO_COLUMN), SQLDataType.JSONB) (see the sketch after this list)
  • Much better: use the code generator, in which case you already have all the type information associated with your Field expressions.
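
For the first option, here's a minimal sketch of the question's loader call with explicit types attached. It assumes jOOQ 3.12+ (for SQLDataType.JSONB), the usual static imports from org.jooq.impl.DSL, and a hypothetical IDS_COLUMN name constant standing in for the question's IDS_FIELD:

    try (DSLContext context = DSL.using(pgClient.getDataSource(), SQLDialect.POSTGRES_10)) {
        context.loadInto(table(RECORD_TABLE))
                .loadJSON(jsonData)
                .fields(field(name(RECORD_ID_COLUMN), SQLDataType.VARCHAR),
                        field(name(OTHER_ID_COLUMN), SQLDataType.VARCHAR),
                        // explicit JSONB type information lets jOOQ bind the
                        // nested JSON string as jsonb instead of varchar
                        field(name(CHANGES_TO_COLUMN), SQLDataType.JSONB),
                        field(name(TYPE_COLUMN), SQLDataType.VARCHAR),
                        // the array column gets an explicit array type, too
                        field(name(IDS_COLUMN), SQLDataType.BIGINT.getArrayDataType()))
                .execute();
    } catch (IOException e) {
        throw new RuntimeException(e);
    }

With the code generator, none of this is necessary: you would pass the generated fields (e.g. a generated RECORD.CHANGES_TO) to fields(), and the jsonb and BIGINT[] type information travels with them.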