Postgresql ignoring default value on bulk import


I have a PostgreSQL database running in a Docker container. I am trying to bulk import a CSV full of postcodes. This is a Django project.

They all refer to the same state, "NSW", so that column doesn't exist in the CSV.

My model (this is Django) has the following definition for the state column:

state = models.CharField(max_length=200, default="NSW", null=False, blank=False)

But when I try to execute the COPY I get the following error:

postgres=# \COPY datacollector_localities(name, postcode) FROM '/postcodes-nsw.csv' DELIMITER ',' CSV HEADER;
ERROR:  null value in column "state" of relation "datacollector_localities" violates not-null constraint
DETAIL:  Failing row contains (5, Condobolin, 2877, null).
CONTEXT:  COPY datacollector_localities, line 2: "Condobolin,2877"

I assumed that leaving state out of the column list would make Postgres use the default. What am I doing wrong?

CodePudding user response:

As stated in the docs, Django default values are applied when creating an instance of your model, not at the database level.

Similar to auto_now, these are not handled with a database default or trigger, because that would depend on the type of database used.
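
For example, a minimal sketch (assuming the model class is named Locality, matching the datacollector_localities table) showing that the default is filled in by Django at instantiation, not by PostgreSQL:

loc = Locality(name="Condobolin", postcode="2877")
print(loc.state)  # "NSW" -- the default is applied by Django when the instance is created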

I do not know how your CSV import works (and you didn't share the code), but you'll need to do something like:

instances = []
for row in my_csv:
    # row.data is assumed to be a dict keyed by model field name
    instances.append(MyModel(**row.data))
# defaults are applied as each instance is created, then bulk_create inserts them
MyModel.objects.bulk_create(instances)

Assuming row.data is a dict with column labels as keys.
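
Putting it together, a minimal sketch using Python's csv module (the Locality model name, the datacollector app path, and the name/postcode CSV headers are assumptions based on the question):

import csv

from datacollector.models import Locality  # assumed app/model names

def import_postcodes(path):
    instances = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):  # each row is a dict keyed by the CSV header
            # state is omitted here, so Django fills in default="NSW" when the
            # instance is created
            instances.append(Locality(name=row["name"], postcode=row["postcode"]))
    Locality.objects.bulk_create(instances)

Run this from a management command or the Django shell so the app registry is loaded before the model is imported.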
