SQLite - skip the LAST line of a very large csv file while importing into a table

Time:04-14

In a CSV file we receive, the first line is a header with the column names, and the last line is a custom trailer record that just notes the number of records, file names, etc.

We would like to load this large file into SQLite so we can eyeball some of the columns, since Excel struggles with files this size. Tools like UltraEdit can open the file, but checking particular column values in it is a pain.

Can SQLite skip this last line, or does it have to be trimmed before loading? We would prefer not to modify the file.

Environment: Windows.

CodePudding user response:

You can use the SQLite CLI program to import your CSV. In its simplest form:

sqlite3 -separator ; test.db ".import test.csv test_table"

and with a test.csv like this:

A;B;C
a;b;c
d;e;f
g;h;i
summary line

the import will complete, and only produce a warning:

test.csv:5: expected 3 columns but found 1 - filling the rest with NULL

The result is this test_table:

A        B     C
-------  ----  ----
a        b     c
d        e     f
g        h     i
summary  NULL  NULL

It would be easy to eliminate the last record with a DELETE statement.
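For example, here is a minimal sketch: because the trailer row was padded with NULLs during import, deleting the rows where those columns are NULL removes it. The column names A, B, C are the ones from the sample CSV above; the setup uses POSIX-shell commands, so on Windows you would create test.csv however you like.

```shell
# Recreate the sample CSV from above and import it (the separator is
# quoted so this also works in POSIX shells; on Windows cmd it need not be).
printf 'A;B;C\na;b;c\nd;e;f\ng;h;i\nsummary line\n' > test.csv
sqlite3 -separator ';' test.db ".import test.csv test_table"

# The trailer is the only row whose padded columns are NULL - delete it.
sqlite3 test.db "DELETE FROM test_table WHERE B IS NULL AND C IS NULL;"
sqlite3 test.db "SELECT COUNT(*) FROM test_table;"   # prints 3
```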


By default, the .import dot-command interprets its first argument as a filename, but when that argument starts with a pipe character (|), the rest of it is treated as a shell command whose output is imported.

This means you can pre-process the CSV file to eliminate the summary line, for example with findstr -v:

sqlite3 -separator ; test.db ".import '|findstr -v summary < test.csv' test_table"

You could even make a multi-line init script comprising dot commands and SQL, which sets up your database and massages your input data into shape. This is more flexible than cramming it all onto the command line.

sqlite3 -init import_script.txt test.db
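A hypothetical import_script.txt for this example might look like the following; the findstr filter and the table name are just the ones used above, and the trailing SELECT is an optional sanity check:

```
.separator ;
.import '|findstr -v summary < test.csv' test_table
-- follow-up SQL can go here, e.g. a row-count sanity check:
SELECT COUNT(*) FROM test_table;
```

The sqlite3 -init invocation above reads this script before presenting the prompt, so the table is ready as soon as the shell opens.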