It can be done with Postgres, just not with COPY alone.
Use a temporary staging table like this:
CREATE TEMP TABLE target_tmp AS
TABLE target_tbl LIMIT 0; -- create temp table with same columns as target table
COPY target_tmp FROM '/absolute/path/to/file' (FORMAT csv);
INSERT INTO target_tbl
SELECT * FROM target_tmp -- caution: without ORDER BY the row order is not guaranteed, though a freshly filled temp table typically returns rows in insertion order
OFFSET 8; -- skip the first 8 rows, i.e. start with line 9
DROP TABLE target_tmp; -- optional, else it's dropped at end of session automatically
The skipped rows must be valid, too: COPY loads the entire file into the staging table before any rows are discarded.
Obviously, this is more expensive, which should not matter much with small to medium tables. It does matter with big tables; then you really should trim the surplus rows from the input file before importing.
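Trimming the surplus rows can be done on the fly with `tail`. A minimal sketch (the file names `input.csv` / `trimmed.csv` are placeholders for illustration):

```shell
# Build a sample file: an 8-line preamble to skip, then the real CSV rows.
printf 'skip\n%.0s' 1 2 3 4 5 6 7 8 > input.csv
printf 'a,1\nb,2\n' >> input.csv

# tail -n +9 prints from line 9 onward, i.e. drops the first 8 lines.
tail -n +9 input.csv > trimmed.csv
```

Postgres can also run the trim itself with `COPY target_tbl FROM PROGRAM 'tail -n +9 /absolute/path/to/file' (FORMAT csv)`, but `FROM PROGRAM` needs superuser rights (or membership in `pg_execute_server_program`).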
Make sure your temp_buffers setting is big enough to hold the temp table, to minimize the performance penalty.
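Note that temp_buffers can only be changed before the first access to temporary tables in a session. A sketch (the 500MB figure is an arbitrary example, size it to your file):

```sql
SET temp_buffers = '500MB';  -- must run before the first temp-table access in this session
```

Setting it per session like this avoids reserving the memory globally in postgresql.conf.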
Related (with instructions for \copy without superuser privileges):
- How to update selected rows with values from a CSV file in Postgres?