
# Import csv file beginning on specific line number

**otterdog2000** asked (2018-01-12 21:37:28Z):
I want to import a CSV file into a table, beginning on line 9 of the file. How do I specify this condition in PostgreSQL? The first 8 lines contain a bunch of irrelevant text describing the data below. This is a screenshot of the file imported into Excel. And this is the table in my db I am trying to insert the data into:

```sql
CREATE TABLE trader.weather (
    station     text NOT NULL,
    "timestamp" timestamp with time zone NOT NULL,
    temp        numeric(6,2),
    wind        numeric(6,2)
);
```
**Dan** answered:
It can't be done by `COPY` in PostgreSQL; you should strip the lines with an external tool or process before handing the file to Postgres. According to the manual, the only CSV-specific options you have are mostly QUOTE- or NULL-related:

```sql
COPY table_name [ ( column_name [, ...] ) ]
    FROM { 'filename' | STDIN }
    [ [ WITH ]
          [ BINARY ]
          [ OIDS ]
          [ DELIMITER [ AS ] 'delimiter' ]
          [ NULL [ AS ] 'null string' ]
          [ CSV [ HEADER ]
                [ QUOTE [ AS ] 'quote' ]
                [ ESCAPE [ AS ] 'escape' ]
                [ FORCE NOT NULL column_name [, ...] ] ] ]

COPY { table_name [ ( column_name [, ...] ) ] | ( query ) }
    TO { 'filename' | STDOUT }
    [ [ WITH ]
          [ BINARY ]
          [ OIDS ]
          [ DELIMITER [ AS ] 'delimiter' ]
          [ NULL [ AS ] 'null string' ]
          [ CSV [ HEADER ]
                [ QUOTE [ AS ] 'quote' ]
                [ ESCAPE [ AS ] 'escape' ]
                [ FORCE QUOTE { column_name [, ...] | * } ] ] ]
```

There are many ways to trim a CSV automatically before using it in PostgreSQL; you should check other options.
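One such external preprocessing step can be sketched in the shell with `tail`. This is a minimal sketch: the file name `weather.csv` and the sample rows are illustrative stand-ins for the asker's actual file.

```shell
# Create a small stand-in for the real weather.csv: 8 description lines
# followed by two data rows (names and values here are illustrative).
printf 'description line %s\n' 1 2 3 4 5 6 7 8  > weather.csv
printf 'KPDX,2018-01-01 00:00+00,4.5,12.0\n'   >> weather.csv
printf 'KPDX,2018-01-01 01:00+00,4.1,10.5\n'   >> weather.csv

# tail -n +9 prints the file starting at line 9, dropping the header text.
tail -n +9 weather.csv > weather_trimmed.csv

# The trimmed file can then be loaded client-side with psql, e.g.:
#   psql -d mydb -c "\copy trader.weather FROM 'weather_trimmed.csv' CSV"
```

Using psql's client-side `\copy` (rather than server-side `COPY`) also avoids needing superuser privileges, since the file is read by the client.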
It *can* be done with Postgres, just not with `COPY` directly. Use a temporary staging table like this:

```sql
CREATE TEMP TABLE target_tmp AS
TABLE target_tbl LIMIT 0;   -- create temp table with same columns as target table

COPY target_tmp FROM '/absolute/path/to/file' (FORMAT csv);

INSERT INTO target_tbl
TABLE target_tmp OFFSET 8;  -- start with line 9

DROP TABLE target_tmp;      -- optional, else it's dropped at end of session automatically
```

The skipped rows must be valid rows, too. Obviously, this is more expensive, which should not matter much with small to medium tables. It does matter with big tables; then you really should trim the surplus rows in the input file before importing. Make sure your `temp_buffers` setting is big enough to hold the temp table, to minimize the performance penalty.

Related (with instructions for `\copy` without superuser privileges): How to update selected rows with values from a CSV file in Postgres?