How to automatically import CSV files into postgres

Automatically import CSV files into PostgreSQL with a simple manual trigger. This workflow reads a CSV file, converts it into a spreadsheet format, and uploads the data to a specified PostgreSQL table, streamlining data management and reducing manual entry errors.

7/4/2025
4 nodes
Simple
Tags: manual, simple, readbinaryfile, spreadsheetfile, postgresql, files, storage, database, data
Categories:
Data Processing & Analysis, Manual Triggered, Simple Workflow
Integrations:
ReadBinaryFile, SpreadsheetFile, PostgreSQL

Target Audience

  • Data Analysts: Individuals who need to regularly import CSV data into PostgreSQL databases for analysis.
  • Developers: Those who want to automate data import processes to save time and reduce errors in manual data entry.
  • Business Intelligence Professionals: Users who require a seamless integration of data from CSV files into their reporting tools.
  • Database Administrators: Professionals looking for efficient methods to manage and import large datasets into PostgreSQL.
Problem Solved

This workflow addresses the challenge of manually importing CSV files into PostgreSQL databases. It automates the process, minimizing human error and saving time, especially when dealing with large datasets. The workflow enables users to effortlessly read data from CSV files, convert it into a suitable format, and upload it directly into the database, ensuring data integrity and accuracy.

Workflow Steps

  • Step 1: Manual Trigger - The workflow begins when the user clicks the 'execute' button, initiating the process.
  • Step 2: Read From File - The workflow reads the CSV file located at /tmp/t1.csv, converting its content into a binary format compatible with further processing.
  • Step 3: Convert To Spreadsheet - The binary data is then transformed into a spreadsheet format, making it easier to handle and prepare for database insertion.
  • Step 4: Postgres - Finally, the processed data is inserted into the specified PostgreSQL table (t1) within the public schema, utilizing automatic mapping for the id and name columns. The connection to the PostgreSQL database is managed through predefined credentials to ensure secure access.
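The four steps above can be sketched outside n8n as a short script. This is a minimal illustration, not the workflow itself: the sample CSV content stands in for /tmp/t1.csv, and sqlite3 is used as an in-memory stand-in for PostgreSQL so the sketch runs anywhere (a real import would use a PostgreSQL driver such as psycopg2 with the same INSERT shape).

```python
import csv
import io
import sqlite3

# Stand-in for reading /tmp/t1.csv (Step 2): sample data with the
# id and name columns the workflow maps automatically.
csv_content = "id,name\n1,alice\n2,bob\n"

# Step 3 equivalent: parse the raw content into structured rows.
rows = list(csv.DictReader(io.StringIO(csv_content)))

# Step 4 equivalent: insert the rows into table t1.
# sqlite3 is an assumption made only so this sketch is self-contained;
# the workflow targets PostgreSQL via its own credentials.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t1 (id INTEGER, name TEXT)")
conn.executemany(
    "INSERT INTO t1 (id, name) VALUES (?, ?)",
    [(int(r["id"]), r["name"]) for r in rows],
)
result = conn.execute("SELECT id, name FROM t1 ORDER BY id").fetchall()
print(result)  # → [(1, 'alice'), (2, 'bob')]
```

The same row-by-row mapping is what the Postgres node performs when its columns are set to automatic mapping.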
Customization Guide

  • Changing File Path: Users can modify the filePath parameter in the 'Read From File' node to point to a different CSV file as needed.
  • Adjusting Database Table: The table parameter in the 'Postgres' node can be updated to import data into a different table, ensuring compatibility with user-specific database structures.
  • Modifying Columns: Users can customize the columns mapping in the 'Postgres' node to include additional fields or change data types according to their database schema requirements.
  • Adding More Nodes: For more complex workflows, users can integrate additional nodes to perform data validation, transformation, or notifications after the import process.
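To make the customization points concrete, the fragment below sketches where those parameters live in an exported workflow. The field names mirror the ones mentioned above (filePath, schema, table, columns); the exact JSON shape of an n8n export varies by node version, so treat this as an illustrative assumption rather than a copy-paste config.

```json
{
  "nodes": [
    {
      "name": "Read From File",
      "parameters": { "filePath": "/tmp/t1.csv" }
    },
    {
      "name": "Postgres",
      "parameters": {
        "schema": "public",
        "table": "t1",
        "columns": { "mappingMode": "autoMapInputData" }
      }
    }
  ]
}
```

Pointing the import at a different file or table means editing only these values, leaving the rest of the workflow untouched.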