Tutorial • November 19, 2025 • 12 min read

How to Import CSV to Database: MySQL, PostgreSQL & MongoDB Guide

Complete step-by-step tutorials for importing CSV files into the most popular databases, with troubleshooting tips and best practices.

Importing CSV files into databases is one of the most common data migration tasks for developers and data analysts. Whether you're moving customer data from Excel into MySQL, loading analytics data into PostgreSQL, or importing product catalogs into MongoDB, understanding the right approach for your database saves time and prevents errors.

This guide covers the three most popular database systems—MySQL, PostgreSQL, and MongoDB—with practical examples, command syntax, and troubleshooting for common import errors. We'll also cover essential CSV data cleaning steps you should take before importing to ensure data integrity.

1. Preparing Your CSV File for Import

Cleaning your data before import prevents most common database errors

Before importing any CSV file into a database, you must clean and validate your data. Importing dirty data leads to failed imports, corrupted records, and database errors that are harder to fix after the fact. Use our CSV validator to check for errors before importing.

Essential Pre-Import Checklist

Validate column structure

Ensure every row has the same number of columns as your header. Use a CSV validator to detect structural issues.

Remove duplicates

Identify and remove duplicate rows before import to prevent unique key violations. Remove duplicates online.

Standardize date formats

Convert all dates to a consistent format (e.g., YYYY-MM-DD for MySQL). Fix date formats automatically.

Clean whitespace and special characters

Remove leading/trailing spaces, tabs, and hidden characters that can cause import failures. Clean CSV files online.

Verify UTF-8 encoding

Save your CSV with UTF-8 encoding to handle international characters correctly (ä, ñ, 中文, etc.).
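Several of these checks can be scripted with Python's standard library alone. This is a minimal sketch, not a full validator; the function name `clean_rows` and the sample data are illustrative:

```python
import csv
import io

def clean_rows(text):
    """Check column counts against the header, trim whitespace, drop duplicate rows."""
    reader = csv.reader(io.StringIO(text))
    header = next(reader)
    seen = set()
    cleaned = [header]
    for lineno, row in enumerate(reader, start=2):
        if len(row) != len(header):
            raise ValueError(f"row {lineno}: expected {len(header)} columns, got {len(row)}")
        row = [field.strip() for field in row]
        key = tuple(row)
        if key not in seen:       # keep only the first occurrence
            seen.add(key)
            cleaned.append(row)
    return cleaned

rows = clean_rows('email,country\n a@x.com ,DE\na@x.com,DE\nb@y.com,FR\n')
```

Here the whitespace-padded first row and its exact duplicate collapse into a single clean record.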

⚠️ Pro Tip

Always test imports with a small sample (first 100-1000 rows) before running the full import. This catches errors early and saves time on large datasets.
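On Unix, `head -n 1001 customers.csv > sample.csv` cuts such a sample in one line. A portable Python equivalent (filenames and the row limit are placeholders) might look like:

```python
import csv

def write_sample(src, dst, n=1000):
    """Copy the header plus the first n data rows of src into dst."""
    with open(src, newline='', encoding='utf-8') as fin, \
         open(dst, 'w', newline='', encoding='utf-8') as fout:
        reader = csv.reader(fin)
        writer = csv.writer(fout)
        writer.writerow(next(reader))  # header row
        for i, row in enumerate(reader):
            if i >= n:
                break
            writer.writerow(row)
```

Unlike `head`, this goes through the CSV parser, so quoted fields containing newlines are kept intact.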

2. Import CSV to MySQL

Using LOAD DATA INFILE for fast bulk imports

MySQL provides the LOAD DATA INFILE command for efficient CSV imports. This is significantly faster than INSERT statements for large datasets.

Step 1: Create Your Table

First, create a table that matches your CSV structure:

CREATE TABLE customers (
    id INT AUTO_INCREMENT PRIMARY KEY,
    email VARCHAR(255) NOT NULL,
    first_name VARCHAR(100),
    last_name VARCHAR(100),
    signup_date DATE,
    country VARCHAR(2),
    revenue DECIMAL(10,2)
);

Step 2: Import CSV with LOAD DATA INFILE

LOAD DATA INFILE '/path/to/customers.csv'
INTO TABLE customers
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(email, first_name, last_name, signup_date, country, revenue);

Important Parameters Explained

Parameter Description
FIELDS TERMINATED BY Column delimiter (usually comma , or tab \t)
ENCLOSED BY Text qualifier (usually double quote ")
LINES TERMINATED BY Row delimiter (\n for Unix, \r\n for Windows)
IGNORE 1 ROWS Skip header row (adjust number if you have multiple header rows)

💡 Using LOCAL for Client-Side Files

If you get a permission error (often caused by MySQL's secure_file_priv restriction), use LOAD DATA LOCAL INFILE to load the file from the client machine instead of the server. Enable local loading when connecting:

mysql --local-infile=1 -u username -p database_name
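If LOAD DATA is disabled entirely, batched parameterized INSERTs through any DB-API driver are a slower but portable fallback. This sketch uses Python's built-in SQLite driver so it runs anywhere; with MySQL you would swap in a driver such as mysql-connector-python, and the same pattern applies (table and sample data are assumptions):

```python
import csv
import io
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE customers (email TEXT, country TEXT)')

# In practice this would be open('customers.csv', newline='')
csv_data = io.StringIO('email,country\na@x.com,DE\nb@y.com,FR\n')
reader = csv.reader(csv_data)
next(reader)  # skip the header row

# executemany batches the inserts; the with-block wraps them in one transaction
with conn:
    conn.executemany('INSERT INTO customers VALUES (?, ?)', reader)

count = conn.execute('SELECT COUNT(*) FROM customers').fetchone()[0]
print(count)  # 2
```

One transaction per batch, rather than one per row, is what keeps this approach usable on larger files.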
3. Import CSV to PostgreSQL

Using COPY command for high-performance imports

PostgreSQL's COPY command is the fastest way to bulk-import CSV data. It's optimized for large datasets and handles complex data types well.

Step 1: Create Your Table

CREATE TABLE customers (
    id SERIAL PRIMARY KEY,
    email VARCHAR(255) NOT NULL,
    first_name VARCHAR(100),
    last_name VARCHAR(100),
    signup_date DATE,
    country CHAR(2),
    revenue NUMERIC(10,2)
);

Step 2: Import CSV with COPY

COPY customers(email, first_name, last_name, signup_date, country, revenue)
FROM '/path/to/customers.csv'
DELIMITER ','
CSV HEADER;

Alternative: Use \copy (psql command) to import from your local machine:

\copy customers(email, first_name, last_name, signup_date, country, revenue) FROM 'customers.csv' DELIMITER ',' CSV HEADER;

Handling NULL Values

Specify how NULL values are represented in your CSV:

COPY customers FROM '/path/to/customers.csv'
DELIMITER ','
CSV HEADER
NULL 'NULL';  -- Replace 'NULL' with empty string or your NULL marker

✅ Performance Tip

For very large imports (millions of rows), disable indexes before importing and rebuild them after:

DROP INDEX idx_customer_email;
-- Run your COPY command
CREATE INDEX idx_customer_email ON customers(email);
4. Import CSV to MongoDB

Using mongoimport for document-based imports

MongoDB stores data as documents (JSON-like), but mongoimport can convert CSV rows into documents automatically. Each CSV row becomes a MongoDB document.

Basic CSV Import

mongoimport --db myDatabase --collection customers \
  --type csv --headerline \
  --file customers.csv

Import with Authentication

mongoimport --uri "mongodb://username:password@localhost:27017/myDatabase" \
  --collection customers \
  --type csv --headerline \
  --file customers.csv

Specify Field Types

mongoimport does only basic type inference, so to assign types explicitly add the --columnsHaveTypes flag and annotate each field:

mongoimport --db myDatabase --collection customers \
  --type csv --columnsHaveTypes \
  --fields "email.string(),first_name.string(),signup_date.date(2006-01-02),revenue.double()" \
  --file customers.csv

The date() layout uses Go's reference date (2006-01-02). Note that --fields replaces --headerline, so mongoimport will treat the first line as data; remove the header row from the file first.

Upsert Mode (Update or Insert)

Update existing documents instead of creating duplicates:

mongoimport --db myDatabase --collection customers \
  --type csv --headerline \
  --upsert --upsertFields email \
  --file customers.csv

⚠️ Date Format Note

MongoDB expects ISO 8601 format for dates (YYYY-MM-DD or YYYY-MM-DDTHH:MM:SS). Make sure your CSV dates are standardized before import.
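A quick pre-processing pass can normalize non-ISO dates before running mongoimport; the source format assumed here (`%m/%d/%Y`, i.e. US-style dates) is an assumption about your file:

```python
from datetime import datetime

def to_iso(date_str, fmt='%m/%d/%Y'):
    """Convert a date string from fmt to ISO 8601 (YYYY-MM-DD)."""
    return datetime.strptime(date_str, fmt).strftime('%Y-%m-%d')

print(to_iso('12/31/2025'))  # 2025-12-31
```

The same helper works for MySQL and PostgreSQL DATE columns, which also expect YYYY-MM-DD.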

5. Common Import Errors & Solutions

Fix the most frequent database import issues

❌ "Data too long for column"

ERROR 1406: Data too long for column 'email' at row 47

Cause: Your data exceeds the column size defined in the table schema.

Solution:

  • Increase column size: ALTER TABLE customers MODIFY email VARCHAR(500);
  • Or trim data in CSV to fit existing column constraints

❌ "Duplicate entry for key 'PRIMARY'"

ERROR 1062: Duplicate entry 'john@example.com' for key 'email_unique'

Cause: Your CSV contains duplicate values for a column with UNIQUE constraint.

Solution:

  • Remove duplicate rows from the CSV before importing
  • Or use LOAD DATA INFILE ... IGNORE INTO TABLE to skip conflicting rows, or REPLACE INTO TABLE to overwrite them
❌ "Incorrect date value"

ERROR 1292: Incorrect date value: '12/31/2025' for column 'signup_date'

Cause: Date format in CSV doesn't match database expectations (usually YYYY-MM-DD).

Solution:

  • Standardize dates to YYYY-MM-DD in the CSV before importing
  • Or convert during import: read the column into a variable and apply SET signup_date = STR_TO_DATE(@signup_date, '%m/%d/%Y')
❌ "File not found" or Permission Denied

ERROR 29: File '/tmp/customers.csv' not found

Cause: Database server cannot access the file path (permission or path issue).

Solution:

  • MySQL: Use LOAD DATA LOCAL INFILE and enable --local-infile=1
  • PostgreSQL: Use \copy instead of COPY
  • Check file permissions: chmod 644 customers.csv

❌ "Invalid UTF-8 character"

ERROR: invalid byte sequence for encoding "UTF8"

Cause: CSV file has mixed or incorrect character encoding.

Solution:

  • Convert CSV to UTF-8: iconv -f ISO-8859-1 -t UTF-8 input.csv > output.csv
  • In Excel: Save As → CSV UTF-8 (not just "CSV")
  • Clean your CSV file to remove problematic characters
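If iconv isn't available, the same re-encoding can be done in Python; the source encoding (`latin-1` here) is an assumption you should verify before converting:

```python
def reencode(src, dst, src_encoding='latin-1'):
    """Re-save a file as UTF-8, decoding it from src_encoding."""
    with open(src, encoding=src_encoding) as fin, \
         open(dst, 'w', encoding='utf-8') as fout:
        fout.write(fin.read())
```

If the decode step raises UnicodeDecodeError, the guess about the source encoding was wrong; try another (e.g. cp1252) rather than silently dropping bytes.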
6. Best Practices for Database Imports

Follow these guidelines for reliable, fast imports

🧪 Test with Sample Data First

Import the first 100-1000 rows as a test. Catch errors early before processing millions of records.

💾 Backup Before Import

Always backup your database before large imports. Use mysqldump or pg_dump.

📊 Monitor Import Progress

For large files, check row count during import: SELECT COUNT(*) FROM table;

🔒 Use Transactions

Wrap imports in transactions so you can rollback if something goes wrong: BEGIN; ... COMMIT;

Disable Indexes Temporarily

For massive imports (millions of rows), drop indexes before importing and recreate them after; this can speed up the load several-fold.

Validate After Import

Check row count, NULL values, and data types after import: SELECT COUNT(*) FROM table; then spot-check with SELECT * FROM table LIMIT 10;
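These checks are easy to script. The sketch below runs against an in-memory SQLite table purely so it is self-contained; the same queries work unchanged on MySQL and PostgreSQL:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE customers (email TEXT, revenue REAL)')
conn.executemany('INSERT INTO customers VALUES (?, ?)',
                 [('a@x.com', 10.0), ('b@y.com', None)])

# 1. Does the row count match the CSV (minus its header row)?
row_count = conn.execute('SELECT COUNT(*) FROM customers').fetchone()[0]

# 2. Any unexpected NULLs in columns that should always be populated?
null_revenue = conn.execute(
    'SELECT COUNT(*) FROM customers WHERE revenue IS NULL').fetchone()[0]

print(row_count, null_revenue)  # 2 1
```

A mismatch in either number is your signal to investigate before the imported data goes into use.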

Complete Import Workflow

  1. Clean and validate your CSV file (validate structure, remove duplicates and whitespace)
  2. Create a database table with a matching schema and appropriate data types
  3. Test the import with the first 100-1000 rows to catch errors early
  4. Back up the database before the full import
  5. Run the full import using the appropriate command (LOAD DATA INFILE, COPY, mongoimport)
  6. Validate the results (row count, sample data, NULL values)

Ready to Import Your CSV?

Before importing to your database, make sure your CSV file is clean and validated. Use neatcsv to prepare your data in seconds. Plans from 9€/month.
