How-To · January 10, 2026 · 7 min read

How to Import CSV Files into MySQL: 5 Practical Approaches

Five proven ways to load CSVs into MySQL—from fast server-side imports to client-side commands, UI wizards, and production-ready scripts—plus where automation fits.

Igor Nikolic

Co-founder, FileFeed


MySQL is widely used for OLTP workloads, but CSV imports can get tricky: delimiters differ, encodings break, headers are inconsistent, and types misalign. Here are five practical ways to import CSVs into MySQL, from quick commands to repeatable pipelines.

1) LOAD DATA INFILE (server-side)

Fastest bulk option when the CSV is on the database server and you control file permissions.

  • Best when: file sits on the DB server, you have server access, you need maximum throughput.

LOAD DATA INFILE '/var/lib/mysql/import/users.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(email, first_name, last_name);

Note

The MySQL server process must be able to read the file: if `secure_file_priv` names a directory, the file has to live inside it, and the OS user running mysqld needs read access. CSVs exported from Windows or Excel often need LINES TERMINATED BY '\r\n'.
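
Before staging the file, you can check what the server allows:

-- A directory value restricts imports to that directory,
-- an empty value means no restriction, and NULL disables
-- server-side file loads entirely.
SHOW VARIABLES LIKE 'secure_file_priv';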

2) LOAD DATA LOCAL INFILE (client-side)

Use when the CSV is on your laptop/CI and you prefer to stream from the client without server file access.

  • Best when: local/CI imports, no server file placement, occasional/manual runs.

LOAD DATA LOCAL INFILE '/Users/igor/Downloads/users.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(email, first_name, last_name);

Enable `local_infile` on both the client and the server if it's disabled; client libraries and CI runners often ship with it off, so check their connection flags.
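
If the server rejects LOCAL loads, a minimal sketch of turning it on, assuming you have the privileges to set global variables:

-- On the server (requires SUPER or SYSTEM_VARIABLES_ADMIN):
SET GLOBAL local_infile = 1;

-- Confirm it took effect:
SHOW GLOBAL VARIABLES LIKE 'local_infile';

-- The client must opt in too, e.g. for the mysql CLI:
--   mysql --local-infile=1 --user=dbuser --password app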

3) MySQL Workbench Import Wizard (UI)

Workbench provides a guided CSV import: pick the table, set delimiters/encoding, map columns, and preview before loading.

  • Best when: non-technical teammates need a UI, small/medium imports, one-off tasks.
  • Slows down on very large files and isn't suited to repeatable automation.

4) mysqlimport CLI

A wrapper over LOAD DATA that loads each file into the table whose name matches the file (users.csv → users). Useful for quick CLI flows.

mysqlimport \
  --local \
  --fields-terminated-by="," \
  --fields-enclosed-by='"' \
  --lines-terminated-by='\n' \
  --ignore-lines=1 \
  --user=dbuser --password --host=localhost \
  app users.csv

Great for quick loads; if the file's column order differs from the table's, pass `--columns` (for example, `--columns=email,first_name,last_name`).

5) Custom Import Script (staging or Python)

For recurring or user-facing imports, add validation, type casting, logging, and retries. Two common patterns:

Option A: Staging Table + SQL

  • LOAD DATA (INFILE/LOCAL) into a staging table (all TEXT).
  • Validate/transform/cast into the final table with INSERT...SELECT.
  • Bad rows stay inspectable, and one bad value does not block the batch (see the sketch below).
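
A minimal sketch of the pattern, reusing the users table from earlier; the users_staging name and the email check are illustrative:

-- 1) Land everything as raw text so no row fails on a type error.
CREATE TABLE users_staging (
  email      TEXT,
  first_name TEXT,
  last_name  TEXT
);

LOAD DATA LOCAL INFILE '/Users/igor/Downloads/users.csv'
INTO TABLE users_staging
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

-- 2) Validate and cast on the way into the final table.
--    Rows that fail the WHERE clause (a naive email shape
--    check here) stay behind in staging for inspection.
INSERT INTO users (email, first_name, last_name)
SELECT TRIM(email), TRIM(first_name), TRIM(last_name)
FROM users_staging
WHERE email LIKE '_%@_%.__%';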

Option B: Python + mysql-connector

# pip install mysql-connector-python
import csv
import mysql.connector

conn = mysql.connector.connect(
    user="app", password="secret", host="localhost", database="app"
)
cur = conn.cursor()
try:
    # Read the CSV up front; for very large files, insert in
    # chunks instead of building one list in memory.
    with open("users.csv", newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)  # header row supplies the keys
        rows = [
            (row["email"], row["first_name"], row["last_name"])
            for row in reader
        ]

    # One parameterized statement, executed for every row.
    cur.executemany(
        "INSERT INTO users (email, first_name, last_name) "
        "VALUES (%s, %s, %s)",
        rows,
    )
    conn.commit()
finally:
    cur.close()
    conn.close()

  • Best when: imports recur, validation/transform rules matter, you need logging/retries/error reporting.

Choosing the Right Approach

  • One-time on server: LOAD DATA INFILE.
  • One-time from local/CI: LOAD DATA LOCAL INFILE or mysqlimport --local.
  • Manual UI: MySQL Workbench.
  • Developer UI: DBeaver/DataGrip (similar import wizards).
  • Recurring/production: staging tables or custom scripts/pipelines.

Where FileFeed Fits

When CSV imports become part of your product or onboarding, the problems compound: every customer sends a slightly different file, columns drift, validation rules grow, and retries and audit logs become requirements, so engineers end up as the import bottleneck. FileFeed lets you define validation, mapping, and transformations once, reuse them across customers and environments, and deliver clean, consistent MySQL data without bespoke glue code.

Final Thoughts

MySQL offers solid primitives for CSV import. If imports are rare, built-in tools are enough. For recurring, user-driven, or business-critical flows, a structured pipeline pays off. FileFeed removes the repetitive CSV handling so your team can keep MySQL imports reliable without rebuilding the same logic each time.
