
The Right Way to Import Large SQL Files into MySQL

When importing a large SQL file into MySQL, the mysql client's source command can be very slow and may run into performance problems or memory limits. Below are some suggestions for speeding up large-file imports.

1. Use LOAD DATA INFILE

LOAD DATA INFILE is MySQL's built-in bulk-loading mechanism, and it is much faster than inserting rows one at a time with individual INSERT statements.

LOAD DATA INFILE '/path/to/your/file.csv'
INTO TABLE your_table
FIELDS TERMINATED BY ',' 
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS; -- skip the header line if the file has one
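
Note that the server only reads files from the directory named by the secure_file_priv system variable (it reads from anywhere if the variable is empty, and refuses all files if it is NULL). Check it before running the import:

SHOW VARIABLES LIKE 'secure_file_priv';

If the file lives on the client machine rather than on the server, use LOAD DATA LOCAL INFILE instead, which requires local_infile to be enabled on both the client and the server.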

2. Adjust MySQL configuration

If the file is very large, you may need to adjust the configuration parameters of MySQL:

  • max_allowed_packet: Increase the maximum allowable packet size.
  • innodb_buffer_pool_size: Increase the size of the InnoDB buffer pool to improve performance.
  • bulk_insert_buffer_size: Increase the size of the batch insert buffer.
-- Note: SET GLOBAL does not accept size suffixes such as 1G; use byte values or expressions
SET GLOBAL max_allowed_packet = 1024 * 1024 * 1024;         -- 1 GB
SET GLOBAL innodb_buffer_pool_size = 2 * 1024 * 1024 * 1024; -- 2 GB (resizable online since MySQL 5.7.5)
SET GLOBAL bulk_insert_buffer_size = 256 * 1024 * 1024;     -- 256 MB
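
To make these settings survive a restart (and to be able to use size suffixes such as 1G), set them in the server's option file instead, for example:

[mysqld]
max_allowed_packet      = 1G
innodb_buffer_pool_size = 2G
bulk_insert_buffer_size = 256M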

3. Batch import

If the file is too large to load in one go, you can split it into several smaller files and import them one by one, as sketched below.
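
A minimal sketch using the Unix split utility; the filenames and the 100000-line chunk size are placeholders, and this assumes every SQL statement ends on its own line (splitting in the middle of a statement would break the import). With -p you will be prompted for the password once per chunk:

split -l 100000 big_dump.sql chunk_
for f in chunk_*; do
    mysql -u username -p database_name < "$f"
done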

4. Disable index and foreign key checking

Disabling index maintenance and foreign key checks while the data is loading can significantly improve performance. Note that ALTER TABLE ... DISABLE KEYS only affects non-unique indexes on MyISAM tables; an InnoDB-oriented variant is sketched after the example.

ALTER TABLE your_table DISABLE KEYS;
SET FOREIGN_KEY_CHECKS = 0;

-- Import data

ALTER TABLE your_table ENABLE KEYS;
SET FOREIGN_KEY_CHECKS = 1;
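
For InnoDB tables, DISABLE KEYS has no effect, so a rough equivalent is to relax the checks themselves. This is a sketch: disabling unique_checks is only safe if you are certain the input contains no duplicate keys, otherwise the table can end up with corrupt indexes.

SET unique_checks = 0;
SET FOREIGN_KEY_CHECKS = 0;

-- Import data

SET unique_checks = 1;
SET FOREIGN_KEY_CHECKS = 1;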

5. Use the command line tool

You can also use the mysql command-line client to import data:

mysql -u username -p database_name < /path/to/your/file.sql
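
For long-running imports it helps to see progress. One common trick, assuming the pv utility is installed, is to pipe the file through pv so you get a throughput figure and an ETA:

pv /path/to/your/file.sql | mysql -u username -p database_name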

6. Use mysqldump and mysqlimport

For moving large data sets between servers, you can export the data with mysqldump and re-import it with the mysql client. (mysqlimport, by contrast, loads delimited text files rather than SQL dumps, so it pairs with the output of mysqldump --tab.)

mysqldump -u username -p database_name > dump.sql
mysql -u username -p database_name < dump.sql

7. Use third-party tools

There are third-party tools that can help import large files, such as pt-archiver or mydumper/myloader.
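
A rough sketch of a parallel dump and reload with mydumper/myloader. Flag spellings can differ between versions, so verify with --help; the user, database, directory, and thread count here are placeholders:

mydumper -u username -p password -B database_name -o /backup/dir -t 4
myloader -u username -p password -B database_name -d /backup/dir -t 4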

8. Optimize the table structure

Ensure the table structure is reasonable, and avoid unnecessary indexes and constraints during the import: every secondary index must be maintained for each inserted row. One common pattern is sketched below.
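
A minimal sketch with a hypothetical index name and column: drop a secondary index before the bulk load and recreate it afterwards, so it is built once at the end instead of being updated for every inserted row:

ALTER TABLE your_table DROP INDEX idx_created_at;

-- Import data

ALTER TABLE your_table ADD INDEX idx_created_at (created_at);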

9. Use transactions

Wrapping multiple INSERT statements in a single transaction reduces the number of commits and improves performance.

START TRANSACTION;

INSERT INTO your_table VALUES (...);
INSERT INTO your_table VALUES (...);
-- More INSERT Statement

COMMIT;
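
Multi-row INSERT statements cut per-statement overhead even further and combine well with the transaction above (the value lists are placeholders, as in the example):

INSERT INTO your_table VALUES
  (...),
  (...),
  (...);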

10. Monitoring and Adjustment

During the import process, monitor system resource usage (such as CPU, memory, disk I/O) and adjust configurations as needed.
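
For example, from a second session you can watch the import's progress inside MySQL itself:

SHOW FULL PROCESSLIST;
SHOW GLOBAL STATUS LIKE 'Innodb_rows_inserted';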

Through these methods, large files can be imported into MySQL databases efficiently.

11. A complete solution

Combining the ideas above, a fast way to run a source import is:

  • Turn off binary logging and autocommit
  • Manually start a transaction and import the data
  • Commit the transaction
  • Re-enable binary logging and autocommit
set sql_log_bin=off;  -- skip writing the import to the binary log (session scope)
set autocommit=0;     -- stop committing after every statement
start transaction;
source document.sql;  -- run the large SQL file
commit;               -- one commit for the whole import
set sql_log_bin=on;
set autocommit=1;

This concludes the article on the right way to import large SQL files into MySQL. For more on importing large files into MySQL, search my earlier articles or continue browsing the related articles below. I hope you will continue to support me!