Method 1: Insert rows one by one. Performance is the worst; this approach is not recommended.
INSERT INTO Test(Id, Name) VALUES(newid(), '1');
INSERT INTO Test(Id, Name) VALUES(newid(), '2');
C# method
static void InsertOne()
{
    Console.WriteLine("Implemented by inserting records one by one");
    Stopwatch sw = new Stopwatch();
    using (SqlConnection conn = new SqlConnection(StrConnMsg)) // using will automatically Open and Close the connection
    {
        string sql = "INSERT INTO Test(Id,Name) VALUES(newid(),@p)";
        conn.Open();
        for (int i = 0; i < totalRow; i++)
        {
            using (SqlCommand cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@p", i);
                sw.Start();
                cmd.ExecuteNonQuery();
                Console.WriteLine(string.Format("Inserted one record, took {0} ms", sw.ElapsedMilliseconds));
            }
            if (i == getRow)
            {
                sw.Stop();
                break;
            }
        }
    }
    Console.WriteLine(string.Format(
        "Inserting {0} records: the first {4} took {1} ms; estimated total insert time is {2} ms ({3} minutes)",
        totalRow,
        sw.ElapsedMilliseconds,
        (sw.ElapsedMilliseconds / getRow) * totalRow,
        GetMinute(sw.ElapsedMilliseconds / getRow * totalRow),
        getRow));
}

static int GetMinute(long l)
{
    return (Int32)(l / 60000);
}
Conclusion
Inserting 1,000,000 records this way is estimated to take about 50 minutes, roughly 3 ms per record.
Method 2: Splicing SQL
INSERT INTO Test(Id, Name) VALUES(newid(), '1'),(newid(), '2')
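One caveat worth knowing when splicing: SQL Server caps a single INSERT … VALUES statement at 1000 row constructors, so each spliced batch must stay at or below that size. A minimal sketch, assuming the Test(Id, Name) table from Method 1:

```sql
-- Each INSERT may carry at most 1000 row constructors;
-- larger loads must be split across multiple statements.
INSERT INTO Test(Id, Name)
VALUES (newid(), '1'),
       (newid(), '2'),
       -- ... up to 1000 row constructors per statement
       (newid(), '1000');
```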
C# method
static void InsertFour()
{
    Console.WriteLine("Implemented by splicing a batch SQL insert");
    Stopwatch sw = new Stopwatch();
    using (SqlConnection conn = new SqlConnection(StrConnMsg)) // using will automatically Open and Close the connection
    {
        conn.Open();
        sw.Start();
        for (int j = 0; j < totalRow / getRow; j++)
        {
            StringBuilder sb = new StringBuilder();
            sb.Append("INSERT INTO Test(Id,Name) VALUES");
            using (SqlCommand cmd = new SqlCommand())
            {
                for (int i = 0; i < getRow; i++)
                {
                    sb.AppendFormat("(newid(),'{0}'),", i);
                }
                cmd.Connection = conn;
                cmd.CommandText = sb.ToString().TrimEnd(',');
                cmd.ExecuteNonQuery();
            }
        }
        sw.Stop();
        Console.WriteLine(string.Format("Inserted {0} records, total time {1} ms", totalRow, sw.ElapsedMilliseconds));
    }
}
Conclusion
Inserting 1,000,000 records this way is expected to take about 10 minutes.
Method 3: Use Bulk
BULK INSERT
  [ database_name . [ schema_name ] . | schema_name . ] [ table_name | view_name ]
    FROM 'data_file'
    [ WITH
      (
        [ [ , ] BATCHSIZE = batch_size ]       -- number of rows inserted into the table in a single transaction
        [ [ , ] CHECK_CONSTRAINTS ]            -- check all constraints on the target table or view during the bulk import;
                                               -- without this option, all CHECK and FOREIGN KEY constraints are ignored,
                                               -- and after the operation the table's constraints are marked as not trusted
        [ [ , ] CODEPAGE = { 'ACP' | 'OEM' | 'RAW' | 'code_page' } ]  -- code page of the data in the data file
        [ [ , ] DATAFILETYPE = { 'char' | 'native' | 'widechar' | 'widenative' } ]  -- data file type BULK INSERT uses for the import
        [ [ , ] FIELDTERMINATOR = 'field_terminator' ]  -- symbol that separates fields
        [ [ , ] FIRSTROW = first_row ]         -- number of the first row to load; defaults to the first row of the data file
        [ [ , ] FIRE_TRIGGERS ]                -- whether insert triggers fire
        [ [ , ] FORMATFILE = 'format_file_path' ]
        [ [ , ] KEEPIDENTITY ]                 -- identity values in the data file are used for the identity column
        [ [ , ] KEEPNULLS ]                    -- empty columns keep a null value instead of receiving the column's default
        [ [ , ] KILOBYTES_PER_BATCH = kilobytes_per_batch ]
        [ [ , ] LASTROW = last_row ]           -- number of the last row to load
        [ [ , ] MAXERRORS = max_errors ]       -- maximum number of syntax errors allowed in the data;
                                               -- beyond this number the bulk import operation is cancelled
        [ [ , ] ORDER ( { column [ ASC | DESC ] } [ ,...n ] ) ]  -- how the data in the data file is sorted
        [ [ , ] ROWS_PER_BATCH = rows_per_batch ]
        [ [ , ] ROWTERMINATOR = 'row_terminator' ]  -- symbol that separates rows
        [ [ , ] TABLOCK ]                      -- take a table-level lock for the duration of the bulk import
        [ [ , ] ERRORFILE = 'file_name' ]      -- file that collects rows that have format errors and cannot be converted to an OLE DB rowset
      )
    ]
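As a concrete illustration of the syntax above, a minimal BULK INSERT call could look like the sketch below. The file path and format options are hypothetical, chosen for illustration only, not taken from the original article:

```sql
-- Hypothetical example: bulk-load a tab-separated file into the Test table.
-- The path C:\data\test.txt is an assumption for illustration.
BULK INSERT Test
FROM 'C:\data\test.txt'
WITH
(
    FIELDTERMINATOR = '\t',  -- fields separated by tabs
    ROWTERMINATOR   = '\n',  -- one record per line
    BATCHSIZE       = 10000, -- commit every 10,000 rows
    TABLOCK                  -- table-level lock for faster loading
);
```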
C# method
static void InsertTwo()
{
    Console.WriteLine("Implemented with SqlBulkCopy (Bulk insert)");
    Stopwatch sw = new Stopwatch();
    DataTable dt = GetTableSchema();
    using (SqlConnection conn = new SqlConnection(StrConnMsg))
    {
        SqlBulkCopy bulkCopy = new SqlBulkCopy(conn);
        bulkCopy.DestinationTableName = "Test";
        bulkCopy.BatchSize = totalRow;
        conn.Open();
        sw.Start();
        for (int i = 0; i < totalRow; i++)
        {
            DataRow dr = dt.NewRow();
            dr[0] = Guid.NewGuid();
            dr[1] = string.Format("commodity {0}", i);
            dt.Rows.Add(dr);
        }
        if (dt != null && dt.Rows.Count != 0)
        {
            bulkCopy.WriteToServer(dt);
            sw.Stop();
        }
        Console.WriteLine(string.Format("Inserting {0} records took {1} ms ({2} minutes)",
            totalRow, sw.ElapsedMilliseconds, GetMinute(sw.ElapsedMilliseconds)));
    }
}

static DataTable GetTableSchema()
{
    DataTable dt = new DataTable();
    dt.Columns.AddRange(new DataColumn[]
    {
        new DataColumn("Id", typeof(Guid)),
        new DataColumn("Name", typeof(string))
    });
    return dt;
}
Conclusion
Inserting 1,000,000 records this way takes a little over 8 seconds.
Method 4: Use TVPs to insert data
Create the table type
-- Create the table-valued type
CREATE TYPE TestTemp AS TABLE (Id int, Name nvarchar(32))
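Before wiring it into C#, the new TestTemp type can be sanity-checked directly in T-SQL. A minimal sketch, assuming the Test(Id, Name) table used throughout this article:

```sql
-- Declare a variable of the table type, fill it, and insert from it
DECLARE @tvp TestTemp;
INSERT INTO @tvp (Id, Name) VALUES (1, N'a'), (2, N'b');

INSERT INTO Test (Id, Name)
SELECT tt.Id, tt.Name FROM @tvp AS tt;
```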
C# method
static void TableValuedToDB(DataTable dt)
{
    SqlConnection sqlconn = new SqlConnection("server=.;database=TestDB;user=sa;password=123456;");
    const string TSqlStatement =
        "INSERT INTO Test (Id, Name)" +
        " SELECT tt.Id, tt.Name" +
        " FROM @TestTvp AS tt";
    SqlCommand cmd = new SqlCommand(TSqlStatement, sqlconn);
    SqlParameter catParam = cmd.Parameters.AddWithValue("@TestTvp", dt);
    catParam.SqlDbType = SqlDbType.Structured;
    catParam.TypeName = "dbo.TestTemp";
    try
    {
        sqlconn.Open();
        if (dt != null && dt.Rows.Count != 0)
        {
            cmd.ExecuteNonQuery();
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine("error > " + ex.Message);
    }
    finally
    {
        sqlconn.Close();
    }
}

static void TVPsInsert()
{
    Console.WriteLine("Implemented with table-valued parameters (TVPs)");
    Stopwatch sw = new Stopwatch();
    for (int i = 0; i < 10; i++)
    {
        // The TestTemp type declares Id as int, so build a matching DataTable here
        DataTable dt = new DataTable();
        dt.Columns.Add("Id", typeof(int));
        dt.Columns.Add("Name", typeof(string));
        for (int j = i * 100; j < (i + 1) * 100; j++)
        {
            DataRow r = dt.NewRow();
            r[0] = j;
            r[1] = string.Format("{0}", i * j);
            dt.Rows.Add(r);
        }
        sw.Start();
        TableValuedToDB(dt);
        sw.Stop();
        Console.WriteLine(string.Format("Elapsed Time is {0} Milliseconds", sw.ElapsedMilliseconds));
    }
    Console.ReadLine();
}
Conclusion
Inserting 1,000,000 records this way takes a little over 11 seconds.
This concludes the detailed walkthrough of four ways to batch-insert data into SQL Server from C#. For more on inserting data into SQL Server from C#, see my other related articles.