Suppose I have a table like this and I want to insert a large amount of data into it:
CREATE TABLE IF NOT EXISTS `user_info` (
  `id` int(11) NOT NULL AUTO_INCREMENT COMMENT 'Auto-increment primary key',
  `name` varchar(255) NOT NULL DEFAULT '' COMMENT 'Full name',
  `age` int(11) NOT NULL DEFAULT '0' COMMENT 'Age',
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='User information table';
Batch insertion
Method 1: use a for loop to insert
When inserting a small amount of data into MySQL, we usually just use a loop and run one INSERT per row:
<?php
$arr = [
    [
        'name' => 'testname1',
        'age' => 18,
    ],
    [
        'name' => 'testname2',
        'age' => 19,
    ],
    [
        'name' => 'testname3',
        'age' => 18,
    ],
];

$servername = "localhost";
$port = 3306;
$username = "username";
$password = "password";
$dbname = "mytestdb";

// Create the connection
$conn = new mysqli($servername, $username, $password, $dbname, $port);

// Check the connection
if ($conn->connect_error) {
    die("connect failed: " . $conn->connect_error);
}

$costBegin = microtime(true);

foreach ($arr as $item) {
    $sql = sprintf("INSERT INTO user_info (name, age) VALUES ('%s', %d);", $item['name'], (int)$item['age']);
    if ($conn->query($sql) === TRUE) {
        echo "insert success";
    } else {
        echo "Error: " . $sql . "<br>" . $conn->error;
    }
}

$costEnd = microtime(true);
$cost = round($costEnd - $costBegin, 3);
var_dump($cost);

$conn->close();
If we want to insert a large amount of data in batches, the for-loop approach above still works, but it takes a long time. Timing that for-loop insert for a small data set versus a large one (number of rows vs. time, in seconds) makes the gap obvious.
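To run that kind of comparison yourself, the three-element $arr above can be swapped for a generated array. This is only a minimal sketch; the makeTestRows helper and the 10000-row count are hypothetical and not part of the original benchmark.

<?php
// Hypothetical helper: build $count test rows shaped like $arr above.
function makeTestRows(int $count): array
{
    $rows = [];
    for ($i = 1; $i <= $count; $i++) {
        $rows[] = [
            'name' => 'testname' . $i,
            'age'  => 18 + ($i % 10),
        ];
    }
    return $rows;
}

// Example: benchmark the for-loop insert with 10000 rows instead of 3.
$arr = makeTestRows(10000);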
Method 2: use a multi-row INSERT statement to merge the inserts
MySQL lets you combine several rows into one INSERT statement, for example:
INSERT INTO user_info (name, age) VALUES ('name1', 18), ('name2', 19);

This statement inserts two rows at once. The PHP version of the merged insert looks like this:

<?php
$arr = [
    [
        'name' => 'testname1',
        'age' => 18,
    ],
    [
        'name' => 'testname2',
        'age' => 19,
    ],
    [
        'name' => 'testname3',
        'age' => 18,
    ],
    // more rows omitted here
    // ......
];

$servername = "localhost";
$port = 3306;
$username = "username";
$password = "password";
$dbname = "mytestdb";

// Create the connection
$conn = new mysqli($servername, $username, $password, $dbname, $port);

// Check the connection
if ($conn->connect_error) {
    die("connect failed: " . $conn->connect_error);
}

$costBegin = microtime(true);

if (!empty($arr)) {
    $sql = "INSERT INTO user_info (name, age) VALUES ";

    foreach ($arr as $item) {
        $sql .= sprintf("('%s', %d),", $item['name'], (int)$item['age']);
    }

    // Remove the trailing comma and add the closing semicolon
    $sql = rtrim($sql, ',');
    $sql .= ';';

    if ($conn->query($sql) !== TRUE) {
        echo "Error: " . $sql . "<br>" . $conn->error;
    }
}

$costEnd = microtime(true);
$cost = round($costEnd - $costBegin, 3);
var_dump($cost);

$conn->close();
Looking again at the timing comparison for a small data set versus a large one, the merged INSERT saves a great deal of time compared with the plain for-loop insert, and the effect is very noticeable (times in seconds).
If the array is too large and you want to reduce the risk of SQL errors, you can also use array_chunk to cut the array into blocks of a specified size and then build and run a merged INSERT for each block, as in the sketch below.
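A minimal sketch of that approach, reusing the $conn connection and the $arr array from the example above; the chunk size of 1000 is an arbitrary illustration, not a recommendation from the original post.

<?php
// Assumes $conn (mysqli) and $arr are set up as in the examples above.
$chunkSize = 1000; // arbitrary example size

// Split the rows into chunks and run one merged INSERT per chunk.
foreach (array_chunk($arr, $chunkSize) as $chunk) {
    $values = [];
    foreach ($chunk as $item) {
        $values[] = sprintf("('%s', %d)", $item['name'], (int)$item['age']);
    }

    $sql = "INSERT INTO user_info (name, age) VALUES " . implode(',', $values) . ';';

    if ($conn->query($sql) !== TRUE) {
        echo "Error: " . $conn->error;
    }
}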