I am trying to get a sequential number table from 1 to 20 million (or 0 to 20 million).
I am rather awestruck at how difficult it's been to get a MySQL-compatible solution.
Adopting psadac's answer of using LOAD DATA INFILE,
and applying the idea of bulk insertion to fwrite:
$fh = fopen("data_num.txt", 'a') or die("can't open file");
$num_string = "";
$i = 1;
while ($i <= 20000000) {
    $num_string .= "$i\n";
    // flush the buffer to disk once per million numbers instead of on every iteration
    if ($i % 1000000 == 0) {
        fwrite($fh, $num_string);
        $num_string = "";
    }
    $i += 1;
}
fclose($fh);
// $dbh is an existing PDO connection to the target MySQL database
$dbh->beginTransaction();
$query = "LOAD DATA INFILE '" . addslashes(realpath("data_num.txt")) . "' INTO TABLE numbers LINES TERMINATED BY '\n';";
$sth = $dbh->prepare($query);
$sth->execute();
$dbh->commit();
unlink("data_num.txt"); // remove the temporary file once the load is done
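(The LOAD DATA INFILE above assumes the numbers table already exists. A minimal sketch of creating it, assuming a single unsigned integer column; the column name n is my own placeholder, not from the original setup:)

// Hypothetical table definition; the column name `n` is an assumption.
$dbh->exec("CREATE TABLE IF NOT EXISTS numbers (n INT UNSIGNED NOT NULL PRIMARY KEY)");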
I had to use addslashes because I am on a Windows environment: realpath() returns the path with backslashes, which need to be escaped inside the SQL string literal.
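An alternative that avoids escaping altogether is converting the path to forward slashes, which MySQL accepts on Windows; a small sketch assuming the same data_num.txt file:

// Forward slashes are accepted by MySQL on Windows, so no backslash escaping is needed.
$path = str_replace('\\', '/', realpath("data_num.txt"));
$query = "LOAD DATA INFILE '" . $path . "' INTO TABLE numbers LINES TERMINATED BY '\n';";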
It's interesting to note that the bulk technique of writing to the file only 20 times (once per million rows) instead of 20 million times brought the run time down to ~10 seconds from ~75 seconds. Using string concatenation instead of pushing values into an array and imploding it was almost twice as fast.
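For reference, the array-and-implode variant I compared against looked roughly like this (an illustrative sketch, not the exact benchmark code; the file name is a placeholder):

$fh = fopen("data_num_alt.txt", 'w') or die("can't open file");
$nums = array();
for ($i = 1; $i <= 20000000; $i++) {
    $nums[] = $i;
    if ($i % 1000000 == 0) {
        // implode the buffered values into one string per chunk, then write
        fwrite($fh, implode("\n", $nums) . "\n");
        $nums = array();
    }
}
fclose($fh);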