I am trying to get a sequential number table from 1 to 20 million (or 0 to 20 million).
I am rather awestruck at how difficult it's been to get a MySQL-compatible solution.
Adopting psadac's answer of using LOAD DATA INFILE, and applying the idea of bulk insertion to the fwrite calls:
$fh = fopen("data_num.txt", 'a') or die("can't open file");
$num_string = "";
$i = 1;
while ($i <= 20000000) {
    $num_string .= "$i\n";
    // flush the buffer to disk every 1,000,000 lines (20 writes in total)
    if ($i % 1000000 == 0) {
        fwrite($fh, $num_string);
        $num_string = "";
    }
    $i += 1;
}
fclose($fh);
$dbh->beginTransaction();
// load the whole generated file into the numbers table in a single statement
$query = "LOAD DATA INFILE '" . addslashes(realpath("data_num.txt")) . "' INTO TABLE numbers LINES TERMINATED BY '\n';";
$sth = $dbh->prepare($query);
$sth->execute();
$dbh->commit();
unlink("data_num.txt");
I had to use addslashes because I am on a Windows environment: the backslashes in the Windows path would otherwise be treated as escape characters in the LOAD DATA INFILE statement.
It's interesting to note that buffering the output and writing to the file only 20 times (once per 1,000,000 lines) took ~10 seconds, compared to ~75 seconds when calling fwrite 20 million times, once per line. Using string concatenation instead of pushing values into an array and imploding it was almost twice as fast; the array variant is sketched below for comparison.
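For reference, this is roughly what the array/implode variant I compared against looked like. It is only a minimal sketch; the exact code I timed is not shown here, so treat it as an approximation:

$fh = fopen("data_num.txt", 'a') or die("can't open file");
$lines = array();
$i = 1;
while ($i <= 20000000) {
    $lines[] = $i;  // collect values in an array instead of concatenating a string
    if ($i % 1000000 == 0) {
        // join and flush every 1,000,000 lines
        fwrite($fh, implode("\n", $lines) . "\n");
        $lines = array();
    }
    $i += 1;
}
fclose($fh);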
In response to Devon Bernard's answer, I decided to approach it with PDO MySQL in PHP using the concept of just a few queries. At first I tried it with one big query, but PHP ran out of memory with default settings, so I tweaked it to run a query every 100,000 rows. Even if you allocate enough memory to hold the whole query, there is no significant improvement.
$i = 1;
$inserts = array();
while ($i <= 20000000) {
    $inserts[] = "($i)";
    // run one multi-row INSERT per 100,000 values to keep memory usage low
    if ($i % 100000 == 0) {
        $dbh->beginTransaction();
        $query = "INSERT INTO numbers(i) VALUES " . implode(',', $inserts) . ";";
        $sth = $dbh->prepare($query);
        $sth->execute();
        $dbh->commit();
        $inserts = array();
    }
    $i += 1;
}
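Both snippets assume the numbers table already exists with a single integer column i (the column name is taken from the INSERT snippet; the exact column type I used is not shown, so the definition below is only an assumption):

// minimal sketch of the assumed table; adjust the column type and keys to your needs
$dbh->exec("CREATE TABLE IF NOT EXISTS numbers (i INT UNSIGNED NOT NULL PRIMARY KEY)");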