I have tried three different approaches to bulk-inserting many rows (over 500) into a SQLite table. Counter-intuitively, approach 3 below is the fastest. I would have assumed approach 2 would be the fastest because it uses a prepared statement handle, but my third approach, inserting 500 rows per statement (500 is apparently the maximum SQLite allows in a single multi-row insert), is much faster.
Am I missing anything? Should I continue with the 500 approach, or is there another way?
Note: the code below is not my actual code; I wrote it here for example purposes and it has not been tested.
use strict;
use warnings;
use DBI;
my $dsn = "DBI:SQLite:dbname=db";
my $dbh = DBI->connect($dsn, "", "", { RaiseError => 1 });
open my $data_file, "<", "data.txt" or die "Can't open data.txt: $!"; # 3 tab-separated integer fields per line
APPROACH_1:
while (<$data_file>) {
    chomp;  # strip the newline so the last field is a clean integer
    my @fields = split /\t/;
    my $insert = join ",", @fields;
    $dbh->do("insert into table values ($insert)");
}
APPROACH_2:
my $sql = "insert into table values (?,?,?)";
my $sth = $dbh->prepare($sql);
while (<$data_file>) {
    chomp;
    my @fields = split /\t/;
    $sth->execute(@fields);
}
APPROACH_3:
my @inserts;
while (<$data_file>) {
    chomp;
    my @fields = split /\t/;
    # parenthesize join(...) explicitly; without it, precedence makes
    # '('.join ",", @fields.')' concatenate ')' onto the array instead
    my $insert = '(' . join(",", @fields) . ')';
    push @inserts, $insert;
    if (@inserts == 500) {
        my $insert_500 = join ",\n", @inserts;
        $dbh->do("insert into table values $insert_500");
        @inserts = ();
    }
}
# insert leftovers (any rows remaining after the last full batch of 500)
if (@inserts) {
    my $insert_tail = join ",\n", @inserts;
    $dbh->do("insert into table values $insert_tail");
}
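To show the batching logic of APPROACH_3 (including the leftover flush) in isolation, here is a self-contained sketch that builds the SQL strings instead of executing them, so it runs without a database. The table name `t`, the helper `batch_inserts`, and the batch size of 3 are placeholders of mine for illustration, not part of the original code:

```perl
use strict;
use warnings;

# Build multi-row insert statements in batches, flushing any
# leftover rows (fewer than $batch_size) at the end.
sub batch_inserts {
    my ($batch_size, @rows) = @_;
    my (@inserts, @sql);
    for my $fields (@rows) {
        push @inserts, '(' . join(',', @$fields) . ')';
        if (@inserts == $batch_size) {
            push @sql, "insert into t values " . join(',', @inserts);
            @inserts = ();
        }
    }
    # flush the leftovers
    push @sql, "insert into t values " . join(',', @inserts) if @inserts;
    return @sql;
}

# batch size 3 here for illustration; the real code uses 500
print "$_\n" for batch_inserts(3, [1,2,3], [4,5,6], [7,8,9], [10,11,12]);
```

With four rows and a batch size of 3, this emits one full three-row statement and a one-row leftover statement.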