I'm trying to load a local DB with SQLite on a Red Hat Linux server. I have a C program that loads the database from a very large file, splitting out the columns. The bad news is that the SQLite library is not installed on the machine and I won't have permission to install libsqlite3-dev (according to this), so I can only use SQLite through bash or Python.
Which of the following options would be faster?
- Split the columns in my C program, then execute an insert for each row like this (see the first sketch after the list):
system("echo 'insert into t values(1,2);' | sqlite3 mydb.db");
- Split the columns in my C program and write the inserts to a temp file; every 500,000 rows, run the script like this, then truncate the temp file and keep loading (see the second sketch after the list):
system("sqlite3 mydb.db < temp.sql");
- Split the columns in my C program with a delimiter between them, save the result to a temp file, and import it like this (see the third sketch after the list):
.separator '@'
.import temp.txt t
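
For concreteness, here is roughly what option 1 looks like in C. The table t and the two integer columns are placeholders for my real split fields; the point is that every row pays for a shell, an echo, and a fresh sqlite3 process:

#include <stdio.h>
#include <stdlib.h>

/* Option 1: one shell + one sqlite3 process per row. */
static void insert_row(int a, int b)
{
    char cmd[256];
    snprintf(cmd, sizeof cmd,
             "echo 'insert into t values(%d,%d);' | sqlite3 mydb.db", a, b);
    system(cmd);   /* forks a shell, an echo and a sqlite3 for this one row */
}

int main(void)
{
    insert_row(1, 2);   /* in reality this runs once per input line */
    return 0;
}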
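
Option 2 in the same sketch form, with scanf standing in for my real column splitting and most error handling omitted. Wrapping each batch in a transaction matters here: without it, SQLite commits (and syncs to disk) after every single INSERT:

#include <stdio.h>
#include <stdlib.h>

#define BATCH 500000L   /* rows per temp.sql flush */

int main(void)
{
    FILE *sql = fopen("temp.sql", "w");
    long n = 0;
    int a, b;   /* placeholders for the two split columns */

    if (!sql) return 1;
    fputs("BEGIN TRANSACTION;\n", sql);
    while (scanf("%d %d", &a, &b) == 2) {
        fprintf(sql, "insert into t values(%d,%d);\n", a, b);
        if (++n % BATCH == 0) {   /* flush every 500,000 rows */
            fputs("COMMIT;\n", sql);
            fclose(sql);
            system("sqlite3 mydb.db < temp.sql");
            sql = fopen("temp.sql", "w");   /* truncate and start a new batch */
            fputs("BEGIN TRANSACTION;\n", sql);
        }
    }
    fputs("COMMIT;\n", sql);
    fclose(sql);
    system("sqlite3 mydb.db < temp.sql");   /* final partial batch */
    return 0;
}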
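
And option 3: write the data once with '@' between the columns, then let the sqlite3 shell do the import. .separator and .import are shell dot-commands, not SQL, so this sketch feeds them to sqlite3 on stdin via a small command file (import.sql is just a name I made up):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    FILE *txt = fopen("temp.txt", "w");
    FILE *cmd;
    int a, b;   /* placeholders for the two split columns */

    if (!txt) return 1;
    while (scanf("%d %d", &a, &b) == 2)
        fprintf(txt, "%d@%d\n", a, b);   /* one row per line, '@' between columns */
    fclose(txt);

    cmd = fopen("import.sql", "w");   /* hypothetical helper script name */
    if (!cmd) return 1;
    fputs(".separator '@'\n", cmd);
    fputs(".import temp.txt t\n", cmd);
    fclose(cmd);
    system("sqlite3 mydb.db < import.sql");   /* dot-commands read from stdin */
    return 0;
}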