Wednesday, December 23, 2015

How to speed up the SQLite insert operation for large data

I want to insert data into an sqlite3 database from a Linux shell script. The code below works and inserts the data, but it takes a very long time when the data runs to thousands of rows. How can I reduce the time it takes to insert large amounts of data?

I have googled and found mentions of "execute and commit", but I don't know how to implement that here. Any ideas, please?

Find all the files in pkgdir, then strip the leading "./" with sed:

        # Note: word splitting on the find output means filenames
        # containing spaces will break this array.
        local files=( $( find -L -type f -print | sed 's|^\./||' ) )

        # Insert package files into the sqlite3 db.
        # Total number of files in the array:
        #local arraylength=${#files[@]}
        #echo "Total files in array: ${arraylength}"
        colored_echo "Inserting package files into sqlite database..." blue

        for file in "${files[@]}"; do
            #echo "Package file: ${file}"
            sqlite3 "${dbpath}/${dbname}" "INSERT INTO files (files) VALUES ('${file}');"
        done
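The slowness comes from invoking sqlite3 once per row: each invocation opens the database and commits its own implicit transaction. A common remedy is to generate all the INSERT statements, wrap them in a single BEGIN/COMMIT transaction, and pipe the whole script into one sqlite3 invocation. The sketch below illustrates the idea under placeholder names ("demo.sqlite", a "pkg" directory) that are not from the original script:

```shell
#!/bin/sh
# Sketch: bulk insert in ONE transaction via a single sqlite3 call,
# instead of one sqlite3 process (and one commit) per row.
# "demo.sqlite" and the pkg/ layout are hypothetical placeholders.
set -e
dir=$(mktemp -d)
cd "$dir"
mkdir -p pkg/sub
echo a > pkg/one.txt
echo b > pkg/sub/two.txt

db="demo.sqlite"
sqlite3 "$db" "CREATE TABLE IF NOT EXISTS files (files TEXT);"

{
    echo "BEGIN TRANSACTION;"
    # Stream filenames line by line (read -r copes with spaces),
    # doubling any single quotes so they are safe inside a SQL
    # string literal.
    find -L pkg -type f -print | while IFS= read -r f; do
        esc=$(printf '%s' "$f" | sed "s/'/''/g")
        echo "INSERT INTO files (files) VALUES ('$esc');"
    done
    echo "COMMIT;"
} | sqlite3 "$db"

sqlite3 "$db" "SELECT COUNT(*) FROM files;"
```

Because all rows land inside one transaction, SQLite syncs to disk once rather than once per row, which is where most of the time went. The same idea can be dropped into the loop above by echoing the INSERT statements into a pipe to a single `sqlite3 "${dbpath}/${dbname}"` call instead of invoking sqlite3 inside the loop.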
