I am using an SQLite database to store data. My Python program has a thread pool containing 5 threads. I was creating a single database connection and sharing it among all 5 threads, but sometimes it threw an exception that was not caught by any sqlite3 exception handler or even a generic exception handler, and my Python script was killed. While searching for a solution I came across How to share single SQLite connection in multi-threaded Python application
So I created a separate connection per job, as follows:
class ProcessJob(object):
    ....
    def process_job(self):
        job = queue.get()
        if job == 'xyz':
            # Note: the with block already commits on success, so the
            # explicit commit() below is redundant (but harmless).
            with sqlite3.connect(database_path, check_same_thread=False, timeout=10) as db_conn:
                db_conn.execute("insert query on table ABC")
                db_conn.commit()
        elif job == 'pqr':
            with sqlite3.connect(database_path, check_same_thread=False, timeout=10) as db_conn:
                db_conn.execute("update query on table ABC")
                db_conn.commit()
        elif job == 'mno':
            with sqlite3.connect(database_path, check_same_thread=False, timeout=10) as db_conn:
                db_conn.execute("insert query on table FOO")
                db_conn.commit()
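One caveat worth knowing about the code above (this is documented sqlite3 behaviour, not specific to my program): using a connection as a context manager wraps a transaction, so it commits on success and rolls back on an exception, but it does NOT close the connection. A small demonstration:

```python
import sqlite3

db_conn = sqlite3.connect(":memory:")
with db_conn:  # commits automatically when the block exits cleanly
    db_conn.execute("CREATE TABLE abc (id INTEGER)")
    db_conn.execute("INSERT INTO abc VALUES (1)")

# Still usable here: the with block committed but left the connection open.
rows = db_conn.execute("SELECT id FROM abc").fetchall()
db_conn.close()  # closing remains the caller's responsibility
```

So each per-job connection above should still be closed explicitly (or wrapped in `contextlib.closing`) to avoid leaking one connection per processed job.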
class MyThread(threading.Thread):
    ....
    process_job_obj = ProcessJob()
    def run(self):
        while True:
            try:
                self.process_job_obj.process_job()
            except Exception as e:
                logger.exception('Exception: %s', e)
def main():
    for i in range(5):
        trd = MyThread()
        trd.start()

if __name__ == "__main__":
    main()
So, is this the right approach, or is there a flaw or any chance of the Python script being stopped or killed?