A little background: this is a script designed to narrow down a large data set (3+ GB files). I have a series of SQL queries that create temporary tables, which are then used to insert into and delete from other tables.
Here is what the first few queries look like:
Query #1
create table clash as
select *
from
StallConnected
group by Store, Stall, StartTime
having
count(*) > 1;
Query #2
create table OverlappingStarts as
select A.*
from
StallConnected as A
join
clash as B
on
A.Store = B.Store
and
A.Stall = B.Stall
and
A.StartTime = B.StartTime
order by
A.Store, A.Stall, A.StartTime;
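To make the setup concrete, here is a minimal, self-contained reproduction of the two queries against an in-memory database. The StallConnected schema below is an assumption (the real table presumably has more columns), but the clash/OverlappingStarts logic is taken directly from the queries above:

```python
import sqlite3

# Hypothetical schema for StallConnected; the real table's columns are unknown
# beyond Store, Stall, and StartTime.
db = sqlite3.connect(":memory:")
db.executescript("""
create table StallConnected (Store, Stall, StartTime, EndTime);
insert into StallConnected values
    (1, 'A', 100, 110),  -- same Store/Stall/StartTime as the next row: a clash
    (1, 'A', 100, 120),
    (2, 'B', 200, 210);  -- unique key, no clash
""")

# Query #1 and Query #2 from the post, verbatim in structure.
db.executescript("""
create table clash as
select * from StallConnected
group by Store, Stall, StartTime
having count(*) > 1;

create table OverlappingStarts as
select A.* from StallConnected as A
join clash as B
  on A.Store = B.Store and A.Stall = B.Stall and A.StartTime = B.StartTime
order by A.Store, A.Stall, A.StartTime;
""")

print(db.execute("select count(*) from clash").fetchone()[0])              # 1
print(db.execute("select count(*) from OverlappingStarts").fetchone()[0])  # 2
```

On this toy data, the duplicated (1, 'A', 100) key produces one row in clash and both of its source rows in OverlappingStarts.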
Now on to the meat of the issue. I'm executing these queries in sequence using a db connection in python's sqlite3 module on a single thread. Here's the code:
for i, val in enumerate(queries):
    print "Step " + str(i + 1) + " of " + steps
    db.executescript(val)
    db.commit()
I know that executescript() will issue a COMMIT for any pending transaction before the script is executed. What happens, though, is that the first query runs just fine, but the second query simply hangs. No exceptions, nothing.
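For what it's worth, that implicit commit is observable through the connection's in_transaction attribute; a small sketch:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("create table t (x)")
db.execute("insert into t values (1)")  # DML opens an implicit transaction
print(db.in_transaction)   # True: the insert is still uncommitted

# Under the default transaction-control behaviour, executescript() issues a
# COMMIT for the pending transaction before running the script.
db.executescript("create table u (y);")
print(db.in_transaction)   # False: the pending insert was committed
```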
I know it can't be a lock timeout, since this is all running on a single thread, and it doesn't throw an exception either (obviously, it just hangs). I can tell it's hanging because the db-journal file stays at only 2 KB and never grows.
What I've tried:
- Committing after every statement
- Closing/reopening the connection
- Using execute() instead of executescript()
- Using a cursor object instead of calling execute() directly on the db connection
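For reference, the statement-at-a-time variant (one of the alternatives above) looks roughly like this; the queries list here is a stand-in for the real CREATE TABLE ... AS statements:

```python
import sqlite3

# Stand-in statements; the real script runs the queries shown earlier.
queries = [
    "create table t1 (x);",
    "insert into t1 values (1);",
]

db = sqlite3.connect(":memory:")
cur = db.cursor()
for i, sql in enumerate(queries):
    print("Step %d of %d" % (i + 1, len(queries)))
    cur.execute(sql)  # execute() runs exactly one statement per call
    db.commit()       # explicit commit after each statement
```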
Any thoughts? Am I doing anything inherently wrong? Is there a Windows file-locking issue I don't know about?