I am trying to store an (admittedly very large) BLOB in an SQLite database using SQLAlchemy.
For the MCVE, I use ubuntu-14.04.2-desktop-amd64.iso as the BLOB to store. Its size:
$ ls -lh ubuntu-14.04.2-desktop-amd64.iso
... 996M ... ubuntu-14.04.2-desktop-amd64.iso
The code:
from pathlib import Path

from sqlalchemy import (Column, Integer, String, BLOB, create_engine)
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
from sqlite3 import dbapi2 as sqlite

SA_BASE = declarative_base()


class DbPath(SA_BASE):
    __tablename__ = 'file'

    pk_path = Column(Integer, primary_key=True)
    path = Column(String)
    data = Column(BLOB, default=None)  # the file contents go here


def create_session(db_path):
    # Create the schema and return a session bound to the SQLite file.
    db_url = 'sqlite+pysqlite:///{}'.format(db_path)
    engine = create_engine(db_url, module=sqlite)
    SA_BASE.metadata.create_all(engine)
    session = sessionmaker(bind=engine)
    return session()


if __name__ == '__main__':
    pth = Path('/home/user/Downloads/iso/ubuntu-14.04.2-desktop-amd64.iso')
    with pth.open('rb') as file_pointer:
        iso_data = file_pointer.read()  # read the whole ISO into memory

    db_pth = DbPath(path=str(pth), data=iso_data)
    db_session = create_session('test.sqlite')
    db_session.add(db_pth)
    db_session.commit()
Running this raises the following error:
InterfaceError: (InterfaceError) Error binding parameter 1 - probably unsupported type.
'INSERT INTO file (path, data) VALUES (?, ?)'
('/home/user/Downloads/iso/ubuntu-14.04.2-desktop-amd64.iso', <memory at 0x7faf37cc18e0>)
I looked at the documented SQLite limits but found nothing that should prevent me from doing this. Does SQLAlchemy have a limitation?
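To check whether the failure is specific to SQLAlchemy, I could bypass it and write the same blob through the stdlib sqlite3 module directly. A minimal sketch of that test (the table layout and the database file name here are just illustrative):

import sqlite3

# Read the same ISO and insert it without SQLAlchemy in the middle.
with open('/home/user/Downloads/iso/ubuntu-14.04.2-desktop-amd64.iso', 'rb') as fp:
    blob = fp.read()

conn = sqlite3.connect('raw_test.sqlite')  # throwaway database
conn.execute('CREATE TABLE IF NOT EXISTS file (path TEXT, data BLOB)')
conn.execute('INSERT INTO file (path, data) VALUES (?, ?)',
             ('ubuntu-14.04.2-desktop-amd64.iso', blob))
conn.commit()
conn.close()

If this raises the same error, the limit would be in SQLite (or its Python bindings) rather than in SQLAlchemy.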
The original code above works fine for this file:
$ ls -lh ubuntu-14.04.2-server-amd64.iso
... 595M ... ubuntu-14.04.2-server-amd64.iso
Is there a data size limit? Or what do I have to do differently once the file size exceeds a certain limit (and where would that limit be)?
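For reference, the arithmetic I would check first, as a sketch under two assumptions on my part: SQLite's documented compile-time default for SQLITE_MAX_LENGTH is 1,000,000,000 bytes, and Connection.getlimit() is only available from Python 3.11 on:

import sqlite3

conn = sqlite3.connect(':memory:')
# Maximum size in bytes of any string or BLOB this SQLite build accepts;
# Connection.getlimit() requires Python 3.11+.
print(conn.getlimit(sqlite3.SQLITE_LIMIT_LENGTH))

# The two ISO sizes in bytes, for comparison with the default limit:
print(996 * 1024 * 1024)  # 1044381696 -> just above 1,000,000,000
print(595 * 1024 * 1024)  # 623902720  -> well below it

If the 996M file is indeed over this per-value limit while the 595M one is under it, that would explain why only the larger insert fails.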