I have a django application (a PoC, not a final product) with a backend library that reads the data it needs for processing from a SQLite database - read only. The SQLite database is part of the repo and deployed to Heroku during the usual deployment process. This is working fine.
Now I have received the requirement to allow for updates to this database via the django admin interface. Please note that this is not a django managed database, so from django's point of view, this is just a binary file.
I could allow for a FileField to handle this, overwriting the database; I guess this would work fine on a self-managed server, but I am on Heroku, where you have the constraints imposed by Disk Backed Storage. (Note that although that limitation is usually explained in the context of SQLite used as the web app database, that is not my case: my SQLite file is not my webapp database. But the limitations apply all the same: I cannot write to the webapp's filesystem and get any guarantee that the new data will actually be visible to the running webapp.)
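For reference, the FileField approach would boil down to something like the sketch below: validate the uploaded bytes as a real SQLite database, then swap the file in atomically so readers never see a half-written copy. The function name and paths are hypothetical, and (as noted above) on Heroku this only updates the dyno-local copy, which is lost on restart and not shared across dynos.

```python
import os
import sqlite3
import tempfile


def replace_sqlite_db(uploaded_bytes: bytes, target_path: str) -> None:
    """Validate an uploaded SQLite file and swap it in atomically.

    Hypothetical helper for the FileField idea. On Heroku's ephemeral
    filesystem this change does not survive a dyno restart.
    """
    fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(target_path) or ".")
    try:
        with os.fdopen(fd, "wb") as tmp:
            tmp.write(uploaded_bytes)
        # Reject uploads that are not a healthy SQLite database.
        conn = sqlite3.connect(tmp_path)
        try:
            result = conn.execute("PRAGMA integrity_check").fetchone()[0]
        finally:
            conn.close()
        if result != "ok":
            raise ValueError(f"uploaded file failed integrity check: {result}")
        os.replace(tmp_path, target_path)  # atomic on POSIX
    except Exception:
        if os.path.exists(tmp_path):
            os.remove(tmp_path)
        raise
```

On a self-managed server this would be wired to the admin form's `cleaned_data`; the atomic `os.replace` is what keeps concurrent readers from seeing a truncated file.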
I can think of other alternatives, all with drawbacks:
- put the SQLite database on another server (a "media" server) and access it remotely: this would severely impact performance. Besides, SQLite is not designed to be accessed over the network, so this does not seem easy.
- create a deploy script for the customer to upload the database via the usual deploy mechanisms. Since the customer is not technically skilled, and I cannot provide direct support, this is unfeasible.
- and probably the easiest option: move off Heroku to a self-managed server, so that I can implement this quick-and-dirty upload without much complication.
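For the "media server" option above, the usual workaround for SQLite's lack of network support is not to query it remotely but to download the whole file to local disk at startup and read that copy. A minimal sketch, assuming a hypothetical `remote_url` endpoint that serves the raw `.db` file:

```python
import os
import shutil
import sqlite3
import urllib.request


def fetch_lookup_db(remote_url: str, local_path: str) -> sqlite3.Connection:
    """Download the lookup database and open the local copy read-only.

    Sketch only: `remote_url` and `local_path` are assumed names. Pulling
    the whole file sidesteps querying SQLite over the network, at the cost
    of a full re-download whenever the data changes.
    """
    tmp_path = local_path + ".download"
    with urllib.request.urlopen(remote_url) as resp, open(tmp_path, "wb") as out:
        shutil.copyfileobj(resp, out)
    os.replace(tmp_path, local_path)  # swap in atomically
    # mode=ro keeps the app from accidentally writing to a throwaway copy
    return sqlite3.connect(f"file:{local_path}?mode=ro", uri=True)
```

Each dyno would call this at boot (and perhaps on a schedule), so the performance hit is a one-off download rather than per-query network round trips; the copy still lives on ephemeral storage, which is fine for read-only data.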
Have you faced this kind of problem before? Do you have another suggestion?