
I am building a small financial web app with Django. The app requires that the database have a complete history of prices, regardless of whether anyone is currently using the app. These prices are freely available online.

The way I am currently handling this is by running a separate Python script (outside of Django) alongside the app, which downloads the price data and records it in the Django database using the sqlite3 module, roughly as in the sketch below.
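For illustration, here is a minimal sketch of what such a standalone downloader might look like. The price endpoint, the `fetch_price()` helper, and the `prices_price` table name are assumptions for the example, not part of my actual code:

    # Hypothetical standalone downloader: polls a price source every second
    # and appends rows to the same SQLite file that Django uses.
    import sqlite3
    import time

    import requests  # assumed HTTP client for the price source

    DB_PATH = "db.sqlite3"                    # path to the Django SQLite DB (assumed)
    PRICE_URL = "https://example.com/price"   # placeholder price endpoint

    def fetch_price():
        """Download the latest price; the JSON shape here is an assumption."""
        resp = requests.get(PRICE_URL, timeout=5)
        resp.raise_for_status()
        return resp.json()["price"]

    def record_price(conn, price):
        """Insert one price row into the table backing the Django model."""
        conn.execute(
            "INSERT INTO prices_price (price, timestamp) VALUES (?, datetime('now'))",
            (price,),
        )
        conn.commit()

    if __name__ == "__main__":
        conn = sqlite3.connect(DB_PATH)
        while True:
            record_price(conn, fetch_price())
            time.sleep(1)  # prices are published every second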

My plan for deployment is to run the app on an AWS EC2 instance, change the permissions of the folder where the db file resides, and separately run the download script.

Is this a good way to deploy this sort of app? What are the downsides? Is there a better way to handle the asynchronous downloads and the deployment? (PythonAnywhere?)

gpanterov
    What about maintaining one database and then exposing an API that these child apps can connect to for the data? That way you keep a single current copy of the data. – greg_diesel Apr 06 '15 at 21:32
  • I thought about this, but if the API is web based (i.e. the web app must make URL requests), I don't think it would be fast enough, because I am working with prices that are downloaded every second. – gpanterov Apr 06 '15 at 22:13

1 Answer


You can write the daemon code and follow this approach to push data to the DB as soon as you get it from the Internet. Since your daemon would be running independently of Django, you'd also need to take care of data-synchronisation issues. One possible solution is to use a DateTimeField in your Django model with auto_now_add=True, which records when each row was entered in the DB; a sketch is below. Hope this helps you or someone else looking for a similar answer.
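A minimal sketch of the model this refers to; the app, model, and field names (Price, value, created_at) are assumptions, not taken from the question:

    # models.py (hypothetical names)
    from django.db import models

    class Price(models.Model):
        value = models.DecimalField(max_digits=12, decimal_places=4)
        # auto_now_add=True stamps the row with the time it was inserted,
        # so the daemon and the Django app agree on when each price arrived.
        created_at = models.DateTimeField(auto_now_add=True)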