
I'm looking for some best-practice advice on how to go about writing tests in this specific scenario I have.

I've got a Django API that runs against a Postgres database (which I control on my server) as well as a remote read-only MySQL database. The read-only DB is provided by an external service. They do provide a fairly liberal API for writing data to their database (so we override our .save method to write through the API rather than hitting the database directly), but this is all production data, so not something I want to touch in tests.
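To illustrate the setup: a minimal, framework-free sketch of what that overridden .save looks like, with the write going to the provider's API instead of MySQL. The class name, URL, and payload shape are all hypothetical, and `opener` is injectable so the write can be faked in tests.

```python
import json
from urllib import request


class RemoteRecord:
    """Model-like sketch: save() writes through the external
    service's API instead of the read-only MySQL database.
    API_URL and the payload shape are hypothetical."""

    API_URL = "https://example.com/api/records"

    def __init__(self, pk=None, **fields):
        self.pk = pk
        self.fields = fields

    def save(self, opener=request.urlopen):
        # Instead of an INSERT against MySQL, POST the row to the
        # provider's write API; `opener` is injectable for tests.
        payload = json.dumps({"pk": self.pk, **self.fields}).encode()
        req = request.Request(
            self.API_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        return opener(req)
```

In the real project this would be `Model.save()` on a Django model, but the shape of the problem is the same: every persistence call goes over the network to production unless something intercepts it.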

Do people have thoughts on how to organize this project so that it either sets up a similar MySQL database for tests, or mocks every write (in a way that writing API endpoint integration tests is still doable)?

Happy to provide more details if needed.

NotSimon

1 Answer


I would suggest following a common practice for running tests:

  1. Dump a small set of data from your existing database to fixture files in JSON format.
  2. Create a temporary SQLite/MySQL database for each test run and populate it with data from the fixture files.
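The two steps above can be sketched without any Django machinery. The fixture below is a tiny sample in the JSON shape that `manage.py dumpdata` produces (model name and fields are hypothetical), and an in-memory SQLite connection stands in for the throwaway test database that Django's test runner would create and load via `fixtures = [...]`:

```python
import json
import sqlite3

# Hypothetical sample in dumpdata's JSON format, as produced by e.g.:
#   python manage.py dumpdata myapp.Record > fixtures/records.json
FIXTURE = json.dumps([
    {"model": "myapp.record", "pk": 1, "fields": {"name": "alpha"}},
    {"model": "myapp.record", "pk": 2, "fields": {"name": "beta"}},
])


def load_fixture(conn, fixture_json):
    # Mirror what the test runner does with `fixtures = [...]`:
    # create the schema, then insert each serialized row.
    conn.execute("CREATE TABLE record (id INTEGER PRIMARY KEY, name TEXT)")
    for row in json.loads(fixture_json):
        conn.execute(
            "INSERT INTO record VALUES (?, ?)",
            (row["pk"], row["fields"]["name"]),
        )


# In-memory SQLite: populated fresh for the run, gone afterwards.
conn = sqlite3.connect(":memory:")
load_fixture(conn, FIXTURE)
```

In an actual Django test you would instead list the fixture files in a `TestCase.fixtures` attribute and let the runner do the loading, but the lifecycle is the same: fresh database, fixture data in, everything discarded at the end.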

In this case, you can do whatever you want with your test database, because it gets thrown away. In addition, your tests won't depend on data in the production environment.
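For the write side, you can keep every write away from the production API while still exercising the endpoint code by patching the overridden save() for the duration of a test. A minimal sketch with `unittest.mock` (the class and endpoint function are hypothetical stand-ins; in a real project you would patch the model's dotted path, e.g. `myapp.models.Record.save`):

```python
from unittest import mock


class RemoteRecord:
    # Stand-in for the model whose save() would hit the provider's API.
    def save(self):
        raise RuntimeError("would write to the production API")


def create_record_endpoint(name):
    # Stand-in for an API view: builds a record and persists it.
    rec = RemoteRecord()
    rec.name = name
    rec.save()
    return rec


# Patch save() for the duration of the test, so the endpoint logic
# runs end to end but no write ever reaches the external service.
with mock.patch.object(RemoteRecord, "save") as fake_save:
    rec = create_record_endpoint("alpha")
```

After the `with` block you can assert on `fake_save` (call count, arguments) to verify the endpoint attempted the write, which keeps integration tests meaningful without touching production data.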

niekas
  • Creating fixtures may not be allowed for companies that store extremely secure information. Creating and then obfuscating fixtures for highly secure data which is pulled from hundreds of tables is busy-work that is an absolute waste of time when we could just use the actual database connection. – Rjak May 20 '20 at 07:07