
The Problem

Say I've got a cool REST resource /account.

I can create new accounts

POST /account
{"accountName": "matt"}

which might produce some json response like:

{"account": "/account/matt", "accountName": "matt", "created": "November 5, 2013"}

and I can look up accounts created within a date range by calling:

GET /account?created-range-start="June 01, 2013"&created-range-end="December 25, 2013"

which might also produce something like:

{"accounts": [{"account": "/account/matt", "accountName": "matt", "created": "November 5, 2013"}, {...}, ...]}

Now, let's say I want to set up some sample data and write some tests against the GET /account resource within some specified creation date range.

For example I want to somehow insert the following accounts into the system

name=account1, created=January 1, 2010
name=account2, created=January 2, 2010
name=account3, created=December 29, 2010
name=account4, created=December 30, 2010

then call

GET /account?created-range-start="January 2, 2010"&created-range-end="December 29, 2010"

and verify that only accounts 2 and 3 are returned.

How should I insert these sample accounts to write my tests?

Possible Solutions

1) I could use inversion of control and allow the user to specify the creation date for new accounts.

POST /account
{"accountName": "matt", "created": "June 01, 2013"}

However, even if the created field were optional, I don't like this approach because I may not want to give my users the ability to set the creation date of their account. I surely need to be able to do it for testing, but having that functionality as part of the public api seems wrong to me. Maybe I want to give a $5 credit to anyone who joined prior to some particular day; if users can specify their own creation date, they can game the system. Not good.

2) I could add one or more testing configuration resources

PUT /account/creationDateTimestampProvider
{"provider": "DefaultProvider"}

or

PUT /account/creationDateTimestampProvider
{"provider": "FixedDateProvider", "date": "June 01, 2013"}

This approach affords me the ability to lock down these resources with security constraints so that only my test context can call them, but it also necessarily has side effects on the system that may become a pain to manage, especially if I have a bunch of backdoor configuration resources.
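For context, a minimal sketch of what such a swappable provider could look like inside the service (Python; the class names mirror the payloads above, but the implementation itself is an assumption, not part of any real api):

```python
from datetime import datetime, timezone

class DefaultProvider:
    """Production behavior: stamp accounts with the current time."""
    def now(self):
        return datetime.now(timezone.utc)

class FixedDateProvider:
    """Test behavior: always return a preset creation date."""
    def __init__(self, date):
        self.date = date
    def now(self):
        return self.date

class AccountService:
    """A PUT to /account/creationDateTimestampProvider would swap
    the provider at runtime (test contexts only)."""
    def __init__(self, provider=None):
        self.provider = provider or DefaultProvider()

    def create_account(self, name):
        return {"accountName": name, "created": self.provider.now()}

# A test fixes the clock; production code uses the default provider.
svc = AccountService(FixedDateProvider(datetime(2013, 6, 1, tzinfo=timezone.utc)))
acct = svc.create_account("matt")
```

The side-effect concern shows up here too: whichever provider was PUT last stays in effect until something PUTs it back.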

3) I could interact directly with the database, circumventing the REST api altogether, to set up my sample data.

INSERT INTO ACCOUNTS ...

GET /account?...

However, this can put the system into states that the REST api itself would never allow, and as the db model evolves, maintaining these sql scripts might also become a pain.
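As a sketch of that approach, here is the seeding and range query run against an in-memory sqlite3 database standing in for the real one (the table name, column names, and ISO date formatting are all assumptions):

```python
import sqlite3

# In-memory stand-in for the production database; schema is assumed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT, created TEXT)")

# The four sample accounts, with dates normalized to ISO form so that
# plain string comparison orders them correctly.
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [
    ("account1", "2010-01-01"),
    ("account2", "2010-01-02"),
    ("account3", "2010-12-29"),
    ("account4", "2010-12-30"),
])

# Rough equivalent of GET /account?created-range-start=...&created-range-end=...
cur = conn.execute(
    "SELECT name FROM accounts WHERE created BETWEEN ? AND ? ORDER BY name",
    ("2010-01-02", "2010-12-29"),
)
names = [row[0] for row in cur]  # expect account2 and account3
```

Note how the INSERTs bake in knowledge of the schema, which is exactly the maintenance burden mentioned above.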

So... how do I test my GET /account resource? Is there another way I'm not thinking of that is more elegant?

Matthew Madson

2 Answers


There are a lot of ways to do this, and you've come up with some solid (though maybe not perfect for your situation) solutions.

In the setup for the test, I would spin up an in-memory database like HSQLDB (there are others) and do the inserts. The test configuration will inject the appropriate database configuration into your service provider class. Run the tests, and then shut the database down on teardown.

This post provides a good example at least for the persistence side of things.
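The setup/teardown shape being described might look roughly like this (sketched with Python's unittest and sqlite3 in place of HSQLDB; the schema and query are assumptions):

```python
import sqlite3
import unittest

class AccountRangeTest(unittest.TestCase):
    def setUp(self):
        # Spin up the in-memory database and seed the sample accounts.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute("CREATE TABLE accounts (name TEXT, created TEXT)")
        self.conn.executemany(
            "INSERT INTO accounts VALUES (?, ?)",
            [("account1", "2010-01-01"), ("account2", "2010-01-02"),
             ("account3", "2010-12-29"), ("account4", "2010-12-30")],
        )

    def tearDown(self):
        # Shut the database down after each test.
        self.conn.close()

    def test_range_query(self):
        cur = self.conn.execute(
            "SELECT name FROM accounts WHERE created BETWEEN ? AND ? "
            "ORDER BY name",
            ("2010-01-02", "2010-12-29"),
        )
        self.assertEqual([row[0] for row in cur], ["account2", "account3"])
```

Keeping the seed data in setUp makes the preconditions of each test visible right next to the assertions.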

Incidentally, do not change the API of your service just to facilitate a test. Maybe I misunderstood and you aren't anyway, but I thought I would mention it just in case.

Hope that helps.

Vidya
  • if I understand you correctly, you're advocating the 3rd approach with an in memory db (at least during tests) and perhaps the production db for creating sample data. – Matthew Madson Oct 10 '13 at 00:12
  • In testing, yes. I would tightly couple the database operations to the test itself by doing them in setup/teardown. For me, this is a good example of tight coupling so I can see what the preconditions for my test are. I can't speak to production because there are usually a lot of considerations there. – Vidya Oct 10 '13 at 00:17
  • By the way, we did skip the test phase, where the testers will be providing the data. This database will be changing far less often, and scripts will be running far less. Check this out for dealing with database changes: http://www.amazon.com/Refactoring-Databases-Evolutionary-paperback-Addison-Wesley/dp/0321774515 – Vidya Oct 10 '13 at 00:32
  • No problem. Of course accepting the answer or at least an upvote never hurt anyone haha. – Vidya Oct 12 '13 at 05:17

For what it's worth, these days I'm primarily using the second approach for most of my system level (black box) tests.

I create backdoor admin / test apis with security constraints so that only my system tests can access them. These superpower apis allow me to seed data. I try to limit the scope of these apis as much as possible so they are not overly coupled to specific implementation details, while keeping them flexible enough to specify whatever the desired seed data requires.

The reason I prefer this approach to the database solution that Vidya provided is that my tests aren't coupled to the specific data storage technology. If I decide to switch from mongo to dynamo or something like that, using an admin api frees me from having to update all of my tests; instead I only need to update the admin api/impl.
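A rough sketch of the shape of such a test, with an in-process fake standing in for the real service and a hypothetical token-guarded seeding api (none of these names come from a real system):

```python
class FakeService:
    """In-process stand-in for the system under test."""
    ADMIN_TOKEN = "test-only-secret"  # hypothetical test-context credential

    def __init__(self):
        self.accounts = []

    def admin_seed_account(self, token, name, created):
        # Backdoor seeding api: only the system-test context holds the token.
        if token != self.ADMIN_TOKEN:
            raise PermissionError("admin api requires test credentials")
        self.accounts.append({"accountName": name, "created": created})

    def get_accounts(self, start, end):
        # Public equivalent of GET /account?created-range-start=...&created-range-end=...
        return [a for a in self.accounts if start <= a["created"] <= end]

svc = FakeService()
for name, created in [("account1", "2010-01-01"), ("account2", "2010-01-02"),
                      ("account3", "2010-12-29"), ("account4", "2010-12-30")]:
    svc.admin_seed_account("test-only-secret", name, created)

names = [a["accountName"] for a in svc.get_accounts("2010-01-02", "2010-12-29")]
```

The test only knows about the admin api's contract, so swapping the storage engine underneath leaves it untouched.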

Matthew Madson