
I am using SimpleTest, a PHP-based unit testing framework. I am testing new code that will handle storing and retrieving website comments from a database. I am at a loss for how to structure the project to test the database access code.

I am looking for any suggestions on best practices for testing database code in a PHP application. Examples would be really helpful, and so would sites for further reading.

Thank you kindly. :)

GloryFish
  • Can you elaborate what obstacle you face? Reading your questions and the answers, I can't really figure out what is holding you back from writing the code? – Till Oct 09 '08 at 23:27

8 Answers


This is an old question but I thought I'd add some specific experience we've had with this.

Other posters are technically correct that this is a form of integration test, but from where I sit there is often too much logic in MySQL to be stubbed out in unit testing. If you are like us and have large, complex services that are heavily dependent on MySQL (and often several tables per service), having a robust testing framework that includes testing query logic is really handy. We mock out a good number of our dependencies in our unit tests, but not MySQL.

We have a set of classes that wrap SimpleTest to provide this functionality. It works something like this (a rough sketch follows the list):

  • Instructions to create each database table are stored in a file at tests/etc/schemas/table.sql. It contains the schema data as well as inserts for all the canned data the test will expect to find.
  • Each test that requires the database extends a Test_DbCase class which provides functionality to build the tables.
  • A bootstrap class takes care of creating and dropping the database on construct and destruct.
  • At runtime, the test calls loadTables('foo', 'bar') in the setUp method to execute the sql commands in foo.sql and bar.sql.
  • Tests are run against the canned data; the rest is obvious.
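
To make that concrete, here is a rough sketch of what such a base class might look like. Test_DbCase, loadTables(), and the tests/etc/schemas path come from the description above; the PDO connection details, the naive statement splitting, and the example subclass are assumptions for illustration, not our actual code.

    <?php
    require_once 'simpletest/autorun.php';

    abstract class Test_DbCase extends UnitTestCase
    {
        /** @var PDO */
        protected $db;

        function setUp()
        {
            // Dedicated test database (assumed DSN and credentials).
            $this->db = new PDO('mysql:host=localhost;dbname=app_test', 'test', 'test');
            $this->db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
        }

        // loadTables('foo', 'bar') runs tests/etc/schemas/foo.sql and bar.sql,
        // creating each table and inserting its canned fixture rows.
        protected function loadTables()
        {
            foreach (func_get_args() as $table) {
                $this->db->exec("DROP TABLE IF EXISTS `$table`");
                $sql = file_get_contents(__DIR__ . "/etc/schemas/$table.sql");
                // Naive split on ';' is good enough for simple schema/insert files.
                foreach (array_filter(array_map('trim', explode(';', $sql))) as $statement) {
                    $this->db->exec($statement);
                }
            }
        }
    }

    // Usage: a concrete test loads the tables it needs in setUp().
    class CommentQueryTest extends Test_DbCase
    {
        function setUp()
        {
            parent::setUp();
            $this->loadTables('comments');
        }

        function testFindsCannedComment()
        {
            $count = $this->db->query('SELECT COUNT(*) FROM comments')->fetchColumn();
            $this->assertTrue($count > 0);
        }
    }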

One other tool we have is a bash script that makes it easier to create the table.sql files. This is really handy because otherwise we'd be writing the SQL by hand: you can take an existing set of tables, set up all your data in MySQL, and then export it to create the test files.

This works really well for us, though we ended up having to roll a lot of it ourselves.

colin

I had a local database dedicated to unit testing with a known name and database username/password. The unit tests were hard-coded to that location but different developers could override those variables if they wanted.

Then before each test you TRUNCATE each table. This is much faster than dropping/creating tables or the database itself.

Note: Do not truncate after the tests! That way if a test fails you have the current state of the database which often helps diagnose the problem.
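
For illustration, a minimal sketch of that setup with SimpleTest and PDO; the DSN, credentials, environment-variable names, and table list are all assumptions:

    <?php
    require_once 'simpletest/autorun.php';

    class CommentStorageTest extends UnitTestCase
    {
        protected $db;

        function setUp()
        {
            // Hard-coded test database location; developers can override it
            // through environment variables if they want.
            $dsn  = getenv('TEST_DB_DSN')  ? getenv('TEST_DB_DSN')  : 'mysql:host=localhost;dbname=app_test';
            $user = getenv('TEST_DB_USER') ? getenv('TEST_DB_USER') : 'test';
            $pass = getenv('TEST_DB_PASS') ? getenv('TEST_DB_PASS') : 'test';
            $this->db = new PDO($dsn, $user, $pass);
            $this->db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

            // TRUNCATE each table before the test; much faster than dropping
            // and recreating the schema.
            foreach (array('comments') as $table) {
                $this->db->exec("TRUNCATE TABLE `$table`");
            }
        }

        // Deliberately no tearDown cleanup: if a test fails, the rows it left
        // behind are still there to inspect.
    }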

Jason Cohen

You might want to allow PHP to create and supply data to a temporary table/database and test on that table/database. Then you don't have to reset your database manually. Most frameworks have database manipulation libraries to make this easier. It takes some time up front, but it will let you test much faster when you make changes later.
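
For example, a small sketch using PDO and a MySQL TEMPORARY table; the table name and columns are invented for illustration:

    <?php
    $db = new PDO('mysql:host=localhost;dbname=app_test', 'test', 'test');
    $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    // A TEMPORARY table is visible only to this connection and disappears
    // when the connection closes, so there is nothing to reset by hand.
    $db->exec('CREATE TEMPORARY TABLE comments (
        id INT AUTO_INCREMENT PRIMARY KEY,
        author VARCHAR(100) NOT NULL,
        body TEXT NOT NULL
    )');

    // Supply the data the test needs, then run the code under test against it.
    $db->prepare('INSERT INTO comments (author, body) VALUES (?, ?)')
       ->execute(array('GloryFish', 'First comment'));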

Clutch

Testing against a database usually indicates bad tests, probably due to lack of encapsulation in the code under test. You should try to isolate the code that interacts with the database from the rest of your code as much as possible, keeping this interaction layer so simple that you can get away with a few, very basic tests.

In other words: the code that deals with comments shouldn't be the same code that deals with database interaction. You could, for example, write a generic table module that your comment model uses to access the database. You would still have to test the table module, but that should be done in isolation from the comment code.
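
A rough sketch of that separation; the class and method names are illustrative, not a prescribed design:

    <?php
    // A small, generic table module owns all the SQL.
    class TableModule
    {
        private $db;
        private $table;

        public function __construct(PDO $db, $table)
        {
            $this->db = $db;
            $this->table = $table;
        }

        public function insert(array $row)
        {
            $columns      = implode(', ', array_keys($row));
            $placeholders = implode(', ', array_fill(0, count($row), '?'));
            $statement = $this->db->prepare(
                "INSERT INTO {$this->table} ($columns) VALUES ($placeholders)"
            );
            $statement->execute(array_values($row));
            return $this->db->lastInsertId();
        }

        public function find($id)
        {
            $statement = $this->db->prepare("SELECT * FROM {$this->table} WHERE id = ?");
            $statement->execute(array($id));
            return $statement->fetch(PDO::FETCH_ASSOC);
        }
    }

    // The comment model never sees SQL; in its unit tests the table module
    // can be replaced by a mock, and the table module itself only needs a
    // few very basic database tests.
    class CommentModel
    {
        private $comments;

        public function __construct(TableModule $comments)
        {
            $this->comments = $comments;
        }

        public function post($author, $body)
        {
            return $this->comments->insert(array('author' => $author, 'body' => $body));
        }
    }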

troelskn
  • don't know why this has been voted down QUITE so harshly. Strictly speaking testing interactions with a database would be integration tests not unit tests so troelskn has a point. Of course, back in the real world... – reefnet_alex Oct 06 '08 at 21:12
  • This answer contains good advice, but it's not to the question asked. The question asked was how to test the db code, not how to test code above the db code. – Daniel Papasian Oct 09 '08 at 13:34

When testing database code, it's good to always have the same database as the starting point. Especially if you do unit-testing (which I assume is the case here). One of the ways is to truncate all tables as Jason suggested, but I prefer to have some starting data in it. You know, you always need to have some 'default' data that is present in each database.

Also, some tests only make sense with a full database. So, create a special instance of the database for those tests. I have about 3 or 4 different databases that I plug in (just copy the files in) before running some tests. Having the same starting point each time ensures repeatability.

So, just prepare a few database states that are good 'starting points' and back them up. Before running each set of tests, restore appropriate database, and then run it.
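
As an illustration only: the sketch below reloads a MySQL dump instead of copying database files into place (which is what I actually do); the paths, credentials, and dump name are assumptions.

    <?php
    function restoreDatabase($dumpFile)
    {
        $command = sprintf(
            'mysql -u%s -p%s %s < %s',
            escapeshellarg('test'),
            escapeshellarg('test'),
            escapeshellarg('app_test'),
            escapeshellarg($dumpFile)
        );
        exec($command, $output, $exitCode);
        if ($exitCode !== 0) {
            throw new RuntimeException("Could not restore database from $dumpFile");
        }
    }

    // Pick the starting point that matches the suite about to run.
    restoreDatabase(__DIR__ . '/fixtures/default_data.sql');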

Milan Babuškov

I'd encourage you to not try to test the database access code using SimpleTest.

Instead, create a functional test for your app using, for example, Selenium: record a test case when you start from a known state of a database; then add a comment and check (using Selenium's asserts) that the content indeed is there.

This way:
  • it is easier to set up and maintain
  • you verify not just the DB layer, but the presentation layer, too

That said, if you have stored procedures in your DB, do use SimpleTest - I've done it myself successfully. Basically, create SimpleTests that start from a known DB state, then perform a few INSERTS/UPDATES, then run the stored proc and make sure the state of the DB is what you'd expect.
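
For example, a hedged sketch of such a stored-procedure test; the procedure name archive_old_comments and the table layout are invented:

    <?php
    require_once 'simpletest/autorun.php';

    class ArchiveOldCommentsProcTest extends UnitTestCase
    {
        private $db;

        function setUp()
        {
            $this->db = new PDO('mysql:host=localhost;dbname=app_test', 'test', 'test');
            $this->db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

            // Known DB state: one stale comment, one fresh one.
            $this->db->exec('TRUNCATE TABLE comments');
            $this->db->exec("INSERT INTO comments (id, body, created_at) VALUES
                             (1, 'old', '2000-01-01 00:00:00'),
                             (2, 'new', NOW())");
        }

        function testStaleCommentsAreArchived()
        {
            // Run the stored procedure, then check the resulting state.
            $this->db->exec('CALL archive_old_comments()');

            $remaining = $this->db->query('SELECT COUNT(*) FROM comments')->fetchColumn();
            $this->assertEqual($remaining, 1);
        }
    }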

Alex Weinstein

If you really want to test against a database, I would recommend importing data/creating tables before each test. That way, your database starts from a known state for each test. Since this is rather expensive, you can instead start a transaction (provided your RDBMS supports it) in setUp and roll back in tearDown. MySQL (which is likely the RDBMS you're using) doesn't support nested transactions, so if the code under test uses transactions, you can run into trouble. You can get around this using savepoints: set up a savepoint before the test, and roll back to the savepoint after it.
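
A sketch of that setup with PDO, assuming InnoDB tables; the DSN, credentials, and savepoint name are arbitrary:

    <?php
    require_once 'simpletest/autorun.php';

    abstract class TransactionalDbTestCase extends UnitTestCase
    {
        protected $db;

        function setUp()
        {
            $this->db = new PDO('mysql:host=localhost;dbname=app_test', 'test', 'test');
            $this->db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

            // Everything the test does happens inside this transaction...
            $this->db->beginTransaction();
            // ...and a savepoint marks the spot to roll back to if the code
            // under test manages transactions itself, since MySQL cannot
            // nest them.
            $this->db->exec('SAVEPOINT test_start');
        }

        function tearDown()
        {
            // Undo whatever the test changed so the next test starts clean.
            $this->db->exec('ROLLBACK TO SAVEPOINT test_start');
            $this->db->rollBack();
        }
    }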

I'll still maintain that if you need much of this, you should consider the possibility that your tests are trying to tell you something...

troelskn

I think you should use an ORM and write a few integration tests for that. If the integration tests show that it works correctly in the actual environment, then you only have to test it again when you change your environment (database, PHP version, platform, etc.). After that you can mock the ORM object, and you won't need to connect to the database.

So I think this is the best way, but if you don't want to use an ORM, then you can create a test database and mock the database connection (PDO) object. In that case you can create and drop test tables in the setUp and tearDown sections of your test cases. It's important that these are integration tests, not unit tests, so you don't need to run them every time, only when something changes between the PHP code and the SQL server. After you have tested your data access objects with your integration tests, you should mock them out in your unit tests.
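
A sketch of that last step with SimpleTest's mock objects; CommentMapper and CommentService are illustrative names, not part of any particular framework:

    <?php
    require_once 'simpletest/autorun.php';
    require_once 'simpletest/mock_objects.php';

    class CommentMapper
    {
        public function save(array $comment)
        {
            // Real PDO code lives here and is covered by the integration tests.
        }
    }

    class CommentService
    {
        private $mapper;

        public function __construct(CommentMapper $mapper)
        {
            $this->mapper = $mapper;
        }

        public function post($author, $body)
        {
            if (trim($body) === '') {
                return false;
            }
            $this->mapper->save(array('author' => $author, 'body' => $body));
            return true;
        }
    }

    // Generate MockCommentMapper so the unit tests never touch the database.
    Mock::generate('CommentMapper');

    class CommentServiceTest extends UnitTestCase
    {
        function testValidCommentIsSaved()
        {
            $mapper = new MockCommentMapper();
            $mapper->expectOnce('save');

            $service = new CommentService($mapper);
            $this->assertTrue($service->post('GloryFish', 'Nice post'));
        }

        function testEmptyCommentIsRejected()
        {
            $mapper = new MockCommentMapper();
            $mapper->expectNever('save');

            $service = new CommentService($mapper);
            $this->assertFalse($service->post('GloryFish', '   '));
        }
    }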

inf3rno