
Sorry for the long post...

While being introduced to a brownfield project, I'm having doubts regarding certain sets of unit tests and what to think of them. Say you have a repository class wrapping a stored procedure, and in the developer guide book a certain set of guidelines (rules) describes how this class should be constructed. The class could look like the following:

public class PersonRepository
{
    public PersonCollection FindPersonsByNameAndCity(string personName, string cityName)
    {
        using (new SomeProfiler("someKey"))
        {
            var sp = Ioc.Resolve<IPersonStoredProcedure>();

            sp.addNameArgument(personName);
            sp.addCityArgument(cityName);

            return sp.invoke();
        }
    }
}

Now, I would of course write some integration tests, testing that the SP can be invoked, and that the behavior is as expected. However, would I write unit tests that assert that:

  • The constructor of SomeProfiler is called with the input parameter "someKey"
  • The constructor of PersonStoredProcedure is called
  • The addNameArgument method on the stored procedure is called with parameter personName
  • The addCityArgument method on the stored procedure is called with parameter cityName
  • The invoke method is called on the stored procedure

If so, I would potentially be testing the whole structure of a method, not just its behavior. My initial thought is that it is overkill. However, with regard to the coding practices enforced by the team, these tests ensure a uniform and 'correct' structure and that the next layer is called correctly (from DAL to DB, BLL to DAL, etc.).

In my case, these types of tests are performed for each layer of the application.
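To make it concrete, such a test could look roughly like this (a sketch only: Moq is just one choice of isolation framework, and Ioc.Register<T> is a stand-in for however the container lets you register a test double):

    [Test]
    public void FindPersonsByNameAndCity_CallsStoredProcedureAsTheGuidelinesRequire()
    {
        // Arrange: put a mocked stored procedure into the container
        // (Ioc.Register<T> is hypothetical; use whatever your container provides)
        var sp = new Mock<IPersonStoredProcedure>();
        sp.Setup(x => x.invoke()).Returns(new PersonCollection());
        Ioc.Register<IPersonStoredProcedure>(sp.Object);

        var repository = new PersonRepository();

        // Act
        repository.FindPersonsByNameAndCity("John", "Copenhagen");

        // Assert: verify the interactions listed above
        sp.Verify(x => x.addNameArgument("John"), Times.Once());
        sp.Verify(x => x.addCityArgument("Copenhagen"), Times.Once());
        sp.Verify(x => x.invoke(), Times.Once());
    }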

Follow-up question: the use of the SomeProfiler class smells a little like a convention to me. Instead of creating explicit tests for this, could one create convention-style tests by using static code analysis or unit tests + reflection?

Thanks in advance.

jaspernygaard

2 Answers


I think that your initial thought was right - this is overkill. Although you can use reflection to make sure that the class has the methods you expect, I'm not sure you want to test it that way.

Perhaps instead of unit testing you should use a tool such as FxCop/StyleCop or NDepend to make sure all of the classes in a specific assembly/dll have these properties.
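If you'd rather keep the check inside the test suite, a reflection-based convention test can do a similar job. A minimal sketch (the rule shown - every *Repository class must implement a matching I*Repository interface - is only an illustration; substitute whatever your team's guidelines actually mandate):

    [Test]
    public void AllRepositories_FollowTheTeamConvention()
    {
        var assembly = typeof(PersonRepository).Assembly;

        // Find every *Repository class that lacks a matching I*Repository interface
        var offenders = assembly.GetTypes()
            .Where(t => t.IsClass && t.Name.EndsWith("Repository"))
            .Where(t => t.GetInterface("I" + t.Name) == null)
            .Select(t => t.FullName)
            .ToList();

        Assert.IsEmpty(offenders, "Non-conforming repositories: " + string.Join(", ", offenders));
    }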

Having said that, I'm a believer in "only code what you need", so why test that a method exists? Either you use it somewhere in your code, in which case you can test that specific case, or you don't - and then it's irrelevant.

Dror Helper
  • I'll mark your response as the answer. Was hoping for a little more discussion about testing business value vs. structure of code. But I have a little input for the next code review. – jaspernygaard Jul 05 '11 at 09:23

Unit tests should focus on behavior, not implementation. So writing a test to verify that certain arguments are set or passed in doesn't add much value to your testing strategy.

As the example provided appears to be communicating with your database, it can't truly be considered a "unit test" as it must communicate with physical dependencies that have additional setup and preconditions, such as availability of the environment, database schema, existing data, stored-procedures, etc. Any test you write is actually verifying these preconditions as well.

In its present condition, your best bet for these types of tests is to test the behavior provided by the class -- invoke a method on your repository and then validate that the results are what you expected. However, you'll suddenly realize that there's a hidden cost here -- the database maintains state between test runs, and you'll need additional setup or tear-down logic to ensure that the database is in a well-known state.
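One common way to get back to a well-known state is to run each test inside an ambient transaction that is never committed - a sketch, assuming NUnit and System.Transactions:

    private TransactionScope _scope;

    [SetUp]
    public void BeginTransaction()
    {
        // Connections opened by the code under test enlist in this
        // ambient transaction automatically.
        _scope = new TransactionScope();
    }

    [TearDown]
    public void RollbackTransaction()
    {
        // Complete() is never called, so every change the test made is rolled back.
        _scope.Dispose();
    }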

While I realize the intent of the question was about testing a "black box", it seems obvious that there's some hidden magic in your API. My preference for solving the well-known state problem is to use an in-memory database that is scoped to the current test, which isolates me from environment considerations and enables me to parallelize my integration tests. I'd wager that under the current design there is no "seam" to programmatically introduce a database configuration, so you're "hemmed in". In my experience, magic hurts.

However, a slight change to the existing design solves this problem and the "magic" goes away:

public class PersonRepository : IPersonRepository
{
    private ConnectionManager _mgr;

    public PersonRepository(ConnectionManager mgr)
    {
        _mgr = mgr;
    }

    public PersonCollection FindPersonsByNameAndCity(string personName, string cityName)
    {
        using (var p = _mgr.CreateProfiler("somekey"))
        {
            var sp = new PersonStoredProcedure(p);

            sp.addArgument("name", personName);
            sp.addArgument("city", cityName);

            return sp.invoke();
        }
    }
}
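With the ConnectionManager injected, a test can hand the repository a connection manager that points at a test-scoped, in-memory database. A sketch (the connection string, the SeedTestData helper and the Count property are illustrative assumptions, not part of the API above):

    [Test]
    public void FindPersonsByNameAndCity_ReturnsMatchingPersons()
    {
        // Point the repository at a test-scoped, in-memory database.
        var mgr = new ConnectionManager("Data Source=:memory:");
        SeedTestData(mgr); // hypothetical helper that creates schema + rows

        var repository = new PersonRepository(mgr);

        var persons = repository.FindPersonsByNameAndCity("John", "Copenhagen");

        Assert.AreEqual(1, persons.Count);
    }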
bryanbcook
  • While I completely agree with you about decoupled code and the value of unit vs. integration tests, it's a little beside the intention of this post and, most importantly, the test case. Let's assume we use TypeMock or StructureMap to decouple the dependencies. I'll alter the example code a little... – jaspernygaard Jun 24 '11 at 13:42