I know it's strongly recommended to run unit tests in isolation from the file system, because if you touch the file system in a test, you are also testing the file system itself. OK, that's reasonable.
My question is: if I want to test saving a file to disk, what do I do? As with a database, do I extract an interface that is responsible for the file access and then create another implementation of it for my tests? Or is there some other way?

- Create an interface to access the file system. Then create its mock, either using a mocking framework or by hand. – P.K Aug 01 '10 at 13:00
- Thanks very much for the useful comments and answers; I think I'll stick with the mocking framework. – chester89 Aug 01 '10 at 16:26
- Unit tests don't use any external systems; those are integration tests. A unit test doesn't exercise the last layer, the one that actually saves the file to disk. You need an integration test to save a file, and in that case you simply save it and then check that it was created. It's a different type of test, with different approaches, and a different story. Most of the answers below are wrong: they talk about unit tests but push integration-test practices. – Artem A Jan 25 '17 at 14:30
3 Answers
My approach to this is heavily influenced by the Growing Object-Oriented Software, Guided by Tests (GOOS) book that I just read, but it's the best that I know of today. Specifically:
- Create an interface to abstract away the file system from your code. Mock it wherever it is needed as a collaborator/dependency. This keeps your unit tests quick and the feedback fast.
- Create integration tests that test the actual implementation of the interface, i.e. verify that calling Save() actually persists a file to disk with the right contents (compare against a reference file, or parse it for a few things it should contain).
- Create an acceptance test that tests the whole system, end to end. Here you may just verify that a file is created; the intent of this test is to confirm that the real implementation is wired/plugged in correctly.
Update for commenter:
Assuming you're reading structured data (e.g. Book objects; if not, substitute string for the IEnumerable<Book>):
interface BookRepository
{
    // Abstracts the persistence mechanism; the real implementation reads/writes files.
    IEnumerable<Book> LoadFrom(string filePath);
    void SaveTo(string filePath, IEnumerable<Book> books);
}
Now you can use constructor injection to inject a mock into the client class. The client class's unit tests are therefore fast and do not hit the file system; they just verify that the right methods are called on the dependencies (e.g. Load/Save):
var testSubject = new Client(new Mock<BookRepository>().Object);
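For example, such a unit test could look like this (a minimal sketch using Moq and NUnit, assuming the usual using directives; the Client class and its SaveAll method are hypothetical and only illustrate verifying the interaction):
[TestFixture]
public class ClientTests
{
    [Test]
    public void SaveAll_DelegatesToTheRepository()
    {
        var repository = new Mock<BookRepository>();
        var testSubject = new Client(repository.Object);

        testSubject.SaveAll("books.xml");

        // No file system involved: the test only checks that the client
        // asked its repository to save the books to the given path.
        repository.Verify(r => r.SaveTo("books.xml", It.IsAny<IEnumerable<Book>>()));
    }
}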
Next you need to create the real implementation of BookRepository that works off a file (or a SQL DB tomorrow, if you want). No one else has to know. Write integration tests for FileBasedBookRepository (which implements the above role) and test that calling Load with a reference file gives the right objects, and that calling Save with a known list persists them to disk, i.e. using real files. These tests will be slow, so mark them with a category/tag or move them to a separate suite.
[TestFixture]
[Category("Integration - Slow")]
public class FileBasedBookRepositoryTests
{
    [Test]
    public void CanLoadBooksFromFileOnDisk() {...}

    [Test]
    public void CanWriteBooksToFileOnDisk() {...}
}
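As a sketch of what the first test body might contain (the reference file TestData/books.xml, its contents, and the Book.Title property are invented for illustration; System.Linq is assumed):
[Test]
public void CanLoadBooksFromFileOnDisk()
{
    var repository = new FileBasedBookRepository();

    // A known reference file deployed alongside the test assembly.
    var books = repository.LoadFrom("TestData/books.xml").ToList();

    // Spot-check a few things the reference file is known to contain.
    Assert.AreEqual(2, books.Count);
    Assert.AreEqual("Growing Object-Oriented Software", books.First().Title);
}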
Finally, there should be one or more acceptance tests that exercise Load and Save.
- If it's not too much to ask, could you provide some code for bullets 1 and 2? I would greatly appreciate it. – Thang Pham May 23 '11 at 15:02
There is a general rule to be cautious about writing unit tests that do file I/O, because such tests tend to be too slow. But there is no absolute prohibition on file I/O in unit tests.
In your unit tests, set up and tear down a temporary directory, and create your test files and directories within it. Yes, your tests will be slower than pure CPU tests, but they will still be fast. JUnit even has support code to help with this very scenario: a @Rule on a TemporaryFolder.
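In C#/NUnit (matching the other answers here) the same pattern can be hand-rolled with SetUp/TearDown. A sketch, with all names illustrative:
using System.IO;
using NUnit.Framework;

[TestFixture]
[Category("Integration - Slow")]
public class FileSavingTests
{
    private string tempDir;

    [SetUp]
    public void CreateTemporaryDirectory()
    {
        // A fresh, uniquely named directory per test keeps tests independent.
        tempDir = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        Directory.CreateDirectory(tempDir);
    }

    [TearDown]
    public void DeleteTemporaryDirectory()
    {
        if (Directory.Exists(tempDir))
            Directory.Delete(tempDir, true);
    }

    [Test]
    public void CreatesTheExpectedFile()
    {
        var path = Path.Combine(tempDir, "output.txt");

        File.WriteAllText(path, "hello");   // stand-in for the code under test

        Assert.IsTrue(File.Exists(path));
    }
}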
That said, most file writing code takes this form:
- Open an output stream to the file. This code must cope with missing files or file-permission problems. You will want to test that it opens the file and copes with those failure conditions.
- Write to the output stream. This must write in the correct format, which is the most complicated part requiring the most testing. It must cope with an I/O error while writing to the output stream, although those errors are rare in practice.
- Close the output stream. This must cope with an I/O error while closing the stream, although those errors are rare in practice.
Only the first step really deals with the file system; the rest just use an output stream.
So you could extract the middle and last parts into their own method (function) that manipulates a given output stream rather than a named file. You can then mock that output stream to unit test the method. Those unit tests will be very fast. Most programming languages already provide a suitable output stream class.
That leaves only the first part to be unit tested. You will need only a few tests, so your test suite should still be acceptably fast.
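A sketch of that split, using a hypothetical ReportWriter: only the thin overload that takes a file path touches the file system, while the method that takes a TextWriter holds the formatting logic and is unit tested against an in-memory writer.
using System.Collections.Generic;
using System.IO;
using NUnit.Framework;

public class ReportWriter
{
    // Step 1 (and 3): open and close the file - the only file-system code.
    public void Save(string filePath, IEnumerable<string> lines)
    {
        using (var writer = new StreamWriter(filePath))
        {
            Write(writer, lines);
        }
    }

    // Step 2: write in the correct format to any TextWriter - the part
    // that needs most of the testing.
    public void Write(TextWriter writer, IEnumerable<string> lines)
    {
        foreach (var line in lines)
            writer.WriteLine(line);
    }
}

[TestFixture]
public class ReportWriterTests
{
    [Test]
    public void WritesEachLineToTheOutput()
    {
        var output = new StringWriter();   // in memory, no disk access

        new ReportWriter().Write(output, new[] { "first", "second" });

        StringAssert.Contains("first", output.ToString());
        StringAssert.Contains("second", output.ToString());
    }
}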

- See also http://stackoverflow.com/questions/225073/testing-whats-written-to-a-java-outputstream – Raedwald Jul 05 '14 at 13:32
Instead of passing a filename to your save function, you could pass a Stream, TextWriter, or similar. Then, when testing, you can pass a memory-based implementation and verify that the correct bytes are written without actually writing anything to disk.
To test problems and exceptions, you could take a look at a mocking framework. It can help you artificially generate a specific exception at a certain point in the save process and test that your code handles it appropriately.
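For instance, with Moq you can hand your save code a Stream whose Write method throws, and assert that the failure is handled (a sketch, assuming Moq/NUnit using directives; BookSaver and BookSaveException are hypothetical names standing in for your own code):
[Test]
public void SaveTo_ReportsAnErrorWhenTheStreamFails()
{
    var stream = new Mock<Stream>();
    stream.SetupGet(s => s.CanWrite).Returns(true);
    stream.Setup(s => s.Write(It.IsAny<byte[]>(), It.IsAny<int>(), It.IsAny<int>()))
          .Throws(new IOException("disk full"));

    var saver = new BookSaver();

    // The hypothetical BookSaver is expected to catch the IOException
    // and translate it into its own, more meaningful exception type.
    Assert.Throws<BookSaveException>(
        () => saver.SaveTo(stream.Object, "some content"));
}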

- I wonder if @chester89 also wants tests to handle `invalid path`, `file collision`, `locked file`, `invalid filename`, `save interruption`, etc. I suppose all of these could be mimicked as well. – scunliffe Aug 01 '10 at 11:42
- @scunliffe, yes, I'd like to test those things too. How can I mimic that behaviour? – chester89 Aug 01 '10 at 11:48
- @chester: You could take a look at a mocking framework. It can help you generate these exceptions at the appropriate times. Be careful to think before going down the path of testing every single possible error condition and aiming for 100% code coverage from tests; it is often more expensive than it is worth. – Mark Byers Aug 01 '10 at 11:58
- "Before going down the path of testing every single possible error condition and having 100% code coverage from tests. This is often more expensive than it is worth" – can you please explain why? I don't agree with you; especially in this case, it is worth going for 100% coverage. chester89 doesn't necessarily have to use a mocking framework; he can also create a mock by hand. – P.K Aug 01 '10 at 12:59
- @tugga: You have to weigh the benefit of testing every case against the development cost. If you use existing file-selector dialogs, then most of the problems with invalid paths are already handled correctly by the component. If you are paranoid, you can imagine situations where the file selector returns an invalid file name, but those situations are very rare and are generally bugs in the framework. If you spend 90% of your development time testing for things that will rarely happen, you only have 10% left to spend on developing features that people will pay for. – Mark Byers Aug 01 '10 at 13:09