I spent some time working in a quite agile, test-focussed environment, but lately I've been in organisations that like the idea of testing but don't really practice it.
Testing helps in two ways: it proves that something works, and it forces you to structure your code so it can be tested (more modular, less coupled), which makes it easier to understand and maintain.
"Pure" test-driven development is a big step, and the existing code base may make it very difficult. So start small, pick the battle fronts where you can win, refactor, and organise your tests. Use integration tests, but keep them focussed and well structured.
Tests vs time
The standard argument is "Writing tests takes lots of time so only do it if you have time at the end".
Initially it takes time, but unless your code is bug-free first time, the iterations of manual testing and fixing may take more time. So it's (time to code an automated test) vs (time to test manually). The latter adds up over the lifetime of the software, and the major win of automated tests is finding regressions. Assess the cost-benefit; it doesn't have to be 100%-coverage full TDD vs nothing.
"You have to F5 and manually use your application anyway".
True, but automated tests can cover all the permutations you don't do in manual tests.
TDD vs Visual Studio
Autocomplete makes test-before-code difficult. The consume-first mode in Visual Studio 2010 looks like it will help. Until now, the closest to real TDD that I've done involved stubbing out the classes and methods before writing the tests.
TDD is more than testing of course. It's supposed to force you to design simple loosely coupled components. It certainly makes you focus on your API.
Tests vs refactoring
Refactoring tools (ReSharper, CodeRush) mitigate some of this, but most major refactorings will have a big impact on a large bank of tests. So yes, many of your tests may be thrown away and you start again. The solution is to keep the tests well structured and named (yes, easier said than done).
Unit tests structure and readability
Things I've found helpful:
- the test project and its subject should look identical: one test class per class. If the test class is too big, that's a good indication that the class is doing too much, but you can split the test class up with partial classes.
- TDD guides suggest you don't test private methods, and so far this has worked well for me. I don't use the MSTest private accessors, although very rarely I will write a little reflection code to set a private field rather than use the public property. Internal methods and InternalsVisibleTo are very useful.
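A minimal sketch of the InternalsVisibleTo approach mentioned above — the test assembly name here is hypothetical, and the attribute usually goes in the production project's AssemblyInfo.cs:

```csharp
// AssemblyInfo.cs of the production assembly.
// "MyApp.Tests" is a hypothetical test assembly name.
using System.Runtime.CompilerServices;

[assembly: InternalsVisibleTo("MyApp.Tests")]
```

With this in place, internal classes and methods can be called directly from the test project without being made public.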
- Inside the test method, "arrange / act / assert" comments split things up.
- Underscore-delimited words in test method names are more readable than Pascal case.
- Test methods have to be readable rather than elegant code: avoid even simple refactorings (extract method) that hide what's going on, and use lots of cut-and-paste code.
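These conventions might look like the following MSTest sketch — the Account class and its behaviour are hypothetical, invented purely to illustrate the naming and comment style:

```csharp
using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical class under test.
public class Account
{
    public decimal Balance { get; private set; }
    public Account(decimal balance) { Balance = balance; }

    public void Withdraw(decimal amount)
    {
        if (amount > Balance)
            throw new InvalidOperationException("Insufficient funds");
        Balance -= amount;
    }
}

[TestClass]
public class AccountTests
{
    [TestMethod]
    public void Withdraw_reduces_the_balance()   // underscore-delimited name
    {
        // arrange
        var account = new Account(100m);

        // act
        account.Withdraw(30m);

        // assert
        Assert.AreEqual(70m, account.Balance);
    }
}
```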
Unit tests vs Integration tests
Developers often have the idea that they should only write unit tests, and that integration tests are a really bad thing. Of course you can and should refactor to use interfaces with DI/mocks so code can be unit tested without its dependencies.
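A minimal sketch of the interface-plus-DI refactoring — all the type names here are hypothetical, and a hand-rolled stub stands in for a mocking framework:

```csharp
// Hypothetical repository abstracted behind an interface so the
// service can be unit tested without touching a database.
public interface ICustomerRepository
{
    string GetName(int id);
}

public class GreetingService
{
    private readonly ICustomerRepository _repository;

    // The dependency is injected via the constructor.
    public GreetingService(ICustomerRepository repository)
    {
        _repository = repository;
    }

    public string Greet(int id)
    {
        return "Hello " + _repository.GetName(id);
    }
}

// In the test project: a trivial stub replaces the real data access.
public class StubCustomerRepository : ICustomerRepository
{
    public string GetName(int id) { return "Alice"; }
}

// new GreetingService(new StubCustomerRepository()).Greet(1)
// returns "Hello Alice" without any database involved.
```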
But data access and UI (at least the non-MVC flavours) are by definition mostly integration, and not easily testable. If you can cover them with integration tests, you often have the most interesting and useful tests (unless you love manual testing). All those individual classes are pretty dumb, and unit testing them doesn't reveal as much.
Integration tests tend to be much longer and more complicated: there's almost always setup (ClassInitialize or TestInitialize) and teardown/cleanup. But structure them well and, after writing the initial setup and the first test, the other tests just follow the same pattern and become very easy.
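The setup/teardown skeleton in MSTest might look like this — the class name and the work done in each method are hypothetical placeholders:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CustomerPageTests   // hypothetical integration test class
{
    [TestInitialize]
    public void Setup()
    {
        // runs before every test: open a connection,
        // launch the browser, seed test data, etc.
    }

    [TestCleanup]
    public void Teardown()
    {
        // runs after every test: dispose resources,
        // roll back data, close the browser, etc.
    }

    [TestMethod]
    public void Saving_a_customer_shows_a_confirmation()
    {
        // the test itself stays short; the plumbing lives above
    }
}
```

Once this skeleton exists, each new test is just another short [TestMethod] following the same pattern.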
Integration tests tips
Break it down into small tests. The temptation is to do an end-to-end test (login, navigate, create record, find record, edit record, delete record, log off). To start, one test = one operation/form/page (at most). For UI you probably have to start with launch app/login, but stop there: other tests can then call the login and carry on. If you change the login, one test changes and the others are unaffected.
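One way to sketch that "call the login and carry on" pattern — everything here is hypothetical, the point is only that login logic lives in one shared method:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class OrderPageTests   // hypothetical
{
    // One place to change if the login flow changes.
    private void Login()
    {
        // launch app / navigate to login page / enter credentials
    }

    [TestMethod]
    public void Login_reaches_the_home_page()
    {
        Login();
        // assert on the landing page only; stop here
    }

    [TestMethod]
    public void Can_create_an_order()
    {
        Login();          // reuse, don't re-test
        // exercise one operation/form/page, no more
    }
}
```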
Database tests: a local SqlExpress database file in the test project is a neat testing target, if you can keep it up to date.
Database tests: you can wrap tests in transactions that are never committed (and are thus rolled back).
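A sketch of the rollback pattern using TransactionScope from System.Transactions (the test class and what it inserts are hypothetical; disposing a scope without calling Complete rolls the work back):

```csharp
using System.Transactions;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class RepositoryTests   // hypothetical
{
    private TransactionScope _scope;

    [TestInitialize]
    public void Setup()
    {
        // database work done during the test enlists in this transaction
        _scope = new TransactionScope();
    }

    [TestCleanup]
    public void Teardown()
    {
        // Complete() is never called, so Dispose() rolls everything back
        _scope.Dispose();
    }

    [TestMethod]
    public void Insert_writes_a_row()
    {
        // inserts are visible to queries inside the transaction,
        // then disappear when the scope is disposed
    }
}
```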
Database tests: Linq2Sql is helpful to verify your data access is doing the right thing.
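For instance, a Linq2Sql verification might look like this fragment — MyDataContext and its Customers table are assumed names for the designer-generated classes, not real API:

```csharp
// Inside a test method, after the code under test has inserted a customer.
// MyDataContext and Customers are hypothetical Linq2Sql-generated names.
using (var db = new MyDataContext())
{
    bool saved = db.Customers.Any(c => c.Name == "Alice");
    Assert.IsTrue(saved, "Expected the customer row to exist");
}
```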
Web UI tests: for ASP.NET you should use WatiN (a framework to drive IE, and now Firefox, from tests). You really can write the test first, then code the web page, in true TDD fashion (I did!). Visual Studio 2010 coded UI tests work similarly, with a nice recorder: not just IE, but Firefox, WinForms and WPF (Chrome/Safari/IE6 aren't supported, and amazingly Silverlight isn't either). Telerik has a similar web-test UI studio (IE, FF, Safari and Silverlight), but it isn't cheap.
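A minimal WatiN sketch of such a test — the URL, element ids and expected text are all hypothetical:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;
using WatiN.Core;

[TestClass]
public class LoginPageTests   // hypothetical
{
    [TestMethod]
    public void Valid_credentials_reach_the_home_page()
    {
        // URL and element ids below are made up for illustration
        using (var browser = new IE("http://localhost/MyApp/Login.aspx"))
        {
            browser.TextField(Find.ById("UserName")).TypeText("test");
            browser.TextField(Find.ById("Password")).TypeText("secret");
            browser.Button(Find.ById("LoginButton")).Click();

            Assert.IsTrue(browser.ContainsText("Welcome"));
        }
    }
}
```

One gotcha worth knowing: WatiN drives IE over COM, so the test thread generally has to run in STA mode (the WatiN documentation covers how to configure this for your test runner).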
File IO: in MSTest this is fairly easy using the deployment directories, which are temporary directories created for each test run (under the solution-level Test Results folder). By default, deployment is enabled (see LocalTestRun.testrunconfig). Environment.CurrentDirectory (and TestContext.TestDeploymentDir) have the full path. Test assembly content files marked Copy to Output Directory = Copy always do not get deployed (only the dlls do), so add the attribute [DeploymentItem("Data.xml")].
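Putting that together, a deployment-directory test might look like this — Data.xml stands for whatever content file your test project carries:

```csharp
using System.IO;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ImportTests   // hypothetical
{
    // MSTest populates this property automatically for each run.
    public TestContext TestContext { get; set; }

    [TestMethod]
    [DeploymentItem("Data.xml")]   // copies the file into the per-run deployment dir
    public void Data_file_is_deployed()
    {
        var path = Path.Combine(TestContext.TestDeploymentDir, "Data.xml");
        Assert.IsTrue(File.Exists(path));
    }
}
```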