I'm a big fan of unit testing as a bootstrap for running code in complicated systems.
Having a unit test framework allows you to run code that would otherwise require complicated deployment.
It's also a good exercise in finding out just how much of the system you're pulling along - how much configuration you have to do, how many dependencies you need to satisfy.
I usually end up using the framework for tests that would not qualify as unit tests: Tests start at the bottom and build up incrementally.
Ideally, from unit to system tests you get a pyramid: Lots of very fast unit tests at the base, covering a lot of ground. Composite tests higher up exercise your integration. The higher you go, the more complicated your infrastructure gets, but the number of tests should decrease, based on the confidence that the simple, fast tests give you.
And speed is where Visual Studio falls down, breaks its neck and asks you to pay the bill.
I have a 1 GB RAM VM running Windows XP with Visual Studio 2008. A single unit test for a single method takes between 11 and 18 seconds to complete.
The actual tests require milliseconds to complete. The rest of the time is - I guess - spent hooking up the whole VS apparatus.
Using mstest on the command line cuts the execution time in half but triples the hassle.
(Only to be pedantic: the complete rutema test suite - unit tests and system tests - runs on the same VM in a fraction of that time, about 7 seconds, including any interpreter startup times.)
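To give a rough sense of the speed a plain interpreter gives you, here is a minimal Ruby test case in the style of a rutema-era suite - it is an illustration I made up, not actual rutema code, and it uses Minitest, the framework bundled with current Ruby. On any recent machine the whole run, interpreter startup included, finishes in well under a second:

```ruby
require 'minitest/autorun'

# Deliberately trivial tests: the point is not what they check,
# but that the entire run - startup, discovery, execution,
# reporting - completes in a fraction of a second.
class ArithmeticTest < Minitest::Test
  def test_addition
    assert_equal 4, 2 + 2
  end

  def test_string_concatenation
    assert_equal 'ab', 'a' + 'b'
  end
end
```

Running it with `ruby arithmetic_test.rb` reports two tests and their timing; the per-test cost is in the millisecond range, which is the kind of feedback loop that keeps people actually writing tests.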
This kind of performance completely defeats the purpose of unit testing: people will simply not write tests if it takes that long to run them, your technical debt increases, and you find yourself with software that will not run.
Microsoft really needs to address this point.