Articles tagged with: Testing

Windows Live Agents Testing: Introduction

Update: Parts II and III of this post are available.

Developing a Windows Live Agent is much like developing any software solution. Your team creates code, you test it, the customer tests it from the user side, you fix bugs and add functionality, and the cycle goes on until you're finished.

Although the new 5.0 SDK will feature full Visual Studio integration (and with that, Team Foundation Server and/or Visual SourceSafe integration), the current version is more oriented towards single-user development.

After working for more than a year developing WLAs, we at ilitia have built a small infrastructure (mostly virtualized, using virtual machines). Right now our WLA department works as shown in this diagram:

Each developer works with a local copy of the WLA (using Visual SourceSafe to integrate changes) and has a local testing Microsoft Passport-enabled account.

We have a "WLA Server" to which we upload the latest working build at least once per week. It runs under another account that we give to our customer from the second or third week of development, so they can start testing the agent early and giving feedback (agile development and all that stuff ;), using just their Windows Live Messenger clients.

When feasible, we create mocks of the customer's Web Services (sometimes they're too complex, or the customer already has a testing WS), as well as the Activity Window web pages, and host them too until the Activity is provisioned in Messenger.

We also have a private WLA intranet with our own tutorials, development guidelines and a repository for BuddyScript modules, documents, testing account/password pairs and the like.

One feature we're still missing is the automatic launching of unit test batteries, as a Continuous Integration server would do, but we have had unit testing in place for two months or so. If you want some information about the platform's default testing capabilities, you can check the official posts on the WLA team's blog.

The default system was somewhat lacking in advanced features and testing options, and coming from the NUnit testing world, I decided to write a new unit testing framework from scratch (although I used some of the original framework as a basis, it has been heavily modified and improved).

I won't show any code until the next post, but here is a sample of the output of a passing battery:

-----------------------
Begin testing: TestUtilitiesAddon -> TestExCompareQueryArray() Test Battery
Test: "TestExCompareQueryArray"
Test: "TestExCompareQueryArrayNotEqual"
Test: "TestExCompareQueryArrayNotEqual"
Test: "TestExCompareQueryArray"
Done testing TestUtilitiesAddon -> TestExCompareQueryArray() Test Battery
Total tests: 4 Passed tests: 4 Failed tests: 0
Test battery passed
-----------------------

(As you might notice, I've used the framework to test its own testing/assert methods ;)

A sample output of a failed test could look like this:

-----------------------
Begin testing: TestUtilitiesAddon -> TestExCompareObjects() Test Battery
Test: "TestExCompareObjects"

Error: "not object #2" failed.
Returned: "not object #2"
"not object #2" not an object
-----------------------
Done testing TestUtilitiesAddon -> TestExCompareObjects() Test Battery
Total tests: 1 Passed tests: 0 Failed tests: 1
Test battery failed
-----------------------

In the next post I will show the basics of building a testing framework and how to test both variable values and objects.


To mock or not to mock, that is the question

I'm currently in the final weeks of the first phase of a project. We're very tight on schedule and are having to leave a few features for the next phase. The project may still end up delayed because of the client, but if it is, we want it to be because of them, not us.

One thing I've done in this project is use mock objects to simulate an important part of the architecture: a web service we have to call to perform some of the business logic. This web service has to be provisioned to us by the client, and because it is a big company, they seem to have delays and lots of approvals before anything important gets done.

So we are nearly 3 weeks from the deadline and we still don't have access to that web service, to the real source. If it weren't for the mock WS I've implemented, we couldn't finish on schedule: I couldn't have coded nearly 50-60% of the really important logic the application needs.

We have frequent meetings with the client (we're doing XP, and specifically Scrum), and they gave us the WSDL and the specification of how the data from the WS will come, so I've mocked it (using a simple ASP.NET Web Service), and so far we haven't had any problems coding the business logic.

And not only that. Since I control the mock, I've added some extra commands that enable and disable, per specific WS call, things like simulated timeouts, malformed data, missing data and other failure modes. I've written tests inside the code that do things like the following (see the sketch after this list):

  1. Activate timeout
  2. Call the WS
  3. Test if everything went ok and the timeout was catched by the code
  4. Deactivate timeout
  5. Call the WS again
  6. Test if this time data was gathered without problems
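
Just as a rough illustration, here is a minimal sketch (in C#, since the mock is a simple ASP.NET Web Service) of what such a mock with a "test control" command could look like. All the names here (MockCustomerService, GetCustomerData and so on) are made up for the example, not the project's real code:

  using System;
  using System.Threading;
  using System.Web.Services;

  // Hypothetical mock of the client's web service. The extra control method
  // lets a test switch the simulated timeout on and off between calls.
  [WebService(Namespace = "http://tempuri.org/mock")]
  public class MockCustomerService : WebService
  {
      private static bool simulateTimeout;

      [WebMethod]
      public void SetSimulateTimeout(bool enabled)
      {
          simulateTimeout = enabled;
      }

      [WebMethod]
      public string GetCustomerData(string customerId)
      {
          if (simulateTimeout)
          {
              // Sleep longer than the caller's client-side timeout so the
              // consuming code sees a timeout (WebException).
              Thread.Sleep(30000);
          }

          // Canned data matching the agreed WSDL/specification.
          return "<customer id=\"" + customerId + "\">test data</customer>";
      }
  }

And a test following the six steps above could look roughly like this (MockCustomerServiceProxy stands for the proxy class that "Add Web Reference" would generate against the mock; in the real project the timeout is caught by the business logic rather than by the test itself):

  using System.Net;
  using NUnit.Framework;

  [TestFixture]
  public class CustomerWsTests
  {
      [Test]
      public void TimeoutIsHandledAndThenRecovers()
      {
          MockCustomerServiceProxy ws = new MockCustomerServiceProxy();
          ws.Timeout = 5000;                       // shorter than the simulated delay

          ws.SetSimulateTimeout(true);             // 1. activate the timeout
          bool timedOut = false;
          try
          {
              ws.GetCustomerData("42");            // 2. call the WS
          }
          catch (WebException)
          {
              timedOut = true;
          }
          Assert.IsTrue(timedOut);                 // 3. the timeout was caught

          ws.SetSimulateTimeout(false);            // 4. deactivate the timeout
          string data = ws.GetCustomerData("42");  // 5. call the WS again
          Assert.IsNotNull(data);                  // 6. data gathered without problems
      }
  }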

While mock objects don't always have to be used and are not "the ultimate salvation", sometimes they become really, really handy, and in my case, critical to meeting all the project's requirements (and the deadline, I hope ;)

Weigh not just "the extra development time" they cost, but also the extras they can bring: better and deeper tests, fast and non-intrusive simulations, and fewer dependencies on external sources.


NUnit 2.4.1 & WatiN 1.1.0

Summary: NUnit 2.4.1 has a new constraint-based assert model that reads more like natural language. WatiN 1.1.0 has been released and can be integrated with NUnit for web application testing.
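
For reference, a minimal sketch of both, assuming the Is/Has syntax helpers live in NUnit.Framework.SyntaxHelpers in the 2.4.x line; the WatiN part follows its own Google-search getting-started example (the q and btnG element names come from that example, and WatiN needs the test thread running in STA mode):

  using NUnit.Framework;
  using NUnit.Framework.SyntaxHelpers;   // Is, Has, Text helpers in NUnit 2.4.x
  using WatiN.Core;

  [TestFixture]
  public class ConstraintAndWatiNExamples
  {
      [Test]
      public void ConstraintBasedAsserts()
      {
          int result = 2 + 2;

          Assert.AreEqual(4, result);              // classic model
          Assert.That(result, Is.EqualTo(4));      // new constraint-based model
          Assert.That(result, Is.GreaterThan(3));  // reads close to natural language
      }

      [Test]
      public void GoogleSearchWithWatiN()
      {
          // WatiN drives a real Internet Explorer instance; the test thread
          // has to run in STA mode (set in the NUnit config file).
          using (IE ie = new IE("http://www.google.com"))
          {
              ie.TextField(Find.ByName("q")).TypeText("WatiN");
              ie.Button(Find.ByName("btnG")).Click();

              Assert.That(ie.ContainsText("WatiN"), Is.True);
          }
      }
  }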


VS2005 Unit Tests and copying files

Sometimes you need to copy some files to the output folder before executing your application from Visual Studio. This isn't a problem, unless you're going to execute a VS2005 unit test battery, because VS2005 creates a custom output folder before launching the test battery, so it's not easy to get, for example, an XML file there.

This is an example of how a test output folder looks:
Example output folder

In the Test menu -> Edit Test Run Configurations -> (your .testrunconfig file)
Output folder configuration

Select the Deployment item, and there we are: all the files and directories added there will be automatically copied before executing the tests.
Output folder configuration
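
As an alternative to the .testrunconfig Deployment list, MSTest also offers a DeploymentItem attribute that copies a file into the test run folder on a per-test basis. A minimal sketch (the TestData\settings.xml path is just a placeholder for this example):

  using System.IO;
  using Microsoft.VisualStudio.TestTools.UnitTesting;

  [TestClass]
  public class SettingsFileTests
  {
      // DeploymentItem copies the file into the test run's output folder,
      // so the test can read it from its working directory.
      // TestData\settings.xml is just an example path.
      [TestMethod]
      [DeploymentItem(@"TestData\settings.xml")]
      public void SettingsFileIsDeployed()
      {
          Assert.IsTrue(File.Exists("settings.xml"));
      }
  }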

Many thanks to Sibille for pointing me to the original source of this.


TDD and Testing

Summary: Discusses various articles related to Test Driven Development and testing, including an article about unit testing in Ruby, an explanation of mock objects in Java, an article about the pros and cons of white and black box testing, and an article about TDD Anti-Patterns. The post also mentions the release of NUnit 2.2.9.