The idea for this post, and part of its contents, come from the fantastic book The Pragmatic Programmer.
We don't usually test for most of the following resources:
- Anti-virus software: Some of them have aggressive heuristics that can confuse binary socket communication with a trojan.
- Color depth: How does your website look in 256 colors? And on a mobile device (if supported)?
- CPU bandwidth: Slowness, long response wait times, overly CPU-intensive calculations.
- Disk bandwidth: Similar to CPU bandwidth, but triggered only when performing I/O.
- Disk space: Running out of disk space (or quota) while writing any file.
- Fonts and DPI: Does your application support custom DPI schemes (for example, bigger characters)? Have you tried launching your application on a Windows installation without the font you use?
- Memory: OutOfMemory exceptions in .NET, the garbage collector running very slowly or being called too often...
- Network bandwidth: Timeouts in web services, slowness during heavy network I/O...
- Power shortages: Unplug the AC adaptor of your desktop PC while testing the application, then launch it again. Does it work, or at least recover from the sudden shutdown?
- Restrictive ACLs and Account Policies: Don't assume your application will be able to write to the Windows registry; test for it (at launch, at least, if you're worried about performance hits).
- Video/Screen resolution: Have you tried your WinForms application at a 640x480 resolution? And on one of the now-widespread netbooks (which often have odd resolutions like 1024x600)?
- Wall clock time: Human perception of elapsed time, versus the time a computer takes to complete a task. Don't take automated tests as "the whole truth"; human testing of applications is always needed.
An example of the wall clock time problem is pre-Service Pack 1 Vista's file copy, which was in theory faster and better, but for a typical user actually took longer to copy the same file than XP did (more info about it can be found here).
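The gap between what the machine does and what the user waits through can be measured directly. A small Python sketch comparing wall-clock time against CPU time, with a sleep standing in for I/O-bound work:

```python
import time

def timed(fn, *args):
    """Run fn(*args), returning its result plus wall-clock and CPU seconds."""
    wall_start, cpu_start = time.monotonic(), time.process_time()
    result = fn(*args)
    wall = time.monotonic() - wall_start
    cpu = time.process_time() - cpu_start
    return result, wall, cpu

# I/O-bound work: the wall clock keeps ticking while the CPU sits idle.
_, wall, cpu = timed(time.sleep, 0.2)
# wall is roughly 0.2s; cpu is near zero, yet the user waited the full time.
```

A profiler that only reports CPU time would call this operation nearly free, which is exactly the trap: the user's perception tracks the wall clock, not the CPU counter.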