I have never been particularly fond of the ubiquitous phrase, “where the rubber meets the road.” However, if ever there were a situation to which that phrase applies, it is user acceptance testing (UAT). After all the planning, designing, meetings, reviews and internal testing, there is one final act before the buyer (or internal client) accepts the software or solution. Yes, you and your team have worked for months to perfect it, but the real gate for saying it is ready is when it is tested by real users on their own system.
Bad User Acceptance Testing Strategies
Waiting until the end of development to think about user testing: NO. Start early. Plan the user testing from the beginning of the project and build it into the schedule. Do NOT fall victim to the strategy of, “we’ll rely on the results of system testing and skip user testing to catch up with a delayed schedule.” Those of us with many years of experience can tell you that you must build in testing, and most certainly testing with actual users, if you want the system to work in the production environment.
Not allowing plenty of time for user testing: In a report from Sentry (and I paraphrase), they suggest the following: assume you have 120 test items, such as producing a specific report, scanning 10 batches of 30 documents each into a database, or generating an export file of 600,000 records. Further, assume it takes about 3 hours to set up, run, collect data on and analyze each test cycle. Do the math: the total comes to roughly 360 hours. So how many projects build that into their schedule? The ones that do, I will bet, find lots of real-world issues and work them out before deployment.
Your project’s testing requirements will differ from any other project’s because your product and situation are unique. The point, however, is that meaningful and useful user testing takes time and planning. You might even do something novel like including users up front in planning what testing and acceptance criteria they will use!
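To make that point concrete, here is a rough back-of-the-envelope sketch in Python of the kind of estimate described above. The item count, hours per cycle and working-day length are all illustrative assumptions; plug in your own project’s numbers.

```python
# Rough UAT effort estimate, mirroring the example figures above.
# All numbers are illustrative assumptions; substitute your own project's values.

test_items = 120        # e.g., reports to produce, batches to scan, export files to generate
hours_per_item = 3      # set up, run, collect data on and analyze one test cycle

total_hours = test_items * hours_per_item
person_days = total_hours / 8        # assuming an 8-hour working day

print(f"Estimated UAT effort: {total_hours} hours (~{person_days:.0f} person-days)")
# -> Estimated UAT effort: 360 hours (~45 person-days)
```

Even a crude calculation like this is often enough to show that UAT cannot be squeezed into the last week of the schedule.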
Here’s another failing user testing strategy from our friend, Dilbert.
OK, so enough bad strategies; how about a few tips on what to do right:
Good Advice on User Acceptance Testing:
- Include user-testing requirements in the original proposal, plan, cost and schedule. Use input from your business analyst or project team, including customer representatives, to generate typical scenarios and scripts for user testing.
- Create a detailed plan of what the UAT is testing. You do not want a bug report that says, “The system failed to turn straw into gold.” User testing should be done based on agreed requirements, not everything a user may want the system to do. In other words, bound the scope of the testing to be in line with the solution.
- Select test subjects (let’s call them users) with a range of experience and diversity of backgrounds, following pre-defined scripts to get from task beginning to task completion. Do not pick all junior or all experienced users; a mix is best.
- Train your users (test subjects) in how to report problems. “User Acceptance Testing War Stories – Writing a Perfect Bug” suggests that testers write down the steps they followed that led to the failure, then wait a few hours and try to follow the steps as written to see if the bug appears again. You are teaching testers how to find bugs and report them with sufficient detail for developers to repair them. Use an online tool if possible; it makes tracking easier. Even a SharePoint list can be used to track the items that come out of the UAT (a minimal sketch of such an issue record appears after this list).
- Maintain detailed reports of testing, including procedures and outcomes, and communicate findings to the team, management and clients. Depending on the timing of the UAT, you may want to submit status reports daily or weekly. Besides a cursory pass/fail count, add some analytical information about causes or commonalities.
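As referenced in the bug-reporting tip above, here is a minimal sketch of what a UAT issue record and a simple status roll-up might look like if you track findings yourself instead of (or alongside) a SharePoint list. The field names and sample entries are illustrative assumptions, not a prescribed schema.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class UatIssue:
    """One UAT finding, with enough detail for a developer to reproduce it."""
    issue_id: int
    tester: str
    scenario: str                  # which agreed requirement or script was being exercised
    steps_to_reproduce: list[str]  # the exact steps the tester followed
    expected: str
    actual: str
    status: str = "open"           # open / fixed / retest / closed
    suspected_cause: str = "unknown"

def status_summary(issues: list[UatIssue]) -> tuple[Counter, Counter]:
    """Roll up counts by status and by suspected cause for the daily/weekly status report."""
    return (Counter(i.status for i in issues),
            Counter(i.suspected_cause for i in issues))

# Example usage with made-up entries:
log = [
    UatIssue(1, "tester_a", "Monthly sales report",
             ["Open Reports", "Select 'Monthly'", "Click Run"],
             expected="PDF report generated", actual="Timeout after 60 seconds",
             suspected_cause="report query performance"),
    UatIssue(2, "tester_b", "Scan batch of 30 documents",
             ["Load batch", "Click Scan", "Review results"],
             expected="30 records created", actual="28 records created",
             suspected_cause="duplicate detection"),
]
by_status, by_cause = status_summary(log)
print(by_status, by_cause)
```

Whatever tool you use, the important parts are the same: reproducible steps, expected versus actual behavior, and a roll-up you can put straight into the status report.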
Remember that user testing is about making sure the system does what the customer said was needed to help them do their job. If the specifications and earlier plans, all agreed to by the parties involved, resulted in a product that does not support the users’ needs, document the revised needs and begin the process of negotiating the follow-on contract.
Best Practices and Lessons Learned:
This is one of my favorite topics, and I have made almost every mistake, from cutting out user testing to letting users test things not included in the original scope and requirements. Here are a couple of lessons learned that I hope help you avoid the same mistakes:
- Get the client/stakeholders involved at the beginning to define what the testing will need to be. It is like having the students help write the test – they can’t say it was a bad test.
- Balance the formality of the UAT with the scope and length of the project and solution. I have seen too much formality for a small system acceptance and not enough rigor for a large-scale solution being deployed to thousands of users.
- Don’t start out user testing or UAT with negative points; in other words, don’t sound like a lawyer! If you have done everything right, including getting the users to help up front, then this is an exciting time when they get to try out the system before everyone else, kind of like test-driving the new model car before anyone else.
So what are your tips or experiences in UAT? I am sure others would love to have you share.
Thanks for sharing.
July 15, 2011 at 8:29 am
Great topic. Early involvement by users is essential. It builds a sense of ownership with the users and makes them part of the project team. Waiting until the end to include the users is a recipe for disaster.
July 23, 2011 at 11:36 am
I agree with your comment 100%. In fact, the point of “Agile” methods that I like IS the involvement of users early and often!