A Harrowing Tale of User Acceptance Testing

My first encounter with user acceptance testing will scare you straight.

Recently, I wrote an article about user acceptance testing (UAT) and was reminded of my very first encounter with validation testing on a software project. Here is that story as I remember it.

WARNING: You may want to skip this story if the thought of wasted money makes you feel faint, or if you are severely allergic to anecdotes that sort-of fizzle out at the end.

A story of high-stakes validation

Long ago, as a fresh-faced college graduate, I was assigned to a software project for a big international company that built engines. The company was developing a new Windows application to replace the text-only PC tool they were currently using to calibrate and test their truck engines.

I joined the project relatively late in their long software development cycle. Most of my teammates had been on the project a little over a year, others much longer. We were mere months away from release, and I had been added to the project to help the team close out development on time.

In one of our weekly team meetings, the project manager detailed what the final stages of development and testing would look like: we would finish the development scope, support functional testing, and fix bugs as needed. Then the application would go to validation testing. After that, a decision would come: release the application, or shelve it.

Over the next few minutes, it became clear to me what “shelve it” actually meant. It did NOT mean “set it on a shelf for a while and then we’ll release it when the time is right.”

No, this was “shelve it” in the Raiders of the Lost Ark sense: box it up and shove it in a warehouse, never to be seen again.

Wait, what?

So… after several years of effort and tons of money spent (I didn’t know if it was millions, but it was at least hundreds of thousands) we’re going to finish the software… and then maybe just toss it out?

I remained astounded and baffled for the remainder of the meeting as contingency planning began: how to responsibly document, package up, and store project materials if the decision was indeed to “shelve it”. Budget was set aside for a skeleton crew to handle the arrangements if the time came. (All of these plans were discussed with a calmness I could barely fathom at the time, but would later recognize as soul-crushed resignation.)

After the meeting, I asked a manager what he thought the chances were that the product would be shelved. I expected him to say something like 15 or 20 percent—a sort-of-low but still disturbing number.

He shrugged and said, “like 50-50.”

Noticing my reaction to this, he told me it would be fine. All we could do was finish the project, just as we had planned, and then see how it fared. He then explained to me why our new application might or might not be “shelved”.

Why we might throw away functioning software we just spent several years building

The answer was validation testing.

After the application was complete and had passed functional testing (verification) with flying colors, it would be given to a subset of real users in the company to use. You may know this phase as beta testing, or user acceptance testing.

If validation testing didn’t produce positive enough results, it would mean that the product wasn’t doing a good enough job fulfilling its purpose and meeting the needs of the actual users for whom it was created. This would indicate one or more fundamental flaws in the application, and the company had already decided it would pull the plug if that were the case. And for good reasons:

  1. The cost of rolling out and maintaining a new tool was significant. Rolling out a fundamentally flawed tool, however, would be even more expensive and could negatively affect production.
  2. The new application was replacing an existing application that, while clunky and out of style, still got the job done.
  3. The software project was already overdue and over budget, at least compared to executives’ original expectations. Patience was pretty low. Pulling the plug on a failed expensive project would be unpleasant but relatively low-profile. On the other hand, a troubled product roll-out would be more expensive, highly visible, and might just cost somebody their job.
  4. Fixing one or more fundamental flaws in the application would be another long development effort, which would have roughly the same likelihood of success. (This was the era of classic waterfall development.)

So, did the product get released, or what?

Don’t hurt me, but the answer is: I’m not sure.

By the time validation testing started, I had already been assigned to a different project for a different client, so that part of my memory is haziest. The indelible effect the experience had on me, however, wouldn’t have been strengthened or lessened by either result.

But, if I’d heard that the application was killed, I’d remember it… right?

So, it very likely was released.



Is User Acceptance Testing always that terrifying?

Is validation testing always such a high-stakes all-or-nothing venture? Not really.

The particular circumstances of the company, the project, and the software development practices of the day all conspired to make the validation phase a tightrope walk in which the fate of the product teetered on the edge of oblivion.

I have to hope that was more common back then than it is today.

We certainly have more knowledge, tools, and best practices today to help us remove and mitigate much of the risk from user acceptance testing. In retrospect, it was a bit forward-thinking of the company in my story to incorporate a validation phase as part of their process, and to have a rational willingness to stop throwing good money after bad. Today we are also, hopefully, better at up-front research and goals analysis, iterative development and testing, periodic usability testing, and the like.

But let my eye-opening experience from nearly 20 years ago serve as a reminder to you: user acceptance testing is worthy of your respect, and maybe a little healthy fear. You’re testing to see whether your product or feature actually fulfills its reason for being. If it performs badly, you may not have to throw everything in the trash, but there’s going to be some discomfort. You may need to postpone a launch, insert a development sprint to fix a flawed deployment, or, yes, maybe even go back to the drawing board.

Learn about how BetaTesting can help your company launch better products with our beta testing platform and huge community of global testers.
