Whether you’re a startup or a mature corporation, you should user-test new features and UI designs. Sometimes design testing will end in failure, but you can learn from your mistakes, as well as others’. Here are some lessons learned from real-life design testing failures.
When you redesign your product, you do it to satisfy your users in a way that meets your business goals. If you fall short, it could cost you. So it pays to test your new features and UI designs with some of your users before rolling those changes out to the whole world.
You’ll find, however, that design tests often fail—meaning that your new design didn’t perform better than the old one.
Take solace in the fact that those failures happen to everyone. Take heart in the fact that there’s a lot you can learn from those failures. And take some of the lessons learned by others and use them as your own.
Here are some real-life examples of design test failures and lessons-learned from two companies: a startup you may have never heard of, and a juggernaut that you definitely have heard of.
Groove’s neutral results
Helpdesk software provider Groove wanted to increase their conversion rates: more landing page visitors signing up for the free trial, and more marketing email recipients actually opening the email.
The startup experimented with different design and content tweaks. The changes included color and copy choices on website pages and subject-line text choices on marketing emails.
The changes Groove experimented with were not random ideas. They came from conventional wisdom and from credible sources that had found legitimate success with similar changes. All sounded promising.
Groove individually tested their many design variations using A/B tests.
Most of them failed.
When failure looks like no difference at all
Failures aren’t always dramatic.
Sometimes you test out that new design you think will work better… and it works just about the same as the old design. It’s not a spectacular failure; it’s just “meh.”
Groove experienced a lot of these shoulder-shrug “failures”. Company founder Alex Turnbull straightforwardly calls them “neutral results”.
Groove’s design testing included trying out different colors on call-to-action buttons, and listing prices in different suggested forms (e.g., “$15” vs. “14.99”). Groove tested these and other types of design variations, but often found the results to be inconclusive.
When an individual test has a neutral result, it can be difficult to draw any meaningful conclusion.
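As a rough illustration of how a single test gets labeled “neutral,” here is a minimal sketch of a two-proportion z-test in Python. The numbers are made up and this is not Groove’s actual tooling or data; it simply shows how a small lift can fail to reach statistical significance.

```python
# Hypothetical A/B test numbers, for illustration only; not Groove's real data.
from math import sqrt
from statistics import NormalDist

def ab_test_verdict(conv_a, n_a, conv_b, n_b, alpha=0.05):
    """Two-proportion z-test: does variant B convert at a
    significantly different rate than variant A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test
    if p_value >= alpha:
        return f"neutral result (p = {p_value:.2f}): no meaningful difference"
    return f"significant (p = {p_value:.2f}): variant B {'wins' if p_b > p_a else 'loses'}"

# e.g., 4,000 visitors per variant; 120 vs. 131 free-trial sign-ups
print(ab_test_verdict(conv_a=120, n_a=4000, conv_b=131, n_b=4000))
```

With numbers like these, the variant converts slightly better, but the difference is well within the range of random noise, so the test reads as “neutral.”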
For Groove, lessons arose from the aggregate of their design testing efforts.
Lessons
- Expect failures. For each winning result, you’ll rack up a whole bunch of failures.
- Don’t rely on “proven” design tricks. Tactics that worked great for other companies:
  - May not apply to your particular audience
  - May not work well in your particular design
  - May have lost effectiveness over time
- Optimize your designs AFTER you’ve settled on a well-researched approach that is specific to your users.
Quote
From Groove founder Alex Turnbull:
[…] one of the most important lessons we’ve learned as a business is that big, long-term results don’t come from tactical tests; they come from doing the hard work of research, customer development and making big shifts in messaging, positioning and strategy.
Then A/B testing can help you optimize your already validated approach.
Netflix’s failed feature
Netflix had an idea for a new feature to add to their home page, one they believed would increase new user sign-ups.
The folks at Netflix are experts at A/B testing, so naturally they A/B tested their new design before rolling it out to the whole world.
It failed.
Undeterred, the product design team created variations on design and approach for the new feature. Over the course of a year they tried these variations out in four additional A/B tests.
They all failed.
What’s going on?
The feature in question: allow potential users to browse Netflix’s catalog of movie and TV offerings before signing up.
Netflix had strong reasons to believe this feature would increase user sign-ups. Users were requesting the feature in significant numbers. The design team liked the idea, too. It made a ton of sense.
But in every test, the home page with the browse feature performed worse than the home page without the feature. The sign-up rate was simply lower whenever the new feature was available.
Over time, Netflix came to understand why the browse feature failed. Along the way, they learned more about their users and more about their own product. They were hard-won lessons, but valuable ones.
Allowing potential subscribers to see what Netflix had to offer? At its core, this was still a good idea. But the “browse” feature was the wrong approach.
Part of the reason for this, the team learned, was that to really see what Netflix has to offer, you have to actually use Netflix. The customized experience of the service is the key, and that experience is not captured by impersonally browsing the media catalog.
The barrier to entry for potential users was already low: a free one-month trial. So the goal of the Netflix homepage should be to direct potential users to the free trial. The browse feature, in all of its incarnations, was actually a distraction from the conversion goal, no matter how it was designed.
In the end, a static background image that gave users an impression of Netflix’s breadth of offerings was better than a full-fledged browse feature. And directing visitors’ focus toward the free-trial sign-up was far more effective.
Lessons
- Users don’t always know what they need
- Fully understanding what makes your product experience special to your users will help you make better design decisions
- Don’t distract your users away from the goal
- Test your new design before rolling it out, even if you’re *sure* it will work
Quote
As Product Designer Anna Blaylock said about the string of tests: “The test may have failed five times, but we got smarter five times.”
Design testing: Better to try than not try
All in all, testing designs before rolling them out to all your users is wise, and relatively inexpensive:
- only a small portion of your user base sees the experimental interfaces while the tests are being run (see the sketch after this list), so any negative impact is minimal;
- despite the many likely failures, there are valuable lessons to be learned;
- not to mention the occasional “big win” that makes your efforts worth it.
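On the first point above, the “small portion” is typically achieved with deterministic bucketing: each user is hashed into either the experiment or the control, so only a fixed fraction ever sees the new design. Here is a minimal sketch under assumed details (SHA-256 hashing, string user IDs, a 5% exposure); it is not how Groove or Netflix actually implement their traffic splits.

```python
# Minimal traffic-splitting sketch; the details are assumptions,
# not any specific company's implementation.
import hashlib

def assign_variant(user_id: str, experiment: str, exposure: float = 0.05) -> str:
    """Deterministically route a fixed fraction of users (here 5%) to the
    experimental design; everyone else keeps seeing the control design."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    # Map the hash onto [0, 1): the same user always lands in the same
    # bucket for the same experiment, across sessions and devices.
    position = int(digest[:8], 16) / 0x100000000
    return "experiment" if position < exposure else "control"

print(assign_variant("user-42", "homepage-browse-feature"))  # stable assignment
```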
Learn about how BetaTesting can help your company launch better products with our beta testing platform and huge community of global testers.