Sometimes usability testing drives me bananas.
We’re testing things we already know. We’re proving common sense. We’re spending lots of time and resources to confirm answers we already have.
If you’ve spent any time observing usability tests, you know what I mean. We, meaning all of us humans, know what’s right a lot of the time. We know what’s good to do for people and what’s not so good.
This is Jeannie after a “fruitless” testing session.
Here are a few things I’ve observed in testing which needed no test at all, in my opinion.
- Should we force users into an automatic payment plan with no cancellation options in a deceptive way? (The answer is always no.)
- Should we overcomplicate the offer?
- Should we make our expensive graphics the star of our site and bury the tasks users want?
I’m taking some liberties here to prove a point. While these were not the official test questions, they might as well have been.
Brands become so obsessed with themselves that they lose sight of what really matters.
User experiences should be clean and straightforward. Customers don’t want to “hang out” on your site. They want to do something there. And what they want to do from their tablet, smartphone, or smartwatch might be totally different from what they want on a big monitor at their PC.
We know this!
And yet when asked to see “what users want” we somehow lose our minds.
A site aimed at an older demographic once tested if a higher contrast, easier-to-read font would make a positive impact. (Spoiler alert! It did!)
A site aimed at millennial customers tested if mobile-friendly options were worth the investment. (Shocker! They were!)
Here’s what I find so baffling:
Customer experience overall is full of these types of decisions. And yet I see perfectly capable, smart leaders overanalyze even the most obvious choices. Data is great for guiding decisions, shaping long-term strategies, and confirming direction.
But data is always about the past. And gathering and analyzing data takes time. Even with big data and real-time feedback, it’s a process. So if we already know what’s right, why don’t we act on it?
Leaders get in their own way.
They think they can’t move without data. They can’t make a decision without a colorful pie chart confirming that a majority of people think the way they do. It’s holding them back.
Innovation happens quickly. Ideas generate more ideas. Let ideas lead and see what happens. Stop testing the obvious.
I’m not saying to ignore data or never test ideas.
It’s important to test and gather feedback. But why spend resources where we already know the answer? Here are some things I’d argue never require testing.
- If it’s the right thing to do, then do it. If everyone in the meeting nods along about how “customers hate that,” then it’s time to fix “that.”
- Simple processes trump complicated ones. Don’t force users through 35 steps when 5 would do.
- It’s always a good idea to reassure customers. Digital experiences are full of moments when a customer might think “I hope this works/is worth it/doesn’t cost me later.” Reassuring users at each step of the process helps proactively prevent abandonment later.
Customer journeys are full of digital destinations we can obsess over testing. Go for it! Test what needs real confirmation, but don’t be afraid to lead by doing what you know is right.
The post Testing Digital Experiences: How You’re Doing More Harm than Good appeared first on Customer Experience Consulting.