About Drunk User Testing
Last week Bedrock Data took Fusion for a night on the town to an event called Drunk User Testing. Hosted by Appcues at Wistia's beautiful office nestled in the leafy side-streets of Cambridge, Drunk User Testing was a smash hit. Along with eleven companies — startups mostly — we Bedrockers were heartened by the golden opportunity to put our product in the hands of real users.
At the gathering was a bonanza of things to test (besides pigs in a blanket): ezCater's newly launched iPad app, Chrome extensions for peer-to-peer tipping, a friendly gas-saving service named Gasbuddy. In just a few years, Drunk User Testing has helped a lot of companies improve how they craft their UI/UX.
While some results always defy a sure conclusion, it's a fact well-known that when groups of whip-smart strangers assemble, few can resist the epicurean mélange of salt, bread, and imitation pork.
The best part is that after even just a couple of beers, folks at the booth got brutally honest — though they weren't spewing bile about petty things like pill-shaped button colors. Instead, we got visceral reactions about value prop, their yen for a simple information architecture, clear copy, and which functionality our product could chuck out the window.
So whether you are on the cusp of launching a product or deep in the trenches, here are a few lessons Bedrock learned at Appcues' Drunk User Testing, and strategies for applying them to your own product launch.
The Value of Practice & Scripts
Consider Ben Franklin. Specifically, the adage: “By failing to prepare you are preparing to fail.”
For demos, by failing to practice, you are practicing failure.
Lesson: Before a customer interview, conference, demo, or event, write a script for your user test. Include your big goal. Then sketch the steps users will take to achieve it. List the ideal flow in order, coupled with the desired user behaviors and screenshots if necessary. Always practice a couple dry runs before you talk to users.
Example: Our big goal was to test how smoothly Fusion prospects could become Fusion users, pinpointing the processes of:
- Signing up for a Fusion account
- Selecting their first connector (e.g. Marketo, HubSpot, NetSuite)
- Authorizing it to grant API access
- Understanding what to do with their cloud data warehouse credentials once their warehouse had been built
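The lesson above can be sketched as a tiny script skeleton: one big goal plus the ordered steps users should take to reach it. This is our own illustration, not Bedrock's actual tooling; the structure and field names are invented for the example.

```python
# A minimal user-test script skeleton. (Field names are illustrative only.)
test_script = {
    "big_goal": "Test how smoothly Fusion prospects become Fusion users",
    "steps": [
        "Sign up for a Fusion account",
        "Select a first connector (e.g. Marketo, HubSpot, NetSuite)",
        "Authorize the connector to grant API access",
        "Find and understand the cloud data warehouse credentials",
    ],
}

def dry_run_lines(script):
    """Lay out the flow in order, for rehearsing before talking to users."""
    lines = [f"Goal: {script['big_goal']}"]
    lines += [f"{i}. {step}" for i, step in enumerate(script["steps"], start=1)]
    return lines

for line in dry_run_lines(test_script):
    print(line)
```

Printing the flow like this makes the dry runs easy: anyone on the team can read the goal and steps aloud, in order, before the real test.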
We knew we'd need to explain these bullet points simply. In plain English. After all, most people had little idea who we were or what we did, let alone how what we did worked.
Thus we kept our script short, sweet, and conversational. After asking their name and where they worked, we inquired whether people used any cloud applications, citing examples ("by that I mean CRMs like Salesforce, or marketing automation applications like Marketo or HubSpot").
Crafting our script around the user's life helped us explain the product. We could then say, "Well what Fusion does is ingest data from applications like those. It also cleans the customer data and puts it in a cloud data warehouse that you can analyze in any business intelligence tool. All your customer data is in one place. And you don't need to do any data prep. Any questions?"
Then came the test:
"So now we're going to start you off on our new website and test three things.
1. Creating a Fusion account;
2. Connecting one system to Fusion; and
3. Building a cloud data warehouse
The test should only take a few minutes. Is it okay if we record your feedback?”
Notice how repetition lets the concept sink in. When user testing, say things twice. Especially if your users have knocked a few back.
Defining Your Test
We broke our test into eight steps: enough to cover the full flow without wearing out our testers.
The test started at our Fusion landing page and ended on the warehouse credentials screen (below), whose credentials users plug into their analytics tool.
When testing your product, have a clear start and finish.
Every step in our test included the page's name and its URL, along with the steps to complete. For instance, here's how we tested our signup form:
Try Fusion Form
- URL: https://www.bedrockdata.com/fusion/get-started-now
- Desired user behavior: Enter email address, current systems, click “Get Started”
Such a deliberate outline preempts snags and ensures a hiccup-free test.
Take how Fusion users authorize their applications (i.e. connectors). Because we knew almost nobody would remember their login credentials off the top of their head, before testing we built a demo connector, whose purpose was to show the steps of authentication while freeing users from having to recall usernames and passwords.
With this demo connector, we also made sure that anyone could log in using sample demo credentials we created ahead of time.
And once they "logged in" to the demo connector, it showed up as an actual connector would. This way, users could envision the start-to-finish UX of the actual product.
In short: don't just wing your user tests. By failing to prepare, you're preparing to fail.
By preparing a detailed user flow, we'd set ourselves up for success to collect more granular feedback.
For example, the lead capture form below really tripped people up in field #2, much to our surprise. Nearly every tester we spoke to said, “What does ‘Current Systems’ mean? Databases? Do I type something? I don't get it.”
These conversations led us to tweak the bewildering “Systems” copy to read “Applications”, and to pre-fill the field with a few grayed-out examples to jog users' memory about their own applications.
Such changes to your UX might seem minor at first blush. But when you're launching a new product, every detail matters as much as every new customer. Strive for clarity.
Documenting & Sharing Feedback
Memory’s a fickle mistress. And if our beliefs warp, imperceptibly, over time and with the accumulation of experiences, neglecting to document product feedback endangers the very insights you worked so hard to glean. As your recall dwindles, you may, intentionally or not, fabricate what folks said, twisting user feedback into a vanity loop of what you'd have preferred to hear.
Lesson: Before you get on the horn or conduct a customer interview, make sure you have a point person whose sole job is documenting feedback. Record screencasts. Create Google docs and spreadsheets. Putting someone in charge lets the rest of your team focus on running the user test.
Example: At any given time, Bedrock had two or three people talking to users and one person recording what they said. Wrapping up two hours later, we had documented the feedback of roughly 35-40 individuals.
For us, mere bullet points sufficed. For instance:
- Thought authorization should be on the manage connectors screen: “Why should I leave the app to log in?”
This way, when we sifted through our notes later, the opportunities for improving the UI were far more evident than had we given a demo and recorded feedback simultaneously.
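A feedback log like ours can be as simple as tagged bullet points, so that related comments cluster by test step when you review them later. A minimal sketch (our own structure, not an actual Bedrock tool):

```python
# A feedback log: one note per comment, tagged by test step so repeated
# complaints are easy to spot afterward. (Illustrative structure only.)
feedback_log = []

def record(step, note, quote=None):
    """Jot down one piece of user feedback, with an optional verbatim quote."""
    feedback_log.append({"step": step, "note": note, "quote": quote})

record("authorize-connector",
       "Thought authorization should be on the manage connectors screen",
       quote="Why should I leave the app to log in?")

def notes_for(step):
    """Pull every note for one step, to look for patterns."""
    return [f["note"] for f in feedback_log if f["step"] == step]
```

Grouping notes by step is what makes the later pattern-hunting with your product manager and designer fast: three near-identical complaints about one screen jump out immediately.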
Improvements: With the feedback in hand, sharing it afterward was easy. Together with our product manager and UX designer, we culled patterns, and they quickly got to work on a prototype of our next release.
Jobs to Be Done
The world doesn't need more software, doodads, or gizmos. It needs jobs to be done.
As Harvard Business School professor Clayton Christensen explained: "Most companies segment their markets by customer demographics or product characteristics and differentiate their offerings by adding features and functions. But the consumer has a different view of the marketplace. He simply has a job to be done and is seeking to 'hire' the best product or service to do it."
In other words, a JTBD is not a product, service, or a specific solution; it's the higher purpose for which customers buy products, services, and solutions.
For user tests and interviews, apply the JTBD framework to your own product. Ditch the features and talk about your product as if it were an awesome employee whom you would recommend to anybody. This technique will sharpen your product design and explain to users why they would want to ‘hire’ your product to make progress in specific circumstances. Ask yourself: how does my product create a “better me”? People love Spotify not because it plays music. Rather, during the era of vinyl records, CDs, and mp3s, the act of shopping, buying, storing, organizing, retrieving, and discovering new music was a big job to be done. So big a job that the whole world was willing to pay Spotify to do it for them.
Example: This is how we view Fusion. Its jobs to be done are:
1) The extract-transform-load (ETL) process of getting data out of siloed systems and into one easily accessible place, a cloud data warehouse.
2) The exhausting and boring job of data prep, or data wrangling, that ensures data is ready for analysis.
These JTBDs are more than a huge drag. They cost companies hundreds of thousands in revenue and hundreds of hours in labor that could be better spent on analysis. People 'hire' Fusion to do the complicated and time-consuming job of getting customer data out of the original applications and making sure it's all in the same format and using the same schema.
During our user tests, people got it when I said, “Imagine there was a database engineer who did all your ETL, data prep, and warehousing in a couple hours. He even fuses customer records so you don't have duplicates or empty fields. Basically, he makes your analysis way easier."
When asked if they'd hire this person on the spot, one user said, "Oh definitely. I'd give that guy a signing bonus.”
Improvements: After Drunk User Testing, we abandoned our obsession with features. Now we talk about how Fusion makes people’s lives better than they would be without it.
When testing your product, think about what is, and isn’t, important to customers.
The Big Takeaway
Feedback is essential to the evolution of your product. So whether you’re interviewing customers about the problem you've set out to solve, or your core idea has matured into an MVP design, you're only just getting started. Now is the time to ask prospects and current customers questions about the user experience and information architecture, and to focus on the job to be done.
Now that's not to say you can't have any fun improving your product. In fact, as Drunk User Testing proved to us, the more fun you have, the better.
Interested in seeing our Fusion product improvements in action? Sign up for a free 14-day trial today.