User testing at startups

First, let’s set some boundary conditions for this post: it is meant for early-stage tech startups with limited resources embarking on their first usability testing experiment.

Yes, usability testing is important. But performing usability testing at an early-stage startup is hard. Best practices from larger companies with dedicated product managers do not carry over easily. We learned this when we rolled out a structured usability testing exercise at Filepicker.io.

Our goal: reduce friction in developer onboarding on the website. Here is what we learned:

#1: Startups are resource-constrained. Think minimum viable process.

Usability testing best practices advocate an exploratory, open-ended approach. There are several awesome resources, such as this one, that can guide the experiment.
So when we started the experiment, we wanted to understand if developers had enough information on:
a) How to start using filestack.com in their relevant language/platform
b) Whether the steps are specific enough to set up a simple integration
c) Whether they feel comfortable with the sample code
d) What other actionable items are required to go live with the service

But when we asked ourselves what we would do with the answers to these questions, we realized that we wouldn’t do much. Rather, we couldn’t do much.

Why?

None of the responses would help us determine where to draw the line on follow-up actions. For example: how much guidance should we provide until a developer feels comfortable with the sample code? Or how many of the actionable suggestions should we implement?

While open-ended questions certainly yield a wealth of knowledge, what use are they if the findings can’t be acted on? Startups always have a sh*t-ton of work and, as it always feels, never enough resources. With limited resources to go around, every nice-to-have or would-be-great-to-have task looks like a luxury, and the focus stays on squeezing the most out of every resource to grow the company as fast as possible.

Therefore, the better approach was to limit the scope of the experiment to proving or disproving a specific, measurable statement for a specific customer persona. Thus, we redefined the learning objective as:

“For customer persona X, it’s not possible to go through the entire onboarding flow without reaching out to support@.”

This allowed us to quantify and prioritize the feedback we received from the usability testing. Initial usability testing (we still need to repeat this with a statistically significant sample) showed that developers could finish the onboarding flow for a simple use case without reaching out to support (phew!). This meant that we WOULD NOT have to prioritize improving the website right away. The feedback we got from the exercise has been added to our icebox as nice-to-haves rather than must-haves.
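To make “quantify and prioritize” concrete, here is a minimal sketch of how session results can be tallied against that kind of hypothesis. It is illustrative only, not our actual tooling; the session records, field names, and persona label are all made up.

```python
# Illustrative sketch: tally usability sessions against the hypothesis
# "persona X cannot finish onboarding without emailing support@".
# The data and field names below are hypothetical.

sessions = [
    {"persona": "X", "completed_onboarding": True,  "contacted_support": False},
    {"persona": "X", "completed_onboarding": True,  "contacted_support": False},
    {"persona": "X", "completed_onboarding": False, "contacted_support": True},
]

persona_x = [s for s in sessions if s["persona"] == "X"]
unaided = [s for s in persona_x
           if s["completed_onboarding"] and not s["contacted_support"]]

rate = len(unaided) / len(persona_x)
print(f"{rate:.0%} of persona-X testers finished onboarding without support")

# If most testers finish unaided, the hypothesis is disproved and the website
# work can stay in the icebox; otherwise it becomes a must-have.
```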

#2: Take responsibility for the entire user experience, including third-party products

Sometimes the problem in the onboarding flow might not lie in your product but in the handoffs between your product and third-party products.

In our case, configuring the AWS S3 settings is a critical step in the developer onboarding flow. When we reviewed the recordings from the Silverback app, we realized that developers were getting lost while setting up the IAM policies. We provide links in our developer portal to the AWS support page on the topic, but that was obviously insufficient. Though they eventually figured it out, there was quite a bit of toggling between screens to understand what needed to be done.

It was clear to us that if we wanted to provide as frictionless an experience as possible to the developer, then we had to take ownership of that experience as well. So we made a screencast of how to configure the IAM permissions for S3.
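For readers hitting the same handoff, here is a hedged sketch of the kind of IAM setup involved: a minimal inline policy letting an IAM user read and write objects in a single S3 bucket. The bucket name, user name, and exact set of actions are placeholders rather than our actual requirements; follow the provider’s own docs for the real values.

```python
# Hedged sketch: attach a minimal S3 read/write policy to an IAM user.
# Requires AWS credentials with IAM permissions configured locally.
import json

import boto3

BUCKET = "my-upload-bucket"        # hypothetical bucket name
IAM_USER = "uploader-user"         # hypothetical IAM user

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # object-level access for uploads and downloads
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        },
        {   # bucket-level access for listing
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": f"arn:aws:s3:::{BUCKET}",
        },
    ],
}

iam = boto3.client("iam")
iam.put_user_policy(
    UserName=IAM_USER,
    PolicyName="s3-upload-access",
    PolicyDocument=json.dumps(policy),
)
print(f"Attached S3 policy for bucket {BUCKET} to user {IAM_USER}")
```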

In conclusion, think minimum viable process for usability testing. If you are a startup, be wary of adopting best practices designed for large companies. Open-ended usability testing may not be the optimal approach. If your onboarding flow involves third-party products, watch out for friction at the handoff points.

One of our primary goals at Filepicker.io is to create a best-in-class developer experience, guided one step at a time by tests like this one. Try us out and tell us how we’re doing.

-Chintan & Anand

Follow the discussion on HN

[Edit] Per Chris Smeder’s comment on HN, “user testing” has been changed to “usability testing” throughout the article.

Unfortunately, I couldn’t change the title, as it would break the URL, which has already been reblogged and distributed.
