Usability Testing: Getting a Head Start
Project Background
After completing our initial onboarding steps, clients need to schedule a call with our sales team.
During this call, clients work with the salesperson to discuss our process, make sure we are the right fit for them, develop a draft of their job description, and align on the specifics of the role.
The client then moves on to a call with our matching team to discuss the specifics of the talent they are looking for before finally being sent talent.
We recognized that there were quite a few calls involved in this process and wanted to find a way to better support clients who wanted more self-serve options.
The team decided to provide an option for clients to skip the sales call by filling out a draft of their job description and talent needs. We called this the Head Start path.
The team wanted to move quickly so they opted to launch an A/B test with an initial concept. Here is what they learned:
When given the option of which path to take (Head Start or scheduling a call with sales), 65% of clients chose the Head Start path.
However, only 36% of Head Start clients went on to book a call with our matcher. This was 13 points lower than the rate for clients who chose to schedule a call with sales.
The Decision Statement
What improvements should we make to the Head Start experience to improve the rate at which clients book a matching call with us after going through it?
Evidence Needs
In order to make this particular decision I needed to gather the following pieces of evidence:
What step in the flow are clients typically dropping off from?
What usability problems exist within the flow and which are critical to fix?
What is the context that clients find themselves in when they choose Head Start and what are their expectations for how it will work?
Research Methods
Now that I understood the decision that the team was trying to make and had identified the evidence I would need in order to make it, I could choose the methods that would help gather that evidence. I chose to:
Use LogRocket to view recorded sessions of real clients going through the flow to determine the most common drop-off point and spot patterns of behavior.
Conduct moderated usability testing to assess usability challenges and understand clients' context and decision-making process.
Study Design
After reviewing the recorded sessions in LogRocket, I noticed that the majority of clients who ultimately dropped from the flow did so within the first couple of steps. This meant that clients weren't even experiencing the new flow before they decided to leave.
Based on this, I concluded that our usability testing needed to start from the very first page and include the entire process rather than just testing the new flow in isolation. This way, I could assess how Head Start fit within the entire context of the experience.
I worked with the design team to build a prototype of the entire end-to-end flow, starting from our main landing page and including all of the onboarding steps clients would have gone through before reaching the Head Start flow.
I recruited 8 participants who matched our client profile and conducted a one-hour moderated usability session with each of them.
At the end of the session, they were asked to subjectively rate their experience in three different categories:
Their perception of the number of steps
How easy or difficult they felt the process was
How confident they were that they had completed everything correctly
Results
There were too many steps
It was immediately clear that the total number of steps clients needed to go through was a major issue.
Participants rated the number of steps significantly lower than ease of use or their confidence that they had completed the process correctly (a statistically significant difference).
Participants told us they felt like there were a lot of duplicate questions between the initial onboarding flow and the Head Start flow. For example, we ask them about the skills they are looking for in both the onboarding steps and the job description.
“I think I already gave you this with the attributes I was looking for. So I'm kind of annoyed by that.” - P7
It was difficult to understand our process
After going through the entire flow, participants were still uncertain about how our process works and what the next steps would be.
We realized that part of the value of the sales call is that clients get a clear explanation of our process. Clients who skip that call by choosing the Head Start path miss all of that information.
“I'd like to… to have better expectations set for me about exactly what's happening in this process and what the outcomes are.” -P5
Building Trust is Key
Throughout the flow, participants looked for opportunities to build trust in our service. In some cases the flow was good at building trust; in others it fell short.
“Okay. This is a nice way to show that the company has a lot of different… relationships with different companies…” - P1
“…you hit me with the elements of trust at the beginning that you're good company... but as I was giving you information, I felt like I was getting nothing back and no validation that… you feel like you have someone who could meet my needs.” - P3
Conclusion and Decision
The team decided to reduce the total number of steps by eliminating as many of the duplicate pieces of information as possible.
The team decided to include a clear process description and progress tracker so clients knew what to expect and where they were in the process.
The team decided to include more elements of trust building within our flow. For example, we provided details about how talent are screened and selected specifically for their role.
Finally, we included a prediction about whether or not we thought we had talent to meet their needs as they went through the flow.
Outcome
After making these improvements, we re-launched the experiment with the new version of the Head Start flow.
We saw a similar adoption rate, with 65% of clients choosing to go through the Head Start flow, and an increase in the conversion rate of the flow to 42%, up from 36% in the original design.
Statistically, this was no different from the conversion rate for clients who go through the sales call process.
Our conclusion was that this new process works just as well as our sales process, but with much less overhead for us and our clients because it reduces the number of live calls. We are also better supporting clients who prefer more self-serve options.
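A parity claim like this can be checked with a standard two-proportion z-test. The sketch below is a minimal, stdlib-only Python implementation; the sample sizes (100 clients per path) and the sales-path count are hypothetical, since the actual traffic volumes are not part of this write-up.

```python
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided two-proportion z-test. Returns (z, p_value)."""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    # Pooled proportion under the null hypothesis that both rates are equal
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 42/100 Head Start conversions vs. 49/100 via sales call
z, p = two_proportion_z(42, 100, 49, 100)
print(f"z = {z:.2f}, p = {p:.3f}")  # at n = 100 per path, p > 0.05: not significant
```

With samples this small, a several-point gap is indistinguishable from noise, while the same gap at roughly ten times the traffic would be significant, which is why sample size matters before declaring two flows equivalent.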