
Usability testing

Prep time:
6-8 hours + participant recruitment
Run time:
Up to 1.5 hours per session
People:
2-3 (incl. 1 facilitator)
Contributed by:
Digital Service Design Office, Queensland Department of Transport and Main Roads
Stages:
Alpha, Beta, Live

The most effective way of understanding what works and what doesn’t in a digital product or service is to observe people using it. When the right participants attempt realistic activities, you gain qualitative insights into what is causing users to have trouble. These insights help you determine how to improve the design and functionality of your product or service.  

When doing any type of usability testing, it is essential that the people you engage with are real users of your service or product. When you actively engage users linked to a service or product, you can build more effective solutions that target true user needs.

You might have heard other names for ‘usability testing’. It is also called ‘user experience (UX) testing’ or ‘user testing’. However, it is not the same as ‘user acceptance testing (UAT)’.

Outcomes

  • Evaluate the usability of your product or service
  • Identify areas of improvement
  • Verbatim notes ready for synthesis

What you need

Remote

  • Video conferencing with screen sharing
  • Tool for recording the session (e.g. the ‘record meeting’ feature in Microsoft Teams)
  • A way to take notes (Use the notetaking template)

In-person

  • Meeting space
  • Consent forms (if required)
  • A way to take notes (Use the notetaking template)
  • Recording device like a smart phone or voice recorder (See if your department has equipment available for loan)
  • A device for the participant to use if they are unable to use their own

Tip

It’s not a good idea to record user tests on your own device. You might capture personal information about the participant during the usability testing, and there are strict rules about managing personal information on personal devices.

Instructions

Most plays in this playbook involve minimal preparation. But the usability testing play is different. Doing the prep work thoroughly means that you’ll get far more insights out of the usability testing sessions.

Choosing the right play

Usability testing is the best method for answering the question: can people use this thing? If you have an existing service, product, or prototype that you need feedback on, this is the play for you.

If you need to do more exploratory user research, the user interview play will be more useful for you. The purpose of generative user interviews is to explore the user’s world, uncover their challenges, and identify opportunities for improvement. Unlike usability tests, which assess existing products, services or designs, user interviews are usually conducted in the early stages of the design process to inform ideation and concept development.

Moderated vs unmoderated usability testing

This play covers moderated usability testing. Moderated and unmoderated usability testing are two distinct approaches; the main difference between the two lies in the presence or absence of a moderator during the testing process.

Moderated usability testing

The moderator interacts directly with the participants, providing instructions, asking questions, and observing their interactions with the product or interface. This approach allows for real-time feedback and enables the moderator to probe deeper into the participants' thoughts, impressions, and experiences. Moderated testing sessions are typically conducted in-person or remotely through video conferencing.

Unmoderated usability testing

Unmoderated usability testing involves participants independently completing usability tasks without direct interaction with a moderator. Participants typically use an online platform or software that guides them through the tasks, records their interactions, and collects their feedback.

Decide what to test

Before you can write your usability testing guide, you first need to work out what you’re going to test and what you want user feedback on. Here are a few questions that can help you decide what you need to test.

  1. Are you testing an existing part of a service or product in its current state?
  2. Are you testing a prototype?
  3. Are you testing two competing designs to decide which one to implement?
  4. Do you need to test the accessibility of a product or service?
  5. Do you want feedback on readability?
  6. Do you want feedback on the visual design?
  7. How many rounds of testing do you want to do?

The answers to these questions can help you find the right users to test with and write a usability testing plan that will give you relevant insights.

Prototype vs existing solution

Existing solution

If you want to test an existing product or service, you can use either the UAT (user acceptance testing) environment or the production environment, depending on the service. If the task involves a transaction, you will need to use the UAT environment; if the task involves reviewing information, production can be used. If you don’t have a suitable environment to test an existing solution, you may need to create a prototype of the existing solution to use during testing.

Prototype

If you’re testing a prototype, you need to make sure that the tasks you’ll ask the user to perform can be completed using the prototype. Prototypes can be low-fidelity or high-fidelity. Examples of low-fidelity prototypes include paper prototypes and wireframes; high-fidelity prototypes are typically produced with digital prototyping tools such as Figma.

Form a usability testing team

There are 3 key roles when it comes to usability testing: the interviewer, the scribe, and the silent observer.

The interviewer

Anyone can facilitate usability testing, but typically a Project Manager (PM), Project Officer, Designer, Business Analyst, or Communications Officer fills the role of interviewer. Don't try to be both the interviewer and scribe – you'll get distracted and won't be actively listening.

The scribe

You'll also want another person from your team along to play the role of scribe and generally be your co-pilot. The scribe is responsible for taking verbatim notes – word-for-word notes of exactly what participants say during the session. Verbatim notes avoid the paraphrasing that typically results in diluted and biased data, and should be captured digitally by the dedicated scribe.

Download our notetaking template to see an example.

Tip

The most effective way to take verbatim notes is by using Excel. Treat each cell like a sticky note (one concept per cell) and use the ‘return’ or ‘enter’ key to quickly move to the next cell. This is much quicker than adding notes directly into whiteboarding tools like Miro. When you’re done, you can copy your organised notes from Excel to Miro easily, if you wish to synthesise your notes there.

Silent observer – optional

Including a silent observer is a great way to make sure you don't overlook any potential insights. Their job is to absorb the conversation and listen for any themes, behaviours or connections the interviewer and scribe are too busy to catch.

Tip

A good way to build interviewing skills is to be the scribe or observer for an experienced interviewer.

Find the right users to test with

Sounds obvious, doesn't it? But don’t rush through this step! Consider your goals for the usability testing and make sure you're recruiting real users. Did you uncover information about who your target users are during the initial stages of your project? Use this to inform who you want to recruit for testing. For more guidance, see the related plays in this playbook.

Aim for a balance of user types within your identified user group. This could include a mix of age, gender, digital literacy, general literacy, education, employment status, geographic location, accessibility needs, and what services they engage with or other characteristics specific to your project.

For each round of usability testing, aim to test with 5-10 people to ensure that you are covering a diverse group. Research by the Nielsen Norman Group suggests that testing with 5 people will uncover roughly 85% of usability issues. It's important to note that this assumes you will do multiple rounds of testing with a variety of users and iterate the product or service as you go. Usability testing is not a one-off activity; you will get the best results by doing many rounds over time. Consider how you will make usability testing a part of your business-as-usual product or service improvement plans.
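
The ‘5 users’ rule of thumb comes from a widely cited problem-discovery model (Nielsen’s): found(n) = 1 − (1 − L)^n, where L is the average proportion of usability problems a single participant reveals, commonly estimated at around 0.31. As a quick sketch of the curve, under that assumed value of L:

```python
def problems_found(n_participants, l_per_user=0.31):
    """Nielsen's problem-discovery model: proportion of usability problems
    uncovered by n participants, assuming each participant independently
    reveals a fraction L of the problems (L = 0.31 is an assumed estimate)."""
    return 1 - (1 - l_per_user) ** n_participants

for n in (1, 3, 5, 10):
    print(f"{n} participants -> {problems_found(n):.0%} of problems")
```

Under this model, five participants uncover roughly 84% of problems and the curve flattens quickly after that, which is why several small rounds with iteration in between beat one large round.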

Choosing the best location to run the session with your users

If the session is in person, you can invite the user into a work meeting space to conduct the session. Try to choose a room that is available to the public, reasonably inviting, accessible, and well served by transport options.

If an in-person session isn't possible, a video call with screen-sharing functionality can work. You can set up a Teams meeting and invite them via email. Make sure to include any scribes and observers on the invite.

Recruit your users

Identify your target users with screener questions

Screener questions in user testing are a set of questions used to filter and select participants who meet specific criteria for a usability study or user research. These questions are typically asked before the testing session to ensure that the participants possess the desired characteristics, demographics, or experience level necessary for the study.

Screener questions are often included in the participant recruitment process. When seeking participants for a study, researchers can use screener questions in surveys, online forms, or phone interviews to filter and select individuals who match the desired criteria. This allows researchers to assemble a suitable participant pool before moving forward with scheduling and conducting the usability interview sessions.

The specific screener questions will vary depending on the goals and requirements of the user testing project. It is essential to carefully design and select these questions to ensure the recruited participants align with the target audience, allowing for meaningful insights and actionable feedback.
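
To make this concrete, screening logic amounts to a set of pass/fail rules over a candidate's answers. A minimal sketch – the questions, field names, and thresholds below are illustrative only, not criteria from this play:

```python
def passes_screener(answers):
    """Return True if a candidate matches illustrative recruitment criteria.
    All field names and thresholds here are made up for the example."""
    rules = [
        answers.get("transferred_rego_last_12_months") is True,  # recent real user of the service
        answers.get("age", 0) >= 18,                             # adults only
        answers.get("works_in_ux") is False,                     # exclude industry insiders
    ]
    return all(rules)

candidate = {"transferred_rego_last_12_months": True, "age": 42, "works_in_ux": False}
print(passes_screener(candidate))  # True
```

Expressing the criteria this explicitly – even just on paper – makes it easier to brief a recruitment agency and to check that every recruited participant genuinely matches the target audience.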

There is a screener section in the usability testing plan template. You can find further examples of how to create a screener for participant recruitment in the customer research section of the Queensland Government HCD resources.

Recruitment process

You will need to use one of the participant recruitment processes to get users to attend a session. This might mean making use of agency-specific customer panels like Transport Talk, industry user research platforms like Askable, accessing Queenslanders with a disability via the QEngage partnership, or using a market research company to find participants and schedule them into your sessions. These may or may not require a Request to Approve Contractor (RTAC) or other internal approvals.

Consider participant consent when planning your recruitment. What are you seeking their informed consent for – will the research data only be used for this project or potentially reused later? Think about when in the process you will seek consent. You can do it during the screening and recruitment process, or you can do it at the beginning of the testing session. There is an example of a participant consent form in the usability testing plan template below.

Recruitment may take up to 6 weeks once the approval processes are factored in.

Contact us if you need guidance through the participant recruitment process.

Arrange your usability testing sessions

Once you have your approach approved, you will be able to arrange the usability testing sessions.
Make sure to include buffer time between the sessions (and don't forget a lunch break – usability testing is tiring!). This will allow you to do a quick debrief with the team and reset for the next session. If you are doing 1-hour sessions, you will likely fit 4 or 5 in per day.
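
As a rough planning aid, a day's worth of sessions with buffers can be sketched out like this. The start time, session length, buffer, and lunch break below are illustrative defaults, not values prescribed by this play:

```python
from datetime import datetime, timedelta

def day_schedule(start="09:00", session_mins=60, buffer_mins=30,
                 lunch_start="12:00", lunch_mins=60, end="17:00"):
    """Return (start, end) time pairs for the sessions that fit in one day,
    leaving a buffer after each session and skipping the lunch break."""
    fmt = "%H:%M"
    t = datetime.strptime(start, fmt)
    lunch = datetime.strptime(lunch_start, fmt)
    lunch_end = lunch + timedelta(minutes=lunch_mins)
    day_end = datetime.strptime(end, fmt)
    slots = []
    while True:
        # push the session past lunch if it would overlap the break
        if t < lunch_end and t + timedelta(minutes=session_mins) > lunch:
            t = lunch_end
        finish = t + timedelta(minutes=session_mins)
        if finish > day_end:
            break
        slots.append((t.strftime(fmt), finish.strftime(fmt)))
        t = finish + timedelta(minutes=buffer_mins)
    return slots

for slot in day_schedule():
    print(f"{slot[0]} - {slot[1]}")
```

With these defaults the day holds five 1-hour sessions, which lines up with the 4-or-5-per-day rule of thumb above; shrinking the buffers buys little and costs you the debrief time between sessions.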

Write a usability testing plan

Background and warmup questions

Before jumping into the usability tests, it’s always good to ask some general questions about your participant in the context of the product or service you are testing. This not only acts as a warmup for the participant, but it will help you to better understand the participant and empathise with their perspective.

For example, if you were looking to improve the user experience of transferring vehicle registration online, you might ask background questions like this:

  • Why do you need to transfer ownership of your vehicle?
  • Have you transferred ownership of a vehicle before?
  • What’s your current understanding of what you would need to do to change the ownership of your vehicle?

Tasks, scenarios, and user goals

In order to observe participants, you need to give them something to do. These assignments are frequently referred to as tasks (during testing it’s good to call them ‘activities’ to avoid making the participants feel like they’re being tested).

It’s good practice to give participants a short scenario that sets the stage for the activity and provides some context around why they are doing the task.

Before you write your list of tasks, first write a list of user goals that users may have. Ask yourself: What are the most important things that your users must be able to accomplish in the context of your project?

Sticking to our example of transferring vehicle registration online, users must be able to:

  • Learn more about the vehicle registration transfer process
  • Understand what they need to transfer vehicle registration
  • Complete the transfer of registration form online

Once you’ve figured out what the user goals are, you can create task scenarios that are appropriate for usability testing.

Tip 

Focus on your users’ goals and how you can support them in achieving those goals, not on the business goals.

Writing task scenarios

A task scenario is the action that you will ask the participant to take on the interface you are testing.

Some example task scenarios could be:

You’ve just sold your car to another person on Carsales.com.au. To finalise the transaction, you need to transfer the vehicle out of your name to the buyer. Go to the TMR website and submit an application to transfer the registration of your vehicle so you can finalise the sale of your car.

When writing your task scenarios keep in mind the 3 golden rules:

  • Keep them realistic and typical of how people would use the system in their own time.
  • Keep them simple. Don’t overcomplicate the scenario or give detailed instructions.
  • Don’t give away the answer!

Get ready for usability testing

You’re finally going to test with your users – yay!

Here’s a checklist you can use to ensure you’re prepared:

  1. You have copies of your questions and task scenarios for everyone (either printed or open in another window of your device)
  2. You have information about your participants handy, including their name and their answers to the screener questions (A screener survey is used to weed out people who aren’t in your intended audience. You can use their screener answers in the session if appropriate).
  3. You have documented participant consent obtained during the recruitment stage or a participant consent form to be completed at the start of the session
  4. The usability testing room (if in person) is tidy and has a relaxed vibe
  5. Your background (if online) is tidy, blurred, or you are using an officially approved image.
  6. Your recording device is charged (if not using a recording feature in your video conferencing tool)
  7. Make sure your testing team understands their roles so you can operate like a well-oiled machine. Scribes should focus on taking verbatim notes, and observers should ask questions only at the very end.
  8. Make sure you have allocated time for someone to save and store the session recordings. Sometimes it can be slow to transfer between Teams sites or internal drives so others can view them.

Now for the actual testing! Think of the following steps as suggestions, and feel free to tweak them to suit your needs and your style.

1. Housekeeping – 5 mins

Welcome the user and thank them for coming (again: sounds obvious, but it's easy to forget!). If you are conducting the session online and will need them to share their screen, get them to test this now.

Ask if it's OK to record the session so you've got everything verbatim in case the scribe falls behind. Assure them that the recording, and the session in general, is only for internal use and capability development: it will only be shared within government, it will be kept confidential otherwise (for example, you won't be pulling quotes for advertising purposes), and any quotes shared internally will be de-identified.

2. Warm up questions – 10 mins

Lead with the warm up questions from the usability testing guide you prepared. Part of the foundation you're laying here is establishing rapport. Pay attention to cues like body language and tone. Does your user feel at ease?

3. Prime the participant – 5 mins

Let your participant know that you will be giving them a scenario and a few activities to complete. It's important to make them feel at ease before they attempt the tasks to ensure they provide honest feedback. Here are some points to help guide you:

  • “We’re testing the design, not you. If something doesn’t make sense let me know. It means we need to make improvements to the design.”
  • “None of us here today had any involvement with the design, so don’t feel like you need to hold back with any negative feedback you may have.” (This is a great tip, even if you were involved in the design!)
  • “Please think out loud so we can understand your thought process.”
  • “If you have any questions during the activity, I may not be able to answer them. I’m here to see how you would use the service (or insert the name of your product here), but I'll let you know if something isn't working because it’s a prototype or under development. For example, if you ask ‘What does this button do?’ I might ask you ‘What are you expecting it to do?’”

4. Task scenarios – 20 mins

Once your participant understands they can be open and honest, you can introduce them to the scenario and tasks. Remind the participant to think out loud, and don’t be afraid to remind them again if they are silent while completing the task.

Remain mostly silent while the participant is completing the task. If you would like the participant to better articulate their thought process, you can ask prompting questions. Try and use open-ended questions so you can avoid lots of yes or no answers:

  • “Can you tell me more about that?”
  • “What makes you think that?”
  • “What do you think will happen when you click/tap x?"

If participants have questions, remind them that you can’t answer, or throw the question back to them. If a participant asks ‘What does this button do?’, you could say ‘What do you think it does?’. Only help the participant if they are stuck and can’t work out how to proceed, or if there is an issue with the product you are testing (like a prototype not working).

5. Closing questions – 5 mins

This is your chance to ask questions you didn’t get a chance to ask during the task scenarios. You may also ask some general questions at this stage like, ‘How would you describe your experience using x feature?’ or ‘How could x feature be improved?’.

6. Observer questions – 10 mins

If you have observers in the room, and they have additional questions, now is the time.

The questions should prompt deeper thought and reflection. “Is this an important feature?” could be phrased better as “Why is this feature useful?”. “What would happen if this feature disappeared?” is even more valuable – it will either sharply illustrate the feature's value, or make the user realise that their life wouldn't change much and perhaps the feature isn't so critical after all.

7. Turn the tables – 5 mins

We get more out of usability testing sessions when we turn them into a two-way conversation. Before the session winds down, give your user a chance to ask questions of you. There’s something else they might want to tell you or have questions about the prototype or your approach. You might ask “Do you have any questions for us, or is there anything else you’d like to tell us?”.

8. Wrap it up – 2 mins

Thank the user for their time and reiterate how they will receive their incentive.

9. Team debrief – 15 mins

After you’ve parted ways with the user(s), gather your team for a quick huddle. Ask what they thought about the user’s feedback: was it useful? Surprising? Contradictory? The goal of the debrief is to discuss the key takeaways and help the information to sink in. Document the key takeaways so you won’t forget them.

After all your usability tests are complete you need to synthesise all of that qualitative data you just collected. A play to help you synthesise your data is coming soon to this playbook.

Resources

See below for a collection of templates and other pages which will help you run this play. These resources are also linked in the play instructions.

Participant consent form (This can be read out during the session)

The stages

The four stages of the Service design and delivery process are Discovery, Alpha, Beta, and Live.


Share your feedback

Take our short feedback survey and tell us what you thought of this play, or report an issue.

This playbook is a beta product, your feedback helps us improve it for everyone.

Contact us

If you need advice, mentoring, or guidance on how to use the playbook, or you’d like to contribute to the playbook, you can contact us.