User research instructions

Instruction statement

This instruction provides best-practice techniques for conducting user research, including identifying and segmenting end-user groups and choosing appropriate types of enquiry (surveys, workshops, interviews).

Exclusions

None.

Instruction steps and actions

Introduction

User research is essential to creating a website or application that meets people’s needs and performs how they expect it to. User research is a cornerstone of the user-centred design methodology because it enables evidence-based decision making, rather than relying on the personal experience, preferences and biases of designers or stakeholders.

The insights from observing users and analysing their behaviour help define:

  • who will use the website and in what scenarios
  • the tasks they intend to accomplish
  • the optimal user experience design to support those tasks.

User research differs from traditional market research in that it focuses on what people do rather than what they say: it is concerned with users’ actual behaviour while completing a task, rather than their attitudes and opinions.

Research activities can happen at the strategy stage of a design project, during the design and development process and after launch to validate and evaluate the effectiveness and efficiency of the website. Each research activity produces certain types of results and has its own strengths and weaknesses. Ideally, you will perform a variety of user research tasks to answer different types of research questions as your design project progresses.

Research planning

Identify the target user groups of your website or application. Review the existing Digital and Customer Experience Strategy user research materials, such as the RMIT Personas and the RMIT Mental Model, to lay a foundation for your user groups or to augment them. For each group, consider:

  • Who are they?
  • What is their demographic profile?
  • What technology do they use?
  • What problems do they need to solve or tasks do they need to get done?

You may start the project with a rough sense of who the site needs to cater to, but still need to learn more about these people’s real needs.

Start by writing down what you want to know. This may take the form of questions you want answered or hypotheses or assumptions you want to test. These research questions are not the questions that you may ask an actual user, but instead the issues, concerns or ideas that your team wants to resolve. They could range from questions about the users themselves, to how they use an existing website or what they need from a new website. For example:

  • How long does it take a user to complete form A?
  • What do people think is the most valuable information on page B?
  • What process do people follow when they need to find out about C?
  • Can users find page D easily?
  • What websites do people currently use to find out about E and which site do they prefer?

If you have a long list of questions, categorise and prioritise them, since you may not be able to conduct sufficient research to answer them all.
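
Some of these questions map directly onto measurable data. The sketch below is purely illustrative (the timing figures are invented, not drawn from any real study) and shows how a question like “How long does it take a user to complete form A?” can be summarised once a handful of sessions have been timed:

    # Minimal sketch: summarising timed task sessions for a question such as
    # "How long does it take a user to complete form A?"
    # The numbers below are invented for illustration only.
    from statistics import mean, median

    # Seconds each participant took to complete the form; None = gave up.
    session_times = [142, 98, 210, 175, None, 121, 160, None, 133]

    completed = [t for t in session_times if t is not None]
    success_rate = len(completed) / len(session_times)

    print(f"Participants observed:   {len(session_times)}")
    print(f"Completion rate:         {success_rate:.0%}")
    print(f"Mean time to complete:   {mean(completed):.0f} s")
    print(f"Median time to complete: {median(completed):.0f} s")

Even a rough summary like this turns a research question into evidence you can compare against a target or a later re-test.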

Next, consider what type of research activity would best deliver the data you need to answer your questions, given your time and budget.

The two main types of data gathering methods are user-centred analysis and usability testing. User-centred analysis is a range of techniques that allow you to explore and define the direction of a website or application. Usability testing is about evaluating the performance of a website or application against a set of criteria to identify problems.

You can collect data using direct and indirect techniques. Direct methods involve meeting with current or potential users one-on-one or in a group setting to observe them and ask questions. Indirect methods allow current or potential users to provide information remotely through surveys, email, job descriptions, social media or via support centre or customer service reports. The most successful design teams maximise their exposure to real users.

Pick the most appropriate data gathering method based on:

  • What type of data is needed: are you interested in task flow, the function of certain features, site performance, user preferences and attributes?
  • How many people do you need to gather data from: do you have many different user groups?
  • Are your questions exploratory or about specific design details?
  • Are there complex issues involved?
  • Is the data needed quickly?
  • Is the budget tight?
  • Is a lot of data required to convince stakeholders?
  • Is information on user errors required?

Table: Widely used user research activities

Research activity | Description | Direct or indirect
Field study | Visiting users in their natural environment to observe them | Direct
Focus group | Structured group interview to reveal attitudes and perceptions | Direct
One-on-one interview | Personal interview to ask questions about the user, their needs, problems and preferences | Direct
Contextual inquiry | Interview users and observe tasks and context in their natural environment | Direct
Survey | Collect demographic data and user preferences | Indirect
Support centre/customer service reports/community managers | Gather data on common problems with existing systems | Indirect
Email/social media | Gather data on common problems and poll demographic data and user preferences | Indirect
Web analytics | Measure, analyse and report web traffic data to understand current usage behaviours and performance | Indirect
A/B testing | Present two design options simultaneously to users and measure the performance of each | Indirect
Performance-based tasks | Users perform tasks that are timed or measured in some way | Direct
Task walkthrough/think aloud protocol | Users think out loud as they perform a task on a website | Direct
Card sort | Users sort website topics into groups to inform information architecture | Direct
Tree test | Users navigate a text-only version of the site structure to evaluate the findability of topics | Direct
Subjective rating | Users complete a questionnaire to indicate their reactions, opinions or priorities | Direct
Heuristic evaluation/expert review | Evaluators assess a website’s compliance with established usability principles | Direct by proxy
Cognitive walkthrough | Evaluators perform user tasks to assess a website’s ease of learning and understandability | Direct by proxy
Eye tracking | Measure where users gaze and fixate on a website while performing tasks | Direct
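
Several of the activities above (A/B testing, performance-based tasks, web analytics) produce quantitative data that needs simple statistical treatment before it can support a decision. As a minimal sketch only, assuming SciPy is available and using invented conversion counts rather than real results, the example below compares two design options from an A/B test with a chi-squared test:

    # Minimal sketch: comparing two design options from an A/B test.
    # The counts below are invented for illustration; SciPy is assumed available.
    from scipy.stats import chi2_contingency

    # Option A: 48 conversions out of 1000 visitors; Option B: 73 out of 1020.
    visitors    = {"A": 1000, "B": 1020}
    conversions = {"A": 48,   "B": 73}

    table = [
        [conversions["A"], visitors["A"] - conversions["A"]],
        [conversions["B"], visitors["B"] - conversions["B"]],
    ]

    chi2, p_value, dof, expected = chi2_contingency(table)

    for option in ("A", "B"):
        rate = conversions[option] / visitors[option]
        print(f"Option {option}: {rate:.1%} conversion")
    print(f"p-value: {p_value:.3f} (smaller values = stronger evidence of a real difference)")

In practice the metric, sample sizes and significance threshold would come from your own research plan; the point is only that A/B results should be compared statistically rather than by eyeballing raw counts.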

If a user experience specialist or usability consultant has not been budgeted for your project, Digital and Customer Experience Strategy can be consulted to recommend appropriate user research activities.

Conducting the research

Regardless of the techniques you are using, keep these ideas in mind while the research is underway.

Decide upfront on note taking, synthesis and analysis methods

Before you are saddled with reams of data to make sense of, decide who will capture the findings, how they will be captured, and what methods you will use to synthesise and analyse the data. The research activity may largely define these, but you will have a choice in how rigorous you need to be.

  • Do you have to share the results with a large number of people?
  • Do you need to write a formal report?
  • Does the data need to exist beyond the project?
  • Will the research be replicated at a later date?

It may be sufficient to scribble notes on butcher’s paper for immediate consumption, or you may need to record details in a spreadsheet to generate charts for a presentation. Prepare these materials before you begin.
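
If you do opt for a spreadsheet, even a simple structure (one row per observation, with columns for participant, theme and severity) makes later synthesis much easier. The sketch below is illustrative only: it assumes pandas and matplotlib are available, and the file and column names are invented. It tallies observations by theme and produces a chart suitable for a presentation:

    # Minimal sketch: tallying research observations recorded in a spreadsheet.
    # File name and column names are assumptions for illustration only.
    import pandas as pd
    import matplotlib.pyplot as plt

    # Expected columns: participant, theme, severity (e.g. 1 = minor, 3 = critical)
    observations = pd.read_csv("observations.csv")

    counts = observations.groupby("theme").size().sort_values(ascending=False)

    ax = counts.plot(kind="barh", title="Observations by theme")
    ax.set_xlabel("Number of observations")
    plt.tight_layout()
    plt.savefig("observations_by_theme.png")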

Stay impartial

Enter the research phase with an open mind. Avoid biasing the findings: stay objective and do not let what you expect or hope to see interfere with what you are actually seeing (or not seeing).

Discuss as you go

Making sense of a considerable amount of data at the end of a research activity can be overwhelming and confusing. Make time at regular intervals to meet with others and discuss what you are witnessing or what you have discovered. This way you can break the material into manageable chunks, get input on your approach and gain perspective on what really matters or is of interest.

Keep stakeholders informed

Take stakeholders on the journey with you by updating them on the research’s progress. Engaging stakeholders at this stage makes the research work tangible, can trigger additional lines of enquiry and whets their appetite for the final results.

You may wish to print out concepts or screens and display them somewhere prominent with observations called out, outline key findings in interim emails, or call short debrief meetings to share results.

If it is ethically appropriate to invite project stakeholders to observe user sessions, then do so. Witnessing user behaviour and comments firsthand is very powerful and persuasive, and helps enlist stakeholders as user-centred design (UCD) advocates.

Communicating research findings

The success of the user research depends on how well your team members and stakeholders understand the findings, believe them and know how to act on the recommendations. Communicating the results effectively is therefore very important.

Select formats that take into account the purpose of the document, your audiences and deadlines, and the documents that came before and will come after.

  • Quick findings bulleted in an email or document, annotated to wireframes or screenshots, listed on butcher’s paper
  • Detailed report with prioritised findings, thorough descriptions, severity ratings
  • Presentation
  • Prioritised findings and recommendations matrix
  • Workflow diagram

Concentrate on making the material easy to interpret and inspiring through:

  • an inventory of potential content, prioritised
  • compelling, concise writing
  • a consistent visual language for the document
  • images such as screenshots, charts, diagrams, video and photos
  • tangible suggestions realised through sketches and mockups
  • quotes from users
  • positive findings as well as weaknesses, issues or threats

Briefly outline the research methods if you need to build credibility with your audience or if the research will be replicated.

Consider the potential reactions of your team members and stakeholders to plan your communication.

  • An impatient executive will want the top 3 takeaways summarised at the start of a meeting or report.
  • Don’t let a data lover derail a meeting by plunging into detail. Rather, pre-brief them and agree to follow up on specifics at a later date.
  • Win over sceptics (who may not be on board with the research, or who may feel personally challenged by the findings) by starting with aspects of the results that do match their worldview, to build trust and open them up to hearing more.
  • To support the cultural shift towards UCD and data-driven decision making, refer to the research frequently and insist on evidence to back up perceived problems and design suggestions.
