Case Study

How to optimise online feedback prompts as part of a UX Strategy

My Role - UX Research Consultant at Kainos.

Responsible for leading all UX research activities within the product team.

Client - UK Department for Transport (DFT)

Kainos operates a web-based Street Manager service for the DFT. Street Manager enables local councils and utility companies to coordinate road works across England.

Project Goal - Understand the optimal methods and frequency for capturing user feedback in Street Manager.

Project Time Frame - Full time for two weeks

Project Summary

The Street Manager product team wanted to implement smiley face feedback prompts in the Street Manager system to collect ongoing feedback on how satisfied users are with specific features. I was asked to research the viability of this proposal.

I had two weeks to complete the research and formulate actionable recommendations. 

I held online research interviews with 5 users, presenting a variety of online feedback prompts for discussion.

I used the findings to create guidelines for an overall UX feedback strategy. This included:

  • A list of heuristics to apply when designing feedback prompts

  • A visualisation of optimal times to present feedback prompts

  • A model of motivation for interaction with feedback prompts

The results were very well received and I was asked to present the research to the whole company so that other areas of the business could utilise the recommendations.

The Problem to be Solved

Street Manager users are people who work for either local councils or utility companies and use Street Manager as part of their job. This discovery was about how the Street Manager product team could fulfil the needs of the organisation without detriment to the users. Being asked for feedback is not a user need, even though users ultimately benefit when feedback shapes improvements. Therefore, I crafted a problem statement around the needs of Street Manager even though research sessions would be conducted with end users.

Problem Statement - Street Manager

The DFT needs a way to measure Street Manager user satisfaction and collect regular meaningful feedback without becoming an annoyance or noise that users ignore.

Without regular, reliable measures of user satisfaction it is difficult to precisely identify what improvements need to be made to the user experience, which features to prioritise for development, and the impact of new features once they are rolled out.

Hypotheses

1. Participants will have a clear preference for one or more specific examples of feedback collection methods and a clear dislike for others.

2. A 'Goldilocks' limit exists for how often feedback prompts can appear without stimulating a negative response.

Research Method

I conducted one-to-one online directed interviews. Each session lasted around 45 minutes and participants signed consent forms to allow recording. 

The sessions took place in two parts. In the first part, participants were asked about previous experiences with online feedback prompts, both within and outside of Street Manager. In the second part, participants were presented with various examples of feedback prompts and asked to talk through how they felt about each one and how they might react to it appearing while they were working in Street Manager.

Examples of feedback prompts shown to participants included:

  • Pictorial, e.g. smiley faces, thumbs up/down, emoticons

  • Rating scales

  • Yes/No buttons

  • Text links

  • Text boxes

  • Surveys

Research Aims

Part 1 - Past Experiences

  1. Understand the user’s feelings about feedback prompts in general

  2. Understand the contexts in which they have, or have not, interacted with prompts in the past

  3. Mentally prepare them for the second part by getting them to think about the variety of forms feedback prompts can take and how they have felt and behaved when encountering them in the real world

Part 2 - Specific Examples

  1. Understand which prompts users like or dislike and why

  2. Understand what would influence whether or not they would engage with a prompt in Street Manager

  3. Understand their expectations around what should happen after providing feedback

Recruitment

The product team already had a list of potential participants on standby, so I was able to recruit 5 users within 3 days.

Usually, I would prioritise location spread as a key variable when conducting user research for Street Manager because road types and transportation needs differ across England. For this project, however, I prioritised job type, because the tasks a user completes are likely to have a greater influence on how they respond to feedback prompts appearing during their work.

There was a spread of gender across participants and one participant was neurodivergent.

Limitations & bias

There were some limitations and potential biases that are important to note.

  • All the participants held particularly positive views about Street Manager, and this may have skewed the results with a predisposition towards positivity for all types of potential feedback methods in Street Manager.

  • Ethnicity and nationality were not recorded but all participants appeared to be White British. Cultural background and English as a first language might have influenced participant responses to feedback prompts.

  • Overall, 5 participants aren’t enough to glean universal and comprehensive insights. I would have liked to include more users with accessibility needs such as colour blindness and neurodiversity, more users with English as a second language, and users across a range of age groups, as there are likely to be generational differences in how online feedback prompts are perceived.

Four Key Findings

  1. There are clear and specific motivators and demotivators that influence why people do/don't provide feedback

  2. The time of day a user sees a feedback prompt will heavily influence whether they engage with the prompt or not

  3. Users make conscious decisions about whether the question being asked is meaningful and whether their response will be used to add value

  4. The context of when a prompt is presented has a big influence on user engagement with feedback prompts

Hypothesis Validation

My initial hypotheses were only partially confirmed. There was no unanimous preference for a particular type of prompt, but there were consistent dislikes, particularly around prompts that participants felt were inappropriate for a workplace system.

A Goldilocks limit does appear to exist for the frequency of prompts, and users unanimously agreed on what constitutes ‘too frequent’. However, there was no clear consensus on the optimal frequency for feedback prompts.

12 Key Points

1. Manage Expectations

Negative past experiences with feedback prompts, both inside and outside of Street Manager, lead to a general suspicion of all feedback requests. Users expect that one feedback request will turn into more and more additional requests, which they dislike.

Knowing exactly what to expect when submitting feedback puts users more at ease.

2. Provide a way to easily opt out

All participants stated that there should always be the option to easily dismiss feedback prompts. This is particularly important for prompts that appear in the centre of the screen and prevent the user from continuing a task.

There is a risk that users will automatically dismiss everything, but if a prompt is not annoying and is easy to interact with, users are more likely to engage with it.

3. Make sure users feel appreciated and can see the impact of their effort

Seeing the impact of their feedback motivates users to give feedback on a regular basis. Users reported feeling good when they noticed a change they had a part in.

Users are more likely to give feedback again if they receive a subsequent update and if they receive a thank you or acknowledgement for their contribution.

4. Keep questions relevant with clear value to the user

When presented with a feedback prompt users make conscious decisions about whether the question being asked is worth answering. Users are more likely to answer a question if they judge it to be relevant and are confident that action will be taken based on their feedback.

Users are less likely to provide feedback if they feel it is a one-way street with nothing to show for the time invested.

5. Keep prompts appropriate to the environment they appear in

Preferences for specific types of feedback prompt depend on the context and need to be in keeping with the environment in which they appear. There was a strong feeling that emoticons are inappropriate for a professional environment, and although most users liked smiley faces there was concern about whether smileys belong in Street Manager.

The device used can also affect people's perception of the feedback method: smileys are regarded as more appropriate on a mobile device than on a desktop or laptop.

6. Keep feedback prompts relevant to the immediate task

There was a consensus that feedback prompts need to be relevant to the immediate task; when there is a delay in asking for feedback, people are less likely to engage.

Users are least likely to complete a feedback request if it comes via email.

7. Eliminate ambiguity

Users are less likely to respond if it is unclear which specific task the question is about or if the phrasing of the question is ambiguous. Any scales used to measure feedback need to have clear indicators of what the points on the scale mean.

Users tend to assume that giving feedback means giving negative feedback. Participants suggested that if prompts make clear that feedback can be both positive and negative, they would be more likely to leave balanced feedback.

8. Make it quick and easy to respond

Users are most likely to engage with prompts that make it quick and easy to provide feedback, as long as the prompt is relevant to the immediate task. This is particularly true of the busiest people, who often described themselves as "firefighting" and for whom giving feedback is not a priority. One-click feedback increases the likelihood of engagement from these users.

9. Balance prompt frequency with return on investment

There was no consensus on the perfect frequency of feedback requests, but participants unanimously agreed that daily prompts are too much, as are prompts after every task.

Most participants said that monthly prompts seemed reasonable.

Overall, the optimal frequency depends on the return on investment. A higher frequency of requests will be viewed more positively if users experience equally frequent changes that make their lives better.

10. Make prompts noticeable

Most participants suggested that feedback prompts should be large and in the centre of the screen because they will be too easy to miss otherwise. This was backed up by the fact that hardly any participants had ever noticed the feedback prompt that sits in Street Manager today.

11. Utilise a human element

Most participants were familiar with the Street Manager product owner because he runs a Street Manager YouTube channel. The connection users felt towards a familiar personality on the product team incentivised them to volunteer for research sessions.

Having a human connection appears to be a powerful influence on user participation in giving feedback. Participants reported a preference for giving feedback directly to a specified contact, mainly because this made them feel listened to and that there was a greater chance of action being taken.

Participants reported feeling good when someone took the time to acknowledge or thank them for feedback and said this was a big motivator to provide feedback again in the future.

12. Keep prompts accessible

Beyond the common happy and sad smileys, it can be difficult for neurodivergent thinkers to interpret less common emoticons. To be more inclusive, and to optimise the value of potential feedback, it is better to limit visual feedback prompts to well-established icons or to include a written description alongside the images.
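To make these points more concrete, the sketch below shows one way a prompt reflecting several of them (one-click responses, an obvious way to opt out, icons paired with written labels, and an acknowledgement afterwards) might be put together. It is a minimal, illustrative TypeScript sketch, not the prototype the UI designer built for Street Manager; all names and markup are assumptions.

```typescript
// A minimal, illustrative sketch (not Street Manager's actual code) of a
// feedback prompt that reflects the research findings: one-click responses,
// an obvious dismiss action, icons paired with written labels, and a
// thank-you acknowledgement once feedback is given.

type FeedbackValue = 'satisfied' | 'neutral' | 'dissatisfied';

interface FeedbackPromptOptions {
  question: string;                          // keep the question task-specific and unambiguous
  onSubmit: (value: FeedbackValue) => void;  // a single click records the response
  onDismiss: () => void;                     // users can always opt out
}

function renderFeedbackPrompt(container: HTMLElement, opts: FeedbackPromptOptions): void {
  const panel = document.createElement('section');
  panel.setAttribute('role', 'region');
  panel.setAttribute('aria-label', 'Feedback');

  const question = document.createElement('p');
  question.textContent = opts.question;
  panel.appendChild(question);

  // Icons are paired with written descriptions so they remain clear for
  // neurodivergent users and for anyone using a screen reader.
  const choices: Array<{ value: FeedbackValue; icon: string; label: string }> = [
    { value: 'satisfied', icon: '🙂', label: 'Satisfied' },
    { value: 'neutral', icon: '😐', label: 'Neither satisfied nor dissatisfied' },
    { value: 'dissatisfied', icon: '🙁', label: 'Dissatisfied' },
  ];

  for (const choice of choices) {
    const button = document.createElement('button');
    button.type = 'button';
    button.textContent = `${choice.icon} ${choice.label}`;
    button.addEventListener('click', () => {
      opts.onSubmit(choice.value);
      // Acknowledge the contribution and set expectations about what happens next.
      panel.textContent = 'Thank you. Your feedback helps us decide what to improve next.';
    });
    panel.appendChild(button);
  }

  // An easy, non-blocking way to opt out.
  const dismiss = document.createElement('button');
  dismiss.type = 'button';
  dismiss.textContent = 'Not now';
  dismiss.addEventListener('click', () => {
    opts.onDismiss();
    panel.remove();
  });
  panel.appendChild(dismiss);

  container.appendChild(panel);
}
```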

A Model of Feedback Motivation

I placed motivation to engage with feedback prompts along a scale from resistant to eager. Previous experiences giving feedback combined with how a feedback strategy is executed will impact both the willingness of users to engage with feedback prompts and the quality of the feedback they provide.

At each extreme, some users will either always engage because they enjoy giving feedback or always avoid engagement because of previous poor experiences or lack of time. It will be difficult to motivate the most resistant users and trying to force them will likely result in poor quality feedback.

Every experience of giving feedback influences whether a person moves towards the eager end of the scale or back towards the resistant end. A good feedback strategy will motivate more resistant people to engage with feedback requests and move them further towards the eager end.

An essential aspect of a good feedback strategy is clear communication with users so that they know they have been listened to and what improvements are in the pipeline. Without this, some users will use feedback prompts for ‘frustration spam’, where they overload the system with repeated requests for the same thing or send abusive messages.

Model of motivation to engage with online feedback prompts

Perfect Timing

The time of day users see a feedback prompt will impact whether they engage with or dismiss it and whether they feel annoyed by the request. There was a consensus that a prompt for feedback should never interrupt a task in progress.

I was able to create a guide to rolling out feedback prompts in Street Manager. This could be empirically fine-tuned and validated during testing.

Optimal times to prompt for online feedback

The Challenge of Designing a User-friendly Feedback Strategy

With so many specific aspects to keep in mind when designing feedback prompts, I wanted to create a simple tool that the UI designer could refer to easily and that anyone seeking to maximise the impact of a future feedback strategy could use.

Ultimately there were 13 key points and I played around with a visual representation and the creation of an acronym: C.A.C.H.E. O.F. V.E.N.T.E.D.

In the end, I didn’t feel that these worked particularly well, as too much context was missing and it seemed unlikely that anyone would refer to them. I wanted to create a practical tool that anyone could use, for Street Manager and beyond.

Examples of brainstorming how to present the new feedback strategy in a usable visual format

Feedback Strategy Heuristics - A Practical Tool

I drew inspiration from Nielsen Norman's usability heuristics and created a checklist of heuristics to work through when designing feedback prompts and an online feedback strategy. I set up a spreadsheet split into two sections, strategy and design, because the optimal results of any feedback approach depend on both elements.

The spreadsheet includes question prompts for each heuristic and recommendations for things to do and things to avoid.

The Feedback Strategy Heuristics sheet can be accessed here.
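As a rough illustration of how the checklist is organised, the sketch below represents a couple of heuristics using the same structure as the spreadsheet: a section (strategy or design), a question prompt, and do/avoid recommendations. The entries are paraphrased from the findings above; the field names and values are illustrative assumptions, not the spreadsheet's actual contents.

```typescript
// Illustrative sketch of the checklist's structure, not the real spreadsheet:
// each heuristic sits in a strategy or design section and carries a question
// prompt plus "do" and "avoid" recommendations.

interface FeedbackHeuristic {
  section: 'strategy' | 'design';
  heuristic: string;
  questionPrompt: string;
  doThis: string[];
  avoidThis: string[];
}

const exampleHeuristics: FeedbackHeuristic[] = [
  {
    section: 'strategy',
    heuristic: 'Balance prompt frequency with return on investment',
    questionPrompt: 'How often will users see this prompt, and what will they see in return?',
    doThis: ['Tell users about changes made as a result of their feedback'],
    avoidThis: ['Prompting daily or after every task'],
  },
  {
    section: 'design',
    heuristic: 'Provide a way to easily opt out',
    questionPrompt: 'Can the user dismiss this prompt with a single, obvious action?',
    doThis: ['Include a clearly labelled dismiss option'],
    avoidThis: ['Blocking the task in progress until feedback is given'],
  },
];

// A designer could filter the checklist by section while working, e.g.:
const designChecks = exampleHeuristics.filter((h) => h.section === 'design');
console.log(designChecks.map((h) => h.questionPrompt));
```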

The UI designer used the checklist to design a prototype for user testing and said he found it very useful for working through and checking that his designs were in line with what was learned from the research.

Tool for designing a UX feedback strategy

Screenshot of the heuristics spreadsheet

Outcomes

My time on the project finished before I could conduct testing, but I completed a handover with the team. This included a testing plan to validate whether users behave consistently with how they said they would. Once testing is complete, feedback mechanisms will be implemented in the Street Manager system to inform future UX improvements and areas of focus.

Before I left, I presented the results to the dev team and to 1000+ Kainos employees at a show and tell, and held 1:1s with the product team's UI designer. The show and tell was so well received that other areas of the business used my research to implement feedback strategies elsewhere in the organisation.

Project Reflections

This was never supposed to be a significant project, so the volume of knowledge gained was really surprising. I had expected there to be plenty of existing research out there that I could draw on. Instead, I found lots of articles with general best practices for designing online feedback prompts, but a lack of data backing up the reasoning behind those recommendations. In particular, it was a challenge to find recommendations specifically for feedback prompts within a professional system.

This small piece of research uncovered a huge amount of insight into how users perceive requests for feedback and why it is important to tailor feedback prompts to the audience as part of an overall feedback strategy. This is a good example of how a quick piece of research can glean great insights into an area we take for granted, and can inform a strategic evidence-based approach.

I personally enjoyed being able to create testable models of motivation and ideal timings that could be validated and rolled out across the whole organisation. It was also very satisfying to see the UI designer use the Heuristics checklist successfully to create a prototype for user testing.