The challenge
GDS aimed to address a persistent challenge: the need for rapid analysis of user feedback without sacrificing precision and accuracy.
The solution
Design an AI-powered feedback analysis tool that automatically categorises feedback data and presents it in an intuitive, user-friendly interface.
The process
Research
- How AI can help
- User interviews
- Data analysis
Define
- Problem statement
- User needs
- User journeys
Design
- Figma designs and clickable prototypes
Test
- Usability tests
- Iterate and re-test
Research
How AI can help
Amid the growing excitement around AI, I set out to understand its capabilities, focusing on how it could be applied to our particular use case.
- Natural Language Processing (NLP) can categorise feedback and extract the underlying intent, improving how we understand and respond to user input
- Implementing AI-driven solutions can involve significant financial investment
- Used appropriately, an AI-driven solution can be highly effective
- NLP is also useful for detecting sentiment and filtering spam from user input (see the sketch after this list)
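To make these findings concrete, here is a minimal sketch of the kind of enrichment NLP can add to a single feedback comment: a topic, a sentiment and a spam flag. It is deliberately naive and keyword-based, and the record shape and rules are illustrative assumptions, not the model behind the tool.

```typescript
// Illustrative record shape: field names are assumptions, not the GOV.UK data model.
interface EnrichedFeedback {
  text: string;
  topic: string;
  sentiment: 'positive' | 'neutral' | 'negative';
  isSpam: boolean;
}

// A deliberately naive, keyword-based stand-in for an NLP model.
function enrich(text: string): EnrichedFeedback {
  const lower = text.toLowerCase();

  // Crude spam heuristic: links and common spam phrases.
  const isSpam = /https?:\/\/|buy now|click here/.test(lower);

  // Crude topic rules; a real model would learn these categories from the data.
  const topic =
    /tax|hmrc/.test(lower) ? 'Tax'
    : /passport|visa/.test(lower) ? 'Passports and visas'
    : 'Uncategorised';

  // Crude sentiment: count positive vs negative words.
  const negatives = ['confusing', 'broken', 'error', 'cannot find'];
  const positives = ['helpful', 'easy', 'clear', 'thanks'];
  const score =
    positives.filter((w) => lower.includes(w)).length -
    negatives.filter((w) => lower.includes(w)).length;
  const sentiment = score > 0 ? 'positive' : score < 0 ? 'negative' : 'neutral';

  return { text, topic, sentiment, isSpam };
}

// Example:
// enrich('The passport renewal page is confusing')
// => { topic: 'Passports and visas', sentiment: 'negative', isSpam: false, ... }
```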
User interviews
After completing a series of user interviews, I gathered several valuable insights:
- Users want headline information first, with the option to explore topics and comments in more depth
- Improvements to the quality of feedback information must be matched by users' capacity to act on it
- Outputs should convey sentiment, volume and context through colour, proportional representation and illustrative quotations
Data analysis
Because GOV.UK feedback comes from many different sources, the data is highly inconsistent and required a thorough, detailed analysis.
- There are six primary sources of feedback, each with its own mechanism for user input
- Spam is a problem: the sheer volume of feedback makes it difficult to identify manually
- The feedback is uncategorised, which makes the responses hard to assess and interpret efficiently
Define
Problem statement
GOV.UK receives feedback from various online sources at a rate of roughly 200,000 records a month, a volume that is difficult to manage and analyse. Scrutinising this unsorted, uncategorised feedback is time-consuming and costly, making the extraction of pertinent user insight a resource-heavy task.
To tackle this, the Data Insights Team has developed a Natural Language Processing (NLP) pipeline. It centralises feedback from the different sources, automatically classifies each record into a topic, detects and filters out spam, and assesses the sentiment of each comment. The result is data that is far easier to navigate and that offers genuine insight into user perceptions and experiences.
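As a rough illustration of the reporting side of that pipeline, the sketch below takes records that have already been classified (as in the earlier sketch) and rolls them up into per-topic counts with a sentiment breakdown. The field names and types are assumptions for illustration only, not the Data Insights Team's actual schema.

```typescript
// Sketch of the reporting stage: aggregate classified feedback into per-topic
// summaries. Field names and types are illustrative assumptions only.
type Sentiment = 'positive' | 'neutral' | 'negative';

interface ClassifiedFeedback {
  source: string;      // one of the six GOV.UK feedback sources
  receivedAt: string;  // ISO 8601 date
  text: string;
  topic: string;
  sentiment: Sentiment;
  isSpam: boolean;
}

interface TopicSummary {
  topic: string;
  total: number;
  bySentiment: Record<Sentiment, number>;
}

function summariseByTopic(records: ClassifiedFeedback[]): TopicSummary[] {
  const summaries = new Map<string, TopicSummary>();

  for (const record of records) {
    if (record.isSpam) continue; // spam is filtered out before reporting

    const summary = summaries.get(record.topic) ?? {
      topic: record.topic,
      total: 0,
      bySentiment: { positive: 0, neutral: 0, negative: 0 },
    };
    summary.total += 1;
    summary.bySentiment[record.sentiment] += 1;
    summaries.set(record.topic, summary);
  }

  // Largest topics first, so headline information surfaces at the top.
  return [...summaries.values()].sort((a, b) => b.total - a.total);
}
```

Sorting by volume reflects the interview finding that users want headline information first, with the option to drill down afterwards.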
User needs
- I need actionable feedback information
- I need to know the causes behind the numbers
- I need to base decisions on at least a year's worth of data to avoid issues relating to seasonality
- I need to know what specifically is confusing on GOV.UK pages
User journeys


Design
High-fidelity prototypes
Built using the GDS prototyping kit and Chart.js.
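To show how the prototype charts were wired up, here is a hedged reconstruction of a Chart.js bar chart of feedback volume by topic, split by sentiment. The canvas ID, topics and figures are placeholder assumptions rather than the prototype's real data, and the colours simply approximate the GOV.UK palette.

```typescript
import Chart from 'chart.js/auto';

// Placeholder data; in the prototype this would come from the summarised feedback.
const topics = ['Tax', 'Passports and visas', 'Benefits'];
const positive = [120, 80, 60];
const negative = [45, 150, 30];

// Assumes the prototype page contains <canvas id="feedback-chart"></canvas>.
const canvas = document.getElementById('feedback-chart') as HTMLCanvasElement;

new Chart(canvas, {
  type: 'bar',
  data: {
    labels: topics,
    datasets: [
      { label: 'Positive', data: positive, backgroundColor: '#00703c' },
      { label: 'Negative', data: negative, backgroundColor: '#d4351c' },
    ],
  },
  options: {
    responsive: true,
    scales: {
      x: { stacked: true },
      y: { stacked: true, title: { display: true, text: 'Feedback volume' } },
    },
  },
});
```

A stacked bar keeps both volume and the proportion of sentiment visible at a glance, which matches what the interviews suggested about using colour and proportional representation.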

Test
Usability testing
- The terminology used in labels needs refinement: users struggled to understand the various sources and terms like 'Run ID' and 'topic analysis'
- Users preferred to specify their own time frame, for example with a date picker or similar control
- Some users were confused by the distinction between the 'advanced' and 'explore' options, which seemed too similar in functionality or purpose
- Users need to view feedback metrics both for individual pages and as an overall summary
- Some users wanted to view and compare data from previous months
- Overall, users showed notable interest in and enthusiasm for the prototype
- One user said: "I love this, when can we have it!"
Next steps
- Use the insights from this phase to design and build the next iteration
- Begin extracting categories from the data using Natural Language Processing (NLP) techniques
- Outline our strategy to develop a Minimum Viable Product (MVP)
What I learned
Reflecting on this project, I learned a few things:
- Use realistic data and narratives rather than placeholder text like 'lorem ipsum'; users tend to focus on the actual content rather than the underlying concepts
- Artificial Intelligence is a permanent fixture in our technological landscape, and adapting to its presence and capabilities is essential
- Applied correctly, Artificial Intelligence is an incredibly useful tool