The challenge

With more than 700,000 pages, GOV.UK presents a significant content management challenge, particularly when it comes to conducting manual audits and locating specific information.

The solution

GovSearch, a “business intelligence tool” that enables users to audit the GOV.UK content estate.

The process

Research

  • Current tools
  • User interviews
  • Data analysis

Define

  • User needs and types
  • Feature list and backlog
  • User flows and journeys

Design

  • Figma designs and clickable prototypes

Test

  • Usability tests
  • Iterate and re-test

Research

Current tools

  • Users currently rely on a mix of tools for their information needs, including Google Search, Google Analytics and the GOV.UK site search
  • These tools are largely manual and are not specific enough for content auditing
  • Once data has been collected from these various sources, it is organised and filtered in Excel

User interviews

After engaging with end users, I identified the primary issues in the existing workflow as follows:

  • The amount of time required to carry out these manual tasks is excessive
  • The substantial time dedicated to collating and analysing data leaves content designers with insufficient time for their regular duties
  • The absence of a unified source of truth leads to noticeable duplication of effort

Data analysis

To understand the metadata and data structure of GOV.UK pages, I focused on several key aspects (a short illustrative sketch follows this list):

  • Because content is published through multiple publishing applications, data formats and structures vary between pages
  • Taxonomy tags are complicated: there are a very large number of taxons, and some users are unfamiliar with them or have never used them when publishing
  • There are many different page types, each with its own set of metadata, which adds to the complexity
  • The taxonomy tags themselves are disorganised; keeping them consistent would require either a comprehensive site-wide audit or a more effective tagging solution
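
To make this variation concrete, here is a minimal TypeScript sketch that inspects a single page's metadata through the public GOV.UK Content API. The field names shown (document_type, schema_name, links.taxons) follow the published content item format, but the snippet is purely illustrative and is not part of GovSearch.

```typescript
// Illustrative only: fetch one page's content item and report its type and taxons.
interface ContentItem {
  title: string;
  document_type: string;               // e.g. "guide", "news_story", "answer"
  schema_name: string;                 // the publishing schema the page follows
  links?: { taxons?: { title: string }[] };
}

async function inspectPage(basePath: string): Promise<void> {
  const res = await fetch(`https://www.gov.uk/api/content${basePath}`);
  if (!res.ok) throw new Error(`Content API returned ${res.status} for ${basePath}`);

  const item = (await res.json()) as ContentItem;
  const taxons = item.links?.taxons ?? [];

  console.log(`${item.title} [${item.document_type} / ${item.schema_name}]`);
  console.log(
    taxons.length
      ? `Taxons: ${taxons.map((t) => t.title).join(", ")}`
      : "No taxons tagged" // untagged pages were a recurring finding
  );
}

inspectPage("/vat-rates").catch(console.error);
```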

Define

User needs and types

The user base is segmented into three distinct categories, each with varying levels of expertise and usage patterns:

  1. Novice: These are primarily senior leads who do not engage in content publishing regularly but occasionally explore the tool
  2. Mid-Level: This group consists of users who access the tool two to three times a month, primarily for investigating content patterns
  3. Power Users: These are individuals who utilise the tool daily for content auditing and to fulfil departmental requests

Feature list and backlog

From these findings, several were turned into actionable user stories and features, for example (two of them are sketched in code after this list):

  • A feature allowing users to search for specific links on pages
  • The capability for users to customise and rearrange headers on the results page for enhanced usability
  • Functionality that enables users to view the frequency of keyword occurrences across content
  • Advanced options for sorting and filtering results to streamline the data analysis process
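
As a toy illustration of what the link-search and keyword-frequency features above would do, the sketch below counts keyword occurrences and checks for a specific link in a page's HTML. It is an assumed, simplified example of the behaviour, not GovSearch's implementation.

```typescript
// Illustrative only: naive keyword counting and link detection over raw HTML.
function keywordCount(html: string, keyword: string): number {
  const text = html.replace(/<[^>]+>/g, " ").toLowerCase(); // crude tag stripping for the sketch
  const escaped = keyword.toLowerCase().replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
  return (text.match(new RegExp(`\\b${escaped}\\b`, "g")) ?? []).length;
}

function containsLink(html: string, href: string): boolean {
  // Matches href="..." or href='...'; good enough for a sketch.
  return new RegExp(`href=["']${href}["']`).test(html);
}

// Hypothetical usage against fetched page markup:
const html = `<p>The standard VAT rate applies. See <a href="/vat-rates">VAT rates</a>.</p>`;
console.log(keywordCount(html, "rate"));       // 1 ("rates" is a different word form)
console.log(containsLink(html, "/vat-rates")); // true
```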

User flows and journeys

Here is an example of a simple user journey for this app:

[Image: user journey wireframes]

Design

Using Figma, I quickly created initial design mockups, then built clickable versions of those designs with the GOV.UK Prototype Kit. This allowed rapid iteration and early user feedback on the proposed interfaces.

A/B testing designs

I tested different ways of presenting the results page: a table view versus a show/hide summary view.

[Image: A/B test wireframes]

Clickable prototypes

Prototypes that closely mimic the actual product in appearance, functionality and interaction produce more accurate and useful test results. By integrating real data into the prototype, I was able to run user testing that yielded high-quality insights (a rough sketch of how real data could be pulled in follows the screenshot below).

[Image: GovSearch prototype screen]
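
As one way real content could be fed into a prototype, the sketch below pulls live results from the public GOV.UK Search API and writes them to a JSON file that prototype templates could render. The endpoint is real, but the file path and the seeding approach are assumptions for illustration, not a record of how the GovSearch prototype was actually built.

```typescript
// Illustrative only: seed a prototype with live GOV.UK search results.
import { writeFileSync } from "node:fs";

interface SearchResult {
  title: string;
  link: string; // base path of the page, e.g. "/vat-rates"
}

async function seedPrototypeData(query: string): Promise<void> {
  const url = `https://www.gov.uk/api/search.json?q=${encodeURIComponent(query)}&count=20`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Search API returned ${res.status}`);

  const { results } = (await res.json()) as { results: SearchResult[] };

  // Hypothetical data file that the prototype's templates would read and render.
  writeFileSync("app/data/results.json", JSON.stringify(results, null, 2));
  console.log(`Saved ${results.length} results for "${query}"`);
}

seedPrototypeData("self assessment").catch(console.error);
```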

Test

Method

Usability testing

Discoveries and insights

  • Preference for Tables: Users preferred tables because they allow easy comparison of each row, which is crucial for analysing results
  • Customisation Importance: Given the diversity of user types and their unique usage patterns, the ability to customise the interface is essential
  • Clarification of Labels: Some users experienced confusion, particularly with labels like "All" versus "Any keywords", indicating a need for more explicit explanations
  • Flexible Customisation: Users would benefit from being able to tailor their views and date ranges according to the specific task at hand and the type of content being analysed
  • Spacious Table Layout: There is a desire for a more spacious table view for results. Some users felt that the filters overshadowed the results
  • Advanced Filtering: Users expect to be able to refine their search results further by document type; a rough sketch of this kind of filtering and column selection follows this list
  • Taxonomy Flexibility: Users focusing on taxons expressed a need for greater flexibility in choosing and customising the metadata to include in their table views
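
To show what the filtering and column-customisation requests could mean in practice, here is a small sketch that filters result rows by document type and builds a view containing only the metadata columns a user has chosen. The row shape and field names are assumptions for illustration, not GovSearch's data model.

```typescript
// Illustrative only: document-type filtering and user-selected columns over result rows.
interface ResultRow {
  title: string;
  documentType: string; // e.g. "guide", "news_story"
  taxons: string[];
  lastUpdated: string;  // ISO date string
}

function filterByDocumentType(rows: ResultRow[], types: string[]): ResultRow[] {
  return rows.filter((row) => types.includes(row.documentType));
}

function selectColumns(rows: ResultRow[], columns: (keyof ResultRow)[]): Record<string, unknown>[] {
  // Build a trimmed-down view containing only the columns the user chose.
  return rows.map((row) => {
    const view: Record<string, unknown> = {};
    for (const col of columns) view[col] = row[col];
    return view;
  });
}

// Hypothetical usage: guides only, showing just the columns a taxon-focused user wants.
const rows: ResultRow[] = [
  { title: "VAT rates", documentType: "answer", taxons: ["Money"], lastUpdated: "2024-01-10" },
  { title: "Set up a business", documentType: "guide", taxons: ["Business and self-employed"], lastUpdated: "2023-11-02" },
];
console.log(selectColumns(filterByDocumentType(rows, ["guide"]), ["title", "taxons"]));
```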

Next steps

  • Create tickets to address new features and prioritise
  • Monitor feedback and analytics
  • Build and test new features

Three things I learned

Reflecting on this project, I learned a few things:

  • Priorities can often change
  • Good design makes things obvious
  • Good design puts users in control