The GOV.UK Design System contains styles, components and patterns to help teams across government create user-centred digital services. We have a community backlog of contributions: ideas for new components and patterns sent to us by people across government. As part of our role as a team, we look through this backlog to decide what to work on next.
The backlog contains both new items and iterations of existing components, patterns and styles. Working out what comes next on our list of work is hard, and we want to deliver the things that will help the community most.
The process of creating a prioritised list with the working group was not originally set up to be repeatable. To reduce the amount of work we ask the working group to do, we decided to run prioritisation within the team, making sure we can work on the things our users need most. We plan to run the prioritisation process once a year.
One of the priorities of the GOV.UK Design System team is to create and iterate components, styles and patterns. Knowing what to work on next is not a simple process, and prioritisation helps us decide where to focus our efforts.
Next we needed to understand how we were going to prioritise the work. To do this we mapped out a list of possible criteria to assess the backlog items against. Examples include things that:
The key criteria for new things differ from those for iterations. For new things, we wanted to prioritise backlog items that will be used by the most services, and to make sure we add things that are flexible and useful in different contexts. By doing this, we can show how valuable the design system can be. For iterations we have a broader range of criteria. For example, we may need to work on an existing component or pattern to make sure it remains accessible.
We decided to conduct a user survey. Its main purpose was to ask users what they had expected to see, and would like to see, in the GOV.UK Design System. Their answers also helped us see what other design systems are doing, and gave us the opportunity to assess which components and patterns were most needed.
We determined that open survey questions would give us more genuine answers. Closed questions risk people choosing the items they feel they should choose, which means we would not get an accurate picture of what users need.
Several different disciplines were involved in designing and iterating the survey. We decided on 6 questions, with the first designed as a screening question: we asked users whether they had used the design system to design and build services within the past 2 years. Those who chose no were told that the survey wasn't suitable for them and were pointed to other ways they could get involved.
We gave users 2 weeks to fill out the survey, sharing it on our mailing list and in several key Slack channels. We chose these channels because we specifically wanted to hear from people who use the design system, and they allowed us to reach a representative cross-section of our community.
By the time the survey closed we had:
Once the survey closed, we used the scoring system we had devised to work out which priorities to work on next. We took the responses from the community and grouped them into overall topics, then boosted a topic's score if it met the following criteria:
We then asked the team to do a more in-depth analysis of the top 12 results as part of a workshop.
We will publish the results on the community pages on GitHub and on the GOV.UK Design System website. With the help of the community, we will create a backlog that reflects its needs.