My digital journey through Brighton & Hove City Council

Today is my last day managing the Digital Communications Team at Brighton & Hove City Council. As I move on to pastures new, I’ve been reflecting on the two years I’ve spent working here.

The digital landscape is constantly changing, as is the way that residents want to interact with the council.

Customer First in a Digital Age

This change is something the council have been addressing through the Customer First in a Digital Age (CFDA) programme, and it’s been a key part of what I’ve been involved in. I’ve been proud to lead a talented and passionate UX (user experience) and content team, who are striving to improve the council’s digital offering to the residents and businesses of the city.

Some of the changes I’ve been involved in delivering include:

Moving on

As the Digital Communications Team and the CFDA programme move forward and I move on, I’m confident this digital customer focus will only increase, and that the team will continue to deliver digital experiences that are truly user focused.

Ben Hills-Jones, signing off for the last time….


Usability testing – how we run our sessions

We need to ensure that our digital services are accessible and easy to use.

We do usability testing to find out how people use the digital services we design and build. We record how people use our website and forms as they attempt to complete a set task and we ask for their feedback. This helps us understand how people use things in ‘real life’ and if this matches our expectations and designs. We take the feedback from usability testing and use it to improve the user experience (UX) as we build.

I wanted to share some things I learned from our most recent usability testing sessions.

[Image: A work in progress – our “view your Council Tax account” design]

Alex and I carried out some usability testing on a new online service that will enable people to get information about their Council Tax account online.

I was really happy and felt the sessions went well. We met some people, ran through some scenarios and learnt some very useful stuff about how people were navigating through the task.

At the end of the day I wrote up some brief notes on how the sessions went and the things that I wanted to take forward to use in the next usability testing session. I smiled as I typed them up. Life was good. It was a beautiful spring afternoon. The sun shone. I ate lunch in the park.

However, and there’s no easy way to break this, the next day back in the office I was devastated to find that we had lost all the audio from our testing sessions (in fact, it had never recorded). This was despite checking all our equipment and set-up before each session. (Cue very sad faces, and disbelief at how this could possibly happen.) Pausing the recording software before each participant began was the cause of the problem.

Tips for running usability testing

Check your recordings after each participant

The lesson here isn’t to ‘not pause software’. It’s that if we had checked the quality of the previous session, before we started the next one, we would have spotted the error straight away, and been a lot less sad.

If you carry out usability testing, and are looking to learn from this blog, then the one thing I would like you to take away is to check your recordings. Check them after each participant and catch your unexpected equipment/software failures before they ruin your week.
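
If it helps, here’s a rough sketch of the kind of quick check I mean. This isn’t our actual set-up (we use screen recording software on a MacBook), and the file name, the use of Node.js with TypeScript and the reliance on ffprobe from FFmpeg are all just assumptions for the example. The point is simply to confirm, between participants, that the file you just recorded really does contain audio.

```typescript
// A rough sketch only: check that a recording file actually contains an audio stream.
// Assumes Node.js with TypeScript and that ffprobe (part of FFmpeg) is installed.
import { execFileSync } from "node:child_process";
import { existsSync } from "node:fs";

function hasAudioStream(recordingPath: string): boolean {
  if (!existsSync(recordingPath)) {
    return false; // a missing file is also a failure worth catching straight away
  }
  try {
    // ffprobe prints one line per audio stream; empty output means no audio was captured
    const output = execFileSync(
      "ffprobe",
      [
        "-v", "error",
        "-select_streams", "a",
        "-show_entries", "stream=codec_type",
        "-of", "csv=p=0",
        recordingPath,
      ],
      { encoding: "utf8" }
    );
    return output.trim().length > 0;
  } catch {
    return false; // ffprobe couldn't read the file, so treat it as a failed recording
  }
}

// Hypothetical file name: run a check like this between participants,
// and stop to fix the set-up before the next session if it warns.
const file = process.argv[2] ?? "participant-01.mov";
console.log(
  hasAudioStream(file)
    ? `OK: ${file} contains audio`
    : `WARNING: ${file} has no audio`
);
```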

You may find that you need to make smaller improvements, like adjustments to lighting or microphones.

We were able to salvage the situation with our notes, and the screen recordings without audio were more enlightening than I thought they would be.

Finally, here are the four other things that I felt were key for helping a day of usability testing run smoothly.

Devices

We use a laptop (MacBook Pro) for our usability testing as that’s where our screen recording software is installed.

We asked about internet use when recruiting participants for our testing, but we didn’t ask people about their experience of different devices.

However, some of our participants had never used a Mac before and appeared a little daunted. We added a simple question to our script, asking “do you usually use a Mac or a PC?”. Asking this question at the start of the session allowed us to reassure our participants that they didn’t need to know anything about Macs to complete their task.

Are you sitting comfortably?

Explicitly inviting participants to “settle in” made a real difference. Things to consider here are the position of the mouse and the keyboard. I positioned the mouse below the centre of the keyboard, which prompted participants to move it to where they needed it to be. I also encouraged people to adopt a good posture (mostly pulling the chair up closer to the table). As a bonus for us, sitting closer to the screen captured a better recording on the laptop’s webcam. (If we’d had an external webcam, we could have adjusted its position instead.)

Encouraging “thinking out loud”

At the start of each session we ask our participants to describe what they are thinking, doing and looking at as they go through the task. However, sometimes this doesn’t happen.

Reminding participants, as they started the task, made a big difference to the amount of verbalisation. It’s really helpful to us when participants verbalise what they are doing and thinking as they complete the task. It provides an additional layer of feedback and creates a powerful impact when we share user testing sessions with our team. Of course verbalising is only useful if you capture the audio (*weeps*).

“Why am I here?” Reiterating the goal

We found that reminding participants during the session of the goal or task we’d set was really valuable. Losing track of the goal could be a sign of high cognitive load, or that the process design isn’t supporting the task as it should, but it may also simply be that we’re dealing with an artificial scenario in an artificial situation.

Onwards and upwards to the next session.

A new beta website for Brighton & Hove City Council

As part of the work we’re doing to transform our digital services, we’re looking at the current Brighton & Hove City Council website.

The site has over 5,000 pages, and it can sometimes be difficult to navigate and to find what you’re looking for. Some of the pages are also too long and not written clearly enough.

Improving the user experience

To improve the experience for our users, we’ve been looking at:

  • the number of pages on the site
  • the sections of the site
  • the navigation around the site
  • the analytics about how people use the site
  • the content that’s on the pages, and
  • the design of the site.

We want to create a new website for the city of Brighton & Hove which is:

  • well written, using plain English
  • task-focused
  • user-focused
  • transactional (allowing users to self serve online)
  • simple to navigate, and
  • accessible on a range of browsers and mobile devices.

[Image: Our new beta website]

As part of this development of a new website, we’ve launched a new ‘beta’ website.

What is a beta website?

GOV.UK state that “The ‘beta’ label means you’re looking at the first version of a new service or web page”.

To create this beta site, we worked with a local agency to conceive a new design.

The design is based on the concept of ‘patterns’. Patterns are repeatable solutions to commonly occurring issues, tasks or functions that can be reused across multiple sites.

Our patterns will be used across a variety of sites and other digital channels, such as the new ‘My Account’ site.
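
To give a flavour of what a pattern can look like in practice, here’s an illustrative sketch of one way to package a pattern as a reusable component. This isn’t our (or our agency’s) actual code, and the element name, class name and message below are made up for the example; it just shows how one piece of markup and behaviour can be defined once and then dropped into any page or site.

```typescript
// Illustrative sketch only: one way to package a reusable pattern as a web component.
// The element name, class name and message here are hypothetical, not the council's real code.
class NotificationBanner extends HTMLElement {
  connectedCallback(): void {
    // Build the pattern's markup once, using the attribute supplied by the page
    const banner = document.createElement("div");
    banner.setAttribute("role", "alert");
    banner.className = "notification-banner";

    const text = document.createElement("p");
    text.textContent = this.getAttribute("message") ?? "";

    banner.appendChild(text);
    this.appendChild(banner);
  }
}

// Registering the element once means the same pattern can be reused on any page or site:
// <notification-banner message="Your Council Tax bill is ready to view"></notification-banner>
customElements.define("notification-banner", NotificationBanner);
```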

The new site is a work in progress, and currently only has limited content.

Over the coming months we will start to create more content on the new site.

Tell us what you think….

We’re always keen to hear what you think, so please either: