When a group of people comes together for a new piece of work, you never know how they’ll gel. It’s therefore to the great credit of the team behind the China Visit Visa exemplar that they have formed a tight agile unit to help the service pass its GDS live assessment.
It’s been a long road to get the service to live, and we’ve had to negotiate various bumps along the way, but here’s how we got our exemplar over the line.
It has been a challenging few months for the whole Visa & Immigration Digital team. Last summer, the China Visit Visa exemplar moved into public beta but it was clear there was a lot of work to be done – not only on the service itself, but on the make-up of the team. Frankly, there were a few holes that needed patching up if this boat was going to stay afloat as far out of port as China.
Key to our success at this stage was an improvement in our working relationship with policy. We’ve now built such a collaborative relationship with subject matter experts that it’s impossible to see how a project like this could be run any other way.
In fact it’s turned into a three-way relationship between the Visa & Immigration Digital team, policy and subject matter experts, and the GDS content team who look after the visit visa content on GOV.UK. We now feel we’ve got a great way of working together to make the changes we need on each platform and ensure the best experience for users.
Using other exemplars
One of the strengths we knew we could call upon for the assessment was our mining of other exemplar digital services for design patterns and content styles. The Register to vote, Carer’s Allowance and Lasting power of attorney exemplars in particular provided great insight into how a service should look and feel while trying to provide vastly different services to users.
And each of the three is already live, so we figured we’d be on safe ground if we examined them closely to see where the GDS assessment panel had praised their work.
Setting out our assessment strategy
We knew from the off that an assessment should be no more than the successful culmination of a project done well. Nonetheless, it wouldn’t have been much use going into the assessment without a very clear strategy: to demonstrate to the panel that we’d started with user needs, put plans into effect with evidence to back them up, then returned to those original user needs to judge their success.
We knew we had the research and data to back up what we planned to say to the panel – such as 60,000 Chinese users successfully completing the service during public beta, with far higher user satisfaction than the journey on the alternative platform, known as Visa4UK.
But we knew it would all fall apart without a proper presentation. Four of us were chosen to attend the assessment: our scrum master would lead the demo, our user researcher would provide the bulk of our evidence, one of our best developers would take care of technical questions and I myself, as Digital Service Manager, would aim to kick it all off with a powerful introduction.
We did at least one practice run a day in front of all kinds of different interested parties – and we can admit now that not all of these run-throughs went well. Thankfully we learned something from even our less impressive attempts, and eventually got to an assessment strategy we were all happy to do our part in.
It’s important to stress that the four members of our assessment team could never have reached a winning formula for the assessment without the constant backing and support of the rest of the team. We felt like we had the whole of Visa & Immigration Digital coming in with us.
Finally, the assessment
Though I was tasked with presenting the introduction, our opening gambit in fact came from the mind of our user researcher. The GDS assessment panel entered the room to find themselves surrounded by vast swathes of printouts packed with post-its – it would be fair to say our user researcher is very visually minded! The whole user journey was on the walls, complete with pain points identified and improvements in progress or already completed, plus personas and all sorts of other research.
This helped set the tone – knowing they weren’t simply going to sit there passively listening to a long-winded justification of a service, but would instead be led through it by clear (and colourful) user research, must have allayed any fears among the assessors of a tedious afternoon ahead.
Following a short introduction we were led into the main thrust of the assessment – the demo of the online service, skilfully marshalled by the team’s scrum master. Questions from our assessors were welcomed throughout, and the demo was followed by a robust Q&A. We’d honed our answers in numerous run-throughs over the preceding fortnight, but they still managed to challenge us.
I won’t deny it took a trip to the pub across the road to calm the nerves after our grilling, but we were confident we’d done the best possible job of justifying the decisions made so far and the actions we were still planning to take to ensure the China-focused service can expand to include all visit visas worldwide. And it came as superb news to everyone who has worked on the project when, a few days later, we got the result we’d all hoped for.
We didn’t need the pub this time, but we went anyway!