https://digitaltransformation.blog.gov.uk/2013/11/26/testing-the-visit-visa-exemplar/

Testing the visit visa exemplar

Categories: Exemplar projects

Until recently I worked at Test, Design and Consultancy Services (TDCS) as the testing lead for the Home Office visit visa application. It was the first project I'd ever done agile testing on.

My colleague Mat wrote about what implementing the service was like from his perspective, and I have to say I wouldn't go back to waterfall.

Share responsibility

Two of us were assigned to work alongside the team during the alpha: myself (a test lead) and John Green, a test engineer who worked on things like automation and scripted tests.

Initially, though, I worked on the project on my own, collaborating closely with the developers in a continuous integration environment. My main role in that team was exploratory testing: whenever a story was ready for testing, I'd deploy it to the test environment and try it out.

Responsibility was shared throughout the team: myself, a product owner, a scrum master, about five developers, a designer and a researcher. Everyone took on a share of the responsibility for testing the service and making sure it met the criteria we'd set for each story or iteration.

Get the most out of specialists

I also worked closely with Leonie and Josh, GDS's accessibility experts. We'd share the application with them whenever we felt we'd made changes significant enough to merit further specialist input.

We learned early on that sharing feedback over email didn't always work: people would interpret the defects, and the possible solutions, differently.

We've changed the approach for the beta: whenever specialists are helping us test the service, the team will work as a group to review and improve the site. The important thing is getting everyone in one place - around a monitor if necessary - so we have a shared understanding of the problem, the context and what the solution might be.

At the same time we've realised how important it is that people work with common tools. We're building up some recommendations internally so that teams at TDCS have access to shared tools, and so that we pick the right things to use when working with external suppliers and can monitor progress on the defects we surface in the service.

No more waterfall

In the old waterfall way of working, you occasionally felt that developers saw you as the enemy - trying to break the things they'd built and leaving them to fix it. Working together and sharing problems as a team felt much more productive.

6 comments

  1. Comment by Jon L

    Hi Kim
    Thanks for the informative article. Are you able to share any common tools around testing and communicating results of said testing please?
    Many thanks


      Comment by Mat Costick

      Hi Jon - We use a variety of open source and low-cost licensing tools to support our exemplars and would happily discuss our findings. Our primary toolset for Continuous Integration on the Visit Visa exemplar is Jenkins (CI) and Maven (build), using Selenium for automation and JMeter for performance testing. In terms of communicating our results, we use the show and tells to display our code and gauge user feedback, and Kanban boards to show where we are within our sprints. Going forward we also plan to use the Atlassian 'Jira Agile' dashboards to provide a wider visualisation of our progress. We use other tools for UX, compatibility and accessibility tests, which I am happy to provide more detail on if you want to email me. The key for us is to use proven, industry-standard, open source tools as much as possible so that we can leverage the wealth of experience already out there.

  2. Comment by Paul

    Interesting read. I would be interested in hearing more about how the team shared the testing - how was this done? Did you have test scripts in place and share those, or rely on everybody creating their own, for example?


      Comment by Mat Costick

      Hi Paul - We are using Test Driven Development and automate test scripts as much as possible so they can be used in our Continuous Integration tool. Testers and developers work closely together all the time. We also use test charters to plan out our structured exploratory tests, which are shared within the team. Finally, we use UX experts to test the output with users and feed back into the team.

  3. Comment by Adam K

    I can see the merits of testing in an Agile way, but in this period of transition from legacy to digital we will more often than not have to find a mixed approach to testing, and an effective way of testing the end-state solution across both Agile and waterfall-developed components. Was there any learning from that which could be shared? I'd be very interested if there is...


      Comment by Mat Costick

      Hi Adam - We are also operating in a mixed delivery mode, with new agile projects running against waterfall legacy systems. It is early days, but our proposed approach is to do as much testing as possible against 'mock services' and to plan in an end-to-end (E2E) test with the legacy providers as soon as the 'waterfall' component is available. We are expecting an iterative waterfall approach from these suppliers and will share our experiences in a future post!

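Mat's first reply above mentions Selenium automation driven by Maven and Jenkins. The post doesn't include any of the project's actual test code, so the following is only a rough sketch of what such a check might look like as a JUnit test; the URL, page title and element id are invented for illustration.

    import org.junit.After;
    import org.junit.Before;
    import org.junit.Test;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.firefox.FirefoxDriver;

    import static org.junit.Assert.assertTrue;

    // Hypothetical example only: a Selenium check written as a JUnit test, of the
    // kind that could be run by Maven and triggered from a Jenkins build. The URL,
    // page title and element id are invented for illustration.
    public class ApplicationStartPageTest {

        private WebDriver driver;

        @Before
        public void setUp() {
            driver = new FirefoxDriver(); // any WebDriver implementation would do
        }

        @Test
        public void startPageOffersApplyButton() {
            // Invented test-environment address, not the real service
            driver.get("https://test.example.gov.uk/visit-visa/start");

            // Check the page loaded and the primary call to action is visible
            assertTrue(driver.getTitle().contains("Apply for a visit visa"));
            assertTrue(driver.findElement(By.id("apply-now")).isDisplayed());
        }

        @After
        public void tearDown() {
            driver.quit();
        }
    }

In a setup like the one Mat describes, a test of this kind would typically be compiled and run by Maven (for example through the Surefire or Failsafe plugin) and triggered from a Jenkins build, though the exact wiring will vary from project to project.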

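Mat's reply to Adam talks about testing against 'mock services' until the legacy 'waterfall' components are available. The post doesn't say how the team built those mocks, so the sketch below is purely illustrative: it uses the JDK's built-in HTTP server to stand in for a hypothetical legacy endpoint, with an invented port, path and payload, just so automated tests have something to call.

    import com.sun.net.httpserver.HttpServer;

    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;

    // Hypothetical stand-in for a legacy endpoint, so that automated tests have
    // something to call before the real 'waterfall' component exists. The port,
    // path and payload are invented for illustration.
    public class MockLegacyVisaService {

        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8089), 0);

            // Always return a canned "application received" response
            server.createContext("/legacy/applications", exchange -> {
                byte[] body = "{\"status\":\"RECEIVED\"}".getBytes(StandardCharsets.UTF_8);
                exchange.getResponseHeaders().add("Content-Type", "application/json");
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream out = exchange.getResponseBody()) {
                    out.write(body);
                }
            });

            server.start();
            System.out.println("Mock legacy visa service listening on port 8089");
        }
    }

A real project might well prefer a dedicated mocking or service-virtualisation tool instead; the point is simply that automated tests can run against a stand-in like this until an end-to-end test with the real legacy supplier can be planned in.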