Agile Automated Testing


Introduction

Automated testing is seen as a key tenet of successful agile projects. In this document we examine test automation in the context of processes like Scrum, Kanban and Extreme Programming. We look not so much at the ‘how’ but more at the ‘why’ this is important in Agile projects.

Before we look at the different Agile processes it’s important to understand the difference between Agile as a concept and the different methodologies (e.g. Scrum, Kanban and XP). We can think of the Agile concept as a set of values and principles. Those values and principles are set out and defined by the Agile Manifesto. Under this high level set of Agile principles we’ll then find a range of more specific Agile Methodologies. Those methodologies encompass systems like Scrum, Kanban and XP.

We can then think of the methodologies (like Kanban, Scrum and XP) as process frameworks for getting the work done. All of these, to one degree or another, fit with the Agile set of values and principles. From a simplistic point of view you can think of the Agile Manifesto as a container that holds the different Agile Methodologies.

It’s important to understand that this view of the Agile landscape isn’t something that the different Agile camps, or the team that wrote the Agile Manifesto, set out to define. This is our high level view, one that we use to put everything in context and to help you understand the relationships. Quite possibly a view that would be contested by many but, nonetheless, one that will help you understand what is to follow.


The Context

In the sections below we look at the Agile Manifesto and some of the more popular Agile Methodologies. We examine how the Agile Manifesto relates to agile automated testing. Then we take three of these Agile methodologies and examine how each recommends improving release quality through test automation. The Agile methodologies we cover are:

  1. Kanban
  2. Scrum
  3. Extreme Programming (XP)

It’s worth stating that some of these methodologies make more references to automated testing than others. We’ll start out at one end of the spectrum, with Kanban where there is no specific reference to test automation. Then we’ll work towards the other end of the spectrum with XP, where specific references are made.

We should state up front that all of these methodologies are light on practical advice for implementing test automation. They focus more on the desired outcome (e.g. “Delivering working software frequently”). Taking this point into consideration, let’s start out by looking at the Agile Manifesto.

The Agile Manifesto

The Agile Manifesto makes not a single reference to using test automation in agile software development projects. The manifesto was never written with a prescriptive ‘you should do it this way’ approach in mind. As such there’s no formal statement that you should use automated testing. There’s not even a statement that you should test! What there is, however, are some guiding principles that state things like:

“Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.”

And another states:

“Working software is the primary measure of progress.”

The agile software development manifesto itself and the twelve principles make absolutely no reference to automated testing. Yet, as you can see, two of the principles do stress the importance of working software. In an Agile project, automated testing is just ONE solution you could use to help you achieve this goal of delivering working software.

Kanban

The Agile Alliance provides a clear definition of Kanban. If you’re running with Kanban and looking for guidance on Automated Testing then there’s pretty much nothing to go on. Kanban is focused on representing work visually and giving team members visibility of each piece of work at a point in time.

From Lean Manufacturing Tools we see that Kanban is neatly defined as a process that “controls what is produced, in what quantity, and when.” Again this emphasises that Kanban is more about tracking what needs to be done, what’s currently being done and what has been done. One of the Kanban rules does state that defective items should not be passed on to the subsequent process.

From this we can take that the quality emphasis is more around identifying defects as close to the source as possible.

Clearly test automation can help in achieving this but Kanban makes no reference to automated testing as a way in which to identify defects.

It’s not that Kanban states that you shouldn’t implement test automation. It’s not that it doesn’t apply. It’s more that there’s just no reference to test automation in any of the guides or discussions about how to implement Kanban.

Scrum

The November 2017 version of The Scrum Guide describes the roles, events, artefacts and rules for the Scrum methodology. Scrum makes reference to employing “an iterative, incremental approach to optimise predictability and control risk.” In this there are several parts of the methodology that touch on quality and an approach to testing.


Those parts of Scrum that have a bearing on testing, quality, and potentially automated testing, are …

  1. The three pillars that pull out ‘Inspection’ as a key activity needed to detect undesirable variances.
  2. The roles in the Scrum Team where emphasis is given to NOT having dedicated sub-teams (like a QA / Test team) but where individuals may have specialist skills (e.g. a test automation specialist).
  3. The ‘Definition of Done’ where Scrum expects the Team to identify ways to increase product quality.

It could be argued that each of these points could have a bearing on how you approach testing, and how you may decide to employ this QA discipline in a Scrum based project. We discuss each of these three aspects of Scrum below. We also attempt to identify how they may influence your decision to employ this method of ensuring higher levels of quality in your Scrum based project.

Please understand that the following is NOT an overview or guide to Scrum. The following identifies the parts of the Scrum Methodology that touch on quality. We attempt to put this in the context of how automated testing might come into play as part of a focus on improving quality.

The Three Pillars

Scrum identifies three pillars that are significant to the process:

  1. Transparency
  2. Inspection
  3. Adaptation

It is ‘Inspection’ that bears relevance to our point of focus. The inspection pillar makes the point that:

“Scrum users must frequently inspect Scrum artefacts and progress toward a Sprint Goal to detect undesirable variances.”

This does not, in any way, imply that automation is THE solution to detecting undesirable variances. The point is that users must “frequently inspect”. Any method could be used to achieve this. It’s worth mentioning too that Scrum emphasises that inspection should not be so frequent that it gets in the way of the work.

The Scrum Team

The guidance given on how the Scrum Team should be organised identifies the Product Owner, the Development Team and the Scrum Master as the key roles. From our automated testing point of view it is only the Development Team that is of significance to us. The following three characteristics of the Development Team are of interest:

  • Development Teams are cross-functional, with all the skills as a team necessary to create a product Increment;
  • Scrum recognises no titles for Development Team members, regardless of the work being performed by the person;
  • Scrum recognises no sub-teams in the Development Team, regardless of domains that need to be addressed like testing, architecture, operations, or business analysis.

There is no suggestion that there should be dedicated roles/sub-teams like a testing or test automation sub-team. Scrum does state, however, that:

“Individual Development Team members may have specialised skills and areas of focus, but accountability belongs to the Development Team as a whole.”

So you may bring to bear the skills of a dedicated tester and/or test automation engineer in order to improve quality. However, the responsibility to implement and manage any work in this area comes down to the Development Team as a whole.

Definition of Done

Scrum also assigns significant importance to sharing “a common definition of Done”. Everyone in the team is expected to understand what “Done” means. It is expected that this definition of Done provides the criteria used to assess and improve quality.

Whilst defining ‘Done’ is critical at the start of a sprint, it’s the retrospective where things potentially touch on automated testing. The retrospective, after the sprint is complete, should address:

“ways to increase product quality by improving work processes or adapting the definition of ‘Done’”
Clearly, at the discretion of the scrum team, this could identify automation as a way to increase product quality. It’s important to note that The Scrum Methodology is NOT prescriptive about how product quality should be improved. It just emphasises that the scrum team itself should look at ways in which product quality can be improved.

Extreme Programming (XP)

Whilst seemingly not as popular as other methodologies like Scrum and Kanban, XP is a little more detailed when it comes to defining how software QA and testing fits into an agile software development process. This may be because one of the most important tenets of XP is that it “stresses customer satisfaction”.


Customer satisfaction comes from delivering what the customer wants, with a level of quality that meets or exceeds their expectations. That quality level is largely dependent on a QA process that monitors and guides the rest of the process.

Even if you are not running with XP there are significant parts of this methodology that apply to testing and QA that can be used and implemented in other agile methodologies.

There are two parts of XP that touch on the aspects we’re interested in. Firstly, we have a list of 29 simple rules in XP broken down into five categories:

  1. Planning
  2. Managing
  3. Designing
  4. Coding
  5. Testing

We can see here that we have a whole category dedicated to testing. We’ll see later that there are 4 specific ‘simple rules’ in this category too.

Secondly then, we have the XP Process Flow Chart. This identifies ‘Test Scenarios’ and ‘Acceptance Tests’ as dedicated parts of the overall XP agile process. Again with XP we have dedicated parts of the process that address quality and testing.

In the next two sections we look at these simple rules and the relevant parts of the XP Process Flow.

XP’s Simple Testing Rules

In XP there are 4 rules covering testing as part of the XP agile development process. These four rules cover:

  1. Having unit tests
  2. Making sure unit tests pass
  3. Creating tests when bugs are found
  4. Having acceptance tests run frequently

Yes, we can consider unit tests as automated tests. They require no manual intervention to start the execution and generate the results. If developers create unit tests, run them and they pass, then that’s an effective automation win in an Agile project.
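To make this concrete, here is a minimal sketch of such a unit test using Python’s built-in unittest framework. The `calculate_discount` function and its behaviour are hypothetical, purely for illustration; the point is that the test runs and reports with no manual intervention.

```python
import unittest


def calculate_discount(price, percent):
    """Hypothetical production function under test."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


class TestCalculateDiscount(unittest.TestCase):
    """Unit tests: executed automatically, results reported automatically."""

    def test_applies_percentage_discount(self):
        self.assertEqual(calculate_discount(100.0, 20), 80.0)

    def test_rejects_invalid_percentage(self):
        with self.assertRaises(ValueError):
            calculate_discount(100.0, 150)
```

Running `python -m unittest` in the project directory picks the tests up and prints a pass/fail summary, which is exactly the kind of no-intervention execution described above.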

The acceptance testing component is important from our test automation point of view too. XP expects acceptance tests to be created from the user stories. Those tests need to be validated by the customer. The customer is then responsible for making sure the tests confirm that the right functionality and the right quality levels are met. What really stands out though is that XP specifies that:

“Acceptance tests are run often and the score is published.”

From which you could reasonably expect automated testing to be a method in which to achieve this. XP goes one important step further though. XP states that:

“Acceptance tests should be automated so they can be run often.”
Clearly there’s no guidance about how you should approach the implementation. It is, however, the only categorical statement in all three Agile processes (XP, Kanban and Scrum) that tests should be automated, precisely so that they can be run often.
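As a sketch of what an automated acceptance test derived from a user story might look like, consider the hypothetical story “as a shopper I can add items to my basket and see a running total”. The `Basket` class here is an invented stand-in for real application code:

```python
import unittest


class Basket:
    """Hypothetical application code exercised by the acceptance test."""

    def __init__(self):
        self._items = []

    def add(self, name, price):
        self._items.append((name, price))

    def total(self):
        return sum(price for _, price in self._items)


class TestBasketStory(unittest.TestCase):
    """Acceptance test for the story: 'As a shopper I can add items to
    my basket and see a running total.' Automating it means it can be
    run on every build, as XP recommends."""

    def test_running_total_reflects_added_items(self):
        basket = Basket()
        basket.add("book", 12.50)
        basket.add("pen", 1.25)
        self.assertEqual(basket.total(), 13.75)
```

The test encodes the story’s acceptance criterion directly, so the customer can review the scenario while the team gets a test that runs unattended on every build.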

XP Process Flow Chart

The other aspect in XP that’s of interest to us then is the XP process flow chart. Whilst not making any specific references to automated testing this does help put things in context and highlight the relationships within the overall process.

The User Stories should drive the Test Scenarios, and the Test Scenarios in turn drive the Acceptance Tests. And of course, where possible, those Acceptance Tests should then be automated so that they can be run regularly. That regular execution also forms part of the regression test pack for future releases.

The XP Process Flow Chart does highlight one point that could be considered an oversight. This is that we’re either dealing with Unit Tests at the front end of the process or Acceptance Tests at the back end of the process. There’s little expressed for the middle.

Having said that, XP does appear to provide more guidance on the test aspects of agile development than both Scrum and Kanban. Whilst XP doesn’t preclude adding further testing across the whole process, it’s important not to overlook other forms of QA. For example, well before Acceptance Testing takes place it may be wise to implement some form of API automated testing.
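To illustrate the kind of check an API-level test might make, here is a small sketch that validates the shape of a JSON response. The endpoint, field names and sample payload are all hypothetical; in a real suite the raw JSON would come from an HTTP call to the service under test:

```python
import json


def check_user_payload(raw_json):
    """Assert that an API response carries the fields an acceptance
    criterion expects. Field names here are purely illustrative."""
    payload = json.loads(raw_json)
    required = {"id", "name", "email"}
    missing = required - payload.keys()
    if missing:
        raise AssertionError("missing fields: %s" % sorted(missing))
    return payload


# A canned response stands in for the live API in this sketch.
sample = '{"id": 42, "name": "Ada", "email": "ada@example.com"}'
user = check_user_payload(sample)
```

Checks like this run in milliseconds and catch contract breakages in the middle of the process, long before a UI-level acceptance test would surface them.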

Summary

The Agile Manifesto and the different Agile Methodologies don’t so much prescribe “how” you should implement automated testing. It’s more that they explain what the end objective is. For example a key objective is that you are frequently delivering working releases. From this you can infer that automation could be a key component to an Agile project. Without the capability to test fast and assess quality fast you can never be sure that you’re delivering a working release.


When it comes to the specifics, in Scrum for example, it’s stated that the retrospective should examine “ways to increase product quality”. Automation for quality assurance is clearly one possible approach to achieve this. In XP things get a little more prescriptive: it is expected that ‘acceptance tests should be automated so that tests can be run regularly’. Conversely, with Kanban there’s even less to go on. With Kanban you could argue the case for automation if you’re looking to “identify defects as close to the source as possible”. This is hardly a prescription to use automation though. Considering these points we’d argue that the crux of all this is as follows:

It’s the agile objective that’s important. The solution you use to hit that objective is up to you. That solution may or may NOT be automated testing. At the outset it would be wise to consider whether test automation really is the right solution. A bit of RoI analysis upfront might be a sensible step before committing to agile automated testing.