Agile QA Process That Will Save You Time & Effort


It’s crunch time, but there’s a problem. Now your project will be delayed and may even exceed its budget. If only the problem had been found sooner. How many times has a launch been delayed or a project exceeded its budget due to poor quality or missing requirements? Delivering a quality product is a significant part of creating a successful customer experience. In fact, Capgemini’s latest World Quality Report states that companies are allocating 35% of their budgets to meet their quality assurance needs.

These quality assurance tests tend to happen right before launch or at the end of a sprint engagement. But did you know that the longer an issue remains undetected, the longer it will take to resolve? Although testing phases are necessary when crafting an excellent product for your customers, we can find efficiencies to reduce the amount of rework and missing requirements. In this article, we’ll talk about how to produce high-quality products and protect against effort losses due to rework.


Why QA?

From discovery to user experience design and development, we go to great lengths to create a solution that aligns with and enhances a brand’s story and message. Unfortunately, if the quality isn’t there to support the strategic narrative, then we run the risk of delivering an inconsistent message. Customers tend to make assumptions when presented with information that is outside of our control. This includes error and system messaging, broken links or images, and unintended or confusing behaviors.

So, we implement QA methods to create a product that:

  • Provides your customers with an enjoyable and intuitive user experience.
  • Ensures that the message aligns with business and project goals.
  • Protects against increased costs due to system unfamiliarity and lack of communication between the original project team and the supporting team.
  • Reduces the amount of effort needed to repair quality issues after the project has launched. Effort increases once a project is live because the team must first assess the issue and retrace where it started.

The hidden costs of testing

The National Institute of Standards and Technology estimates that repairing issues at the very end of a project can require ten to fifteen times more effort than correcting them during an earlier phase. Missing a few details during the project’s lifecycle is to be expected. But if you find the team addressing functional and visual requirements towards the end of an engagement, then you are wasting precious effort and missing out on many efficiencies.

Missing requirements and functional issues often derail schedules and have a detrimental effect on the overall budget. Usually, insufficient communication and lack of visibility into the feature’s scope are to blame. Fortunately, we can limit these instances through an iterative process and by testing early and often.

Avoiding hidden costs

At eHouse, we leverage the agile process and tailor it to meet our agency needs. Due to the iterative nature of agile, we are able to test throughout the entire engagement. This process also allows us to complete all required features and functionality on time and within budget.

Team composition and communication among team members play a significant role in our success. Our teams typically contain several disciplines, including a designer and a developer, who work closely together at each stage, beginning to end. Members strive to maintain a high level of communication and alignment on business context, project goals, and critical requirements throughout the entire project.

Testing with agility

Constant oversight among all team members throughout the engagement and alignment on “doneness” are critical when producing a quality product. Our agile process promotes a high level of focus on one feature at a time rather than the product as a whole. This allows us to iterate and test each feature with high focus before moving forward. To gain further efficiencies, we create and review each feature in multiple browsers and environments. This allows us to preview each update in real time and assess its effects on quality.

Below are the steps we take to ensure that each item receives this attention:

  • Define mission critical items and behaviors for each feature. Each item is then tested against business context and project goals.
  • Create a functional prototype and minimal viable product. Here we test for user experience and functional precision between multiple devices.
  • Add visual and UI enhancements such as typography, iconography, color and imagery. To ensure cross-browser compatibility, we then test for usability and visual precision between devices and browsers.
  • Perform systems integration if needed. This includes content management systems and customer relationship management systems or any other solutions. We then test for functional precision, reliability, and performance.
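
As a rough sketch, the first step above can be captured as data. The following hypothetical example (the feature names and critical items are invented for illustration, not taken from a real project) lists mission-critical items per feature and flags any item that does not yet have a corresponding test:

```javascript
// Hypothetical mission-critical items per feature (invented examples).
const criticalItems = {
  header: ["logo links to home", "contact information is visible"],
  contactForm: ["required fields are validated", "submission shows confirmation"]
};

// Items that the team has already covered with a test (also invented).
const testedItems = new Set([
  "logo links to home",
  "contact information is visible",
  "required fields are validated"
]);

// List every critical item that has no corresponding test yet.
function untestedCriticalItems(items, tested) {
  return Object.values(items).flat().filter(item => !tested.has(item));
}

console.log(untestedCriticalItems(criticalItems, testedItems));
// → ["submission shows confirmation"]
```

Keeping the list as data makes the gap visible at a glance, so a missing requirement surfaces during planning rather than during final QA.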


How time affects the level of repair effort

To illustrate how time affects effort, here are several issues that we’ve encountered in previous projects. These are grouped to correspond with the four steps above. Each section contains a brief description of the issue and effort estimates based on how long the issue was left unresolved.

Below is a list of terms used in these examples to clarify meaning:

  • Age – The duration between when an issue is created and when it is detected
  • Actions – An overview of the tasks performed to resolve the issue
  • Microtask – A smaller action that is not directly part of the solution itself, such as opening and navigating files

Define mission critical items

Issue: Missing contact information in a prominent section (header or footer).

Scenario 1: Reported when testing against business goals and project requirements

  • Age – New
  • Actions – Add contact information to critical items list
  • Effort – 1 – 2 minutes

Scenario 2: Reported when testing the functional prototype

  • Age – 24 to 72 hours
  • Actions – Update wireframes/greyboxes
  • Effort – 5 to 10 minutes

Scenario 3: Reported when adding or testing UI enhancements or systems integration

  • Age – 2 weeks
  • Actions – Redesign and develop section to accommodate new information
  • Effort – 45 minutes to 1 hour

Finding an issue as soon as possible is the goal. By addressing the issue during the planning phase, we avoid effort loss and maximize the use of time.

Functional prototyping

Issue: Dropdown navigation does not work on mobile devices

Scenario 1: Reported when testing the functional prototype

  • Age – New, found during the current phase
  • Actions – Make repair while retaining high focus
  • Effort – 5 to 10 minutes.

Scenario 2: Reported when adding or testing UI enhancements

  • Age – 24 to 72 hours
  • Actions – Make repair while retaining lower focus. Introduction of additional microtasks.
  • Effort – 30 to 45 minutes

Scenario 3: Reported when testing systems integration

  • Age – 2 weeks
  • Actions – Reorient and inspect to gain understanding before making the appropriate change
  • Effort – 1 to 1.5 hours

This scenario benefits from our agile testing process and tools like BrowserSync. By developing and previewing changes on multiple devices in real time, we catch functional errors like this as they are introduced. Rather than waiting until the entire section is complete to test, we can address the issue without the added effort of debugging and reorientation.
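
As an illustration, a minimal Browsersync configuration along these lines can serve a local build and push changes to every connected browser and device; the directory paths below are hypothetical examples, not our actual setup, and the snippet assumes the `browser-sync` npm package is installed:

```javascript
// bs-config.js – minimal Browsersync setup (paths are hypothetical examples)
const browserSync = require("browser-sync").create();

browserSync.init({
  server: "./dist",                                  // serve the local build directory
  files: ["dist/**/*.html", "dist/**/*.css", "dist/**/*.js"], // reload on change
  open: "external",                                  // expose a LAN URL for phones and tablets
  notify: false                                      // hide the in-page notification
});
```

With the external URL open on a phone and a laptop at once, a broken mobile dropdown shows up the moment it is introduced instead of weeks later.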

Visual / UI Enhancements

Issue: Phone number in header is not displaying in Internet Explorer 11.

Scenario 1: Reported when adding or testing visual / UI enhancements

  • Age – New
  • Actions – Make repair while retaining high focus
  • Effort – 2 to 5 minutes

Scenario 2: Reported when testing systems integration

  • Age – 24 to 72 hours
  • Actions – Make repair while retaining lower focus. Introduction of additional microtasks.
  • Effort – 15 to 20 minutes

Scenario 3: Reported when performing a final quality assurance check

  • Age – 2 weeks
  • Actions – Reorient and inspect to gain understanding before making the appropriate change. Additional microtasks to complete and more factors to consider.
  • Effort – 30 to 45 minutes

This issue was related to a font-feature-settings declaration introduced when the font styles were added. Since other systems had not been introduced yet, there were fewer factors to take into consideration during the repair.

System Integration

Issue: Contact form field validation does not work

Scenario 1: Reported when testing systems integration

  • Age – New
  • Actions – Make the appropriate change and rename the field id/name
  • Effort – 3 to 5 minutes, including a retest

Scenario 2: Reported when performing a final quality assurance check

  • Age – 24 to 72 hours
  • Actions – Reorient and inspect, perform microtasks (navigate to the right file and section), then make the appropriate change and retest
  • Effort – 10 to 15 minutes

Scenario 3: Reported after the update has been added to live environment

  • Age – 2 weeks
  • Actions – Reorient and inspect with additional factors to debug, then make the appropriate change
  • Effort – 20 to 30 minutes

By making the repair right after integrating the form, we spend roughly a third of the effort that the same repair requires when caught during the final QA check. This is a perfect example of how this process saves time and effort while also preventing losses to customer experience and increases in operating costs.
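
To make the failure mode concrete, here is a hypothetical sketch (the field names and validation rules are invented for illustration, not the project’s actual code) of how a renamed field id/name can silently break validation that looks fields up by name:

```javascript
// The validator looks fields up by name, so a renamed field is silently skipped.
function validateContactForm(fields) {
  const errors = [];
  // The validator expects a field named "email"; if the form renames it
  // (e.g. to "contact_email"), this check never sees the user's input.
  const email = fields["email"];
  if (email === undefined || !/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(email)) {
    errors.push("A valid email address is required.");
  }
  const message = fields["message"];
  if (message === undefined || message.trim() === "") {
    errors.push("A message is required.");
  }
  return errors;
}

// Renamed field: validation reports a missing email even though one was entered.
console.log(validateContactForm({ contact_email: "a@b.com", message: "Hi" }).length); // 1
// Matching names: validation passes.
console.log(validateContactForm({ email: "a@b.com", message: "Hi" }).length); // 0
```

Testing the form right after integration catches the name mismatch immediately; found in the live environment, the same one-line rename first requires reorientation and debugging.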

Finding efficiencies

Based on these examples, you can see how iterating and testing products from the beginning allows us to save time in the end. Cross-checking a product against project goals and business context while performing high focus testing in phases will ensure that issue lifespans are limited.

Discovering these efficiencies is also an ongoing affair. As tools and needs change, so will the processes. This makes the ability to adapt and use resources effectively all the more crucial. An iterative, proactive process will save everyone time and effort in the long run.

So remember to test early and often!

This article was originally posted on the eHouse Studio Blog.