
Failing at Requirements-Based Software Testing?


Often, requirements-based software testing is defined something like this: “a testing approach in which test cases, conditions and data are derived from requirements.” This sort of circular explanation is annoying — and unhelpful.

What are Software Requirements?

Google defines a requirement as “a thing that is needed or wanted.” With this definition in mind, we can apply the concept of ‘requirements testing’ to everyday life. Say you’re holding a dinner party and deciding what to serve. You consider your guest list armed with the knowledge that Ann is gluten-free and Dan is vegan, but you’re not sure about Dan’s partner… you think he may keep kosher? You make menu decisions based on your guests’ requirements and execute your cooking tasks accordingly.

We make decisions every day based on requirements, and software development is no different. In software, the requirements are the ‘what’: the business deliverables that provide value. The design and code are the ‘how’ that delivers them. From those requirements we define tests that confirm business and user needs are met, either by exercising specific software functions or by probing nonfunctional attributes such as reliability or usability. Properly articulated requirements are the starting point for everything: project scoping, cost estimation, scheduling, coding, testing, and release management. Starting a project without a proper analysis of requirements is a recipe for disaster; it’s like building a house without a blueprint.

Requirements-Based Testing

Poorly defined requirements can lead to developing the wrong features, or the right features the wrong way. There’s no denying it feels great to get that code flowing, but taking the time to develop clearly articulated, specifically testable requirements before development begins will pay off.

Why? Studies have consistently shown two things: first, that most defects have their root cause in poorly defined requirements; and second, that an error is cheaper to fix the earlier it is found. Defects caught in the requirements phase via ambiguity reviews cost roughly $25-45 each to fix. However, if a defect discovered in integration or system testing has its roots in poor requirements definition, the cost to fix jumps to roughly $750-3,000 per defect.

A study by HP and IBM estimated the cost of fixing defects discovered in production at an astounding $10,000+ per defect! Poorly defined requirements lead to a nightmare of re-do: the requirements, the design, the code, the tests, the user documentation, and the training materials all have to be reworked. All this “re-do” work can send a project over budget and behind schedule. Taking the time to understand, define, refine, and document requirements throughout the software development life cycle pays off in a big way.
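To see how those figures compound across a project, here is a rough back-of-the-envelope sketch in Python. The defect count and the per-defect costs are illustrative assumptions drawn from the ranges above, not figures taken from the studies themselves.

```python
# Illustrative only: how the same batch of requirements-related defects
# costs more the later it is found. Counts and unit costs are assumptions
# based on the rough ranges cited above.

defects = 50  # hypothetical number of requirements-related defects

cost_in_requirements_review = defects * 35      # ~$25-45 each
cost_in_system_testing      = defects * 1_875   # ~$750-3,000 each
cost_in_production          = defects * 10_000  # ~$10,000+ each

print(f"Caught in requirements review: ${cost_in_requirements_review:,}")
print(f"Caught in system testing:      ${cost_in_system_testing:,}")
print(f"Caught in production:          ${cost_in_production:,}")
# Same 50 defects: roughly $1,750 vs. $93,750 vs. $500,000+
```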

Defining Requirements for Requirements-Based Software Testing

The distinction between ‘what’ and ‘how’ is often a source of push-back against requirements-based testing. The requirements are the ‘what’; the software design is the ‘how’, and that design must meet stakeholder requirements.

Here’s an example scenario:

Your team is tasked with building an eCommerce site. To be successful, the site has many jobs to do. From the business perspective, it needs to generate revenue through sales, secure user data, and process monetary transactions while also updating inventory and coordinating shipping. From the user perspective, the platform should let a user create an account; search for, find, and compare items; place items in a shopping ‘cart’; and finish with a ‘checkout’ stage.

All these tasks should flow seamlessly and feel intuitive. Sure, this is a simplified scenario, but you can see that by defining the ‘what’ (what the site should do) you arrive at the ‘how’ (how it should do it). This is exactly why time spent documenting and testing requirements isn’t paralysis by analysis; it’s forward thinking built into the development process.
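To make that concrete, here is a minimal sketch of how one requirement from this scenario (the shopping cart) could be turned into an automated check. The Cart class and its methods are hypothetical stand-ins, not the API of any real platform; the point is the direct line from a stated requirement to a test with a clear cause and effect.

```python
# Requirement (hypothetical wording): "The site shall offer a shopping cart
# in which a user can place items, and the cart total shall reflect the
# items placed in it."

class Cart:
    """Hypothetical stand-in for the real shopping-cart implementation."""

    def __init__(self):
        self.items = []

    def add_item(self, name, price, quantity=1):
        self.items.append({"name": name, "price": price, "quantity": quantity})

    def total(self):
        return sum(item["price"] * item["quantity"] for item in self.items)


def test_cart_total_reflects_added_items():
    # Cause: the user places two items in the cart.
    cart = Cart()
    cart.add_item("T-shirt", 19.99)
    cart.add_item("Socks", 4.99, quantity=2)
    # Effect: the cart total matches the sum of the items placed.
    assert cart.total() == 19.99 + 4.99 * 2
```

Run it with pytest (or call the test function directly) and the pass/fail result maps straight back to the requirement it was derived from.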

Testing Requirements: Best-Practice Tips

It might feel like paralysis by analysis to do the hard work of defining requirements, so let’s look at three easy maxims to remember when developing requirements.

1. Requirements Definition Begins with Clear Communication:

Poorly defined initial requirements, and a lack of clarity and depth in the original specification document, are frequently the primary cause of requirements creep. Project managers must be able to clearly communicate the requirements of the software product to the developers building it. To do this, clearly defined business objectives for the project need to be in place, from the conceptual phase through delivery.

2. Eliminate Ambiguity in Requirements Documentation:

Write requirements in a consistent style that allows every reader to arrive at the same understanding. Documentation should be explicit and unambiguous, and each requirement should state a logical cause and effect detailing the expected outcome of a specific test action. Bonus tip: well-written requirements can be reused in future projects!
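As a quick illustration of that cause-and-effect wording, here is a small sketch contrasting an ambiguous requirement with a testable rewrite. The requirement text, the two-second threshold, and the submit_order function are all hypothetical examples, not part of any real specification.

```python
# Ambiguous:  "Checkout should be fast."
# Testable:   "When a signed-in user submits a valid order, the system shall
#              confirm the order within 2 seconds."  (threshold is illustrative)

import time


def submit_order(order):
    """Hypothetical stand-in for the real checkout call."""
    time.sleep(0.1)  # simulate processing
    return {"status": "confirmed", "order": order}


def test_valid_order_is_confirmed_within_two_seconds():
    # Cause: a valid order is submitted.
    start = time.monotonic()
    result = submit_order({"items": ["T-shirt"], "payment": "valid-card"})
    elapsed = time.monotonic() - start
    # Effect: the order is confirmed within the stated time limit.
    assert result["status"] == "confirmed"
    assert elapsed < 2.0
```

Two readers can disagree about what “fast” means; they can’t disagree about whether a confirmed order came back in under two seconds.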

3. Best Practice Requirements Design:

Each requirement should measure a specific ‘what’, should play well with other requirements, and (importantly) should not contradict them. Validate requirements against objectives. Without the guidance provided by tracing requirements back to business and project objectives, project teams often rely on their own judgment about whether to include individual requirements, and these ‘out-of-scope’ requirements lead to time and budget overruns.
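One lightweight way to keep that traceability visible is a simple matrix that maps each requirement back to a business objective and forward to its tests. Here is a minimal sketch with made-up requirement IDs, objectives, and test names; anything that does not trace to an objective is a candidate for a scope review.

```python
# Minimal requirements traceability matrix. All IDs, objectives, and test
# names are invented for illustration.

traceability = {
    "REQ-001": {
        "text": "A user can create an account with a verified email address.",
        "objective": "BIZ-01: Grow the registered user base",
        "tests": ["test_account_creation", "test_email_verification"],
    },
    "REQ-002": {
        "text": "Checkout accepts major credit cards and updates inventory.",
        "objective": "BIZ-02: Generate revenue through sales",
        "tests": ["test_checkout_payment", "test_inventory_decrement"],
    },
    "REQ-003": {
        "text": "The cart page shows animated product recommendations.",
        "objective": None,  # traces to no business objective
        "tests": [],
    },
}

# Flag requirements that don't trace back to any business objective.
out_of_scope = [req for req, info in traceability.items() if info["objective"] is None]
print("Review for scope creep:", out_of_scope)  # ['REQ-003']
```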

Requirements-Based Software Testing – The Takeaways

Defining requirements begins in the first phase of software development, which is where the largest portion of defects have their root cause and where correcting errors is least costly.

Well-defined, well-documented requirements can then be communicated clearly, concisely, and consistently across the project team to develop requirements-based tests focused on meeting project objectives (and staying in scope), while crafting a product that delights your users.

Finally, if you find your project constantly struggling with creep or direction shift, it’s a sign that requirements weren’t developed fully or communicated clearly. Step back, define and refine your requirements using the guidelines above, and take that knowledge to your next project.

We can help with requirements-based testing. Schedule a consultation here.


Author:

Gary James, President/CEO

Over the last 35+ years, Gary James, the co-founder and president/CEO of QualityLogic, has built QualityLogic into one of the leading providers of software testing, digital accessibility solutions, QA consulting and training, and smart energy testing.

Gary began his QA journey as an engineer in the 1970s, eventually becoming the director of QA in charge of systems and strategies across the international organization. That trajectory provided the deep knowledge of quality and process development that has served as a critical foundation for building QualityLogic into one of the best software testing companies in the world.

In addition to leading the company’s growth, Gary shares his extensive expertise in Quality Assurance, Software Testing, Digital Accessibility, and Leadership through various content channels. He regularly contributes to blogs, hosts webinars, writes articles, and more on the QualityLogic platforms.