Over the years V&V has triggered numerous discussions among testers. What does V&V mean? Which ‘V’ is which?
Let’s start by demystifying this abbreviation. V&V stands for verification and validation. That’s already reason enough to have doubts, no? Two words that seem to mean the same thing, or at least that are used as synonyms in conversation.
Yet, for the testing community these two words mean different things. In this article I will first use a number of external frameworks and standards to define both verification and validation.
After that I will show, based on my experience, how companies have implemented (or could implement) V&V in their organization.
Let’s start with the definitions.
Definition of Verification
The man of science has learned to believe in justification, not by faith, but by verification.
Thomas Henry Huxley
The purpose of Verification (VER) is to ensure that selected work products meet their specified requirements. [CMMI]
Confirmation that work products properly reflect the requirements specified for them. In other words, verification ensures that “you built it right.”
- Prepare for verification
- Perform peer reviews
- Verify selected work products
Confirmation by examination and through provision of objective evidence that specified requirements have been fulfilled. [ISO 9000]
(A) The process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase.
(B) The process of providing objective evidence that the software and its associated products conform to requirements (e.g., for correctness, completeness, consistency, accuracy) for all life cycle activities during each life cycle process (acquisition, supply, development, operation, and maintenance); satisfy standards, practices, and conventions during life cycle processes; and successfully complete each life cycle activity and satisfy all the criteria for initiating succeeding life cycle activities (e.g., building the software correctly).
Verification answers the question: “Did we build it right?”
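To make this concrete: verification checks a work product against its written requirement. A minimal sketch, in which the discount rule and all names are hypothetical:

```python
# Hypothetical specified requirement:
# "Orders of 100.00 or more receive a 10% discount."

def apply_discount(order_total: float) -> float:
    """The work product under verification."""
    if order_total >= 100.0:
        return round(order_total * 0.9, 2)
    return order_total

def verify_discount_requirement() -> bool:
    """Verification: does the build conform to the written requirement?"""
    return all([
        apply_discount(100.00) == 90.00,   # boundary: discount applies at exactly 100
        apply_discount(250.00) == 225.00,  # discount above the threshold
        apply_discount(99.99) == 99.99,    # no discount below the threshold
    ])

print(verify_discount_requirement())  # True: the product matches its specification
```

Note that a passing check says nothing about whether a 10% discount was what the business actually needed; that question belongs to validation.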
Definition of Validation
After we’re through with all of the enhancements, I want my clients to say, ‘Wow. Have you seen this?’ … That will be a validation for me.
The purpose of Validation (VAL) is to demonstrate that a product or product component fulfills its intended use when placed in its intended environment. [CMMI]
Confirmation that the product, as provided (or as it will be provided), will fulfill its intended use. In other words, validation ensures that “you built the right thing.”
- Prepare for validation
- Validate product or product components
(For an explanation of the difference between work product, product and product component, I refer you to the CMMI glossary.)
Confirmation by examination and through provision of objective evidence that the requirements for a specific intended use or application have been fulfilled. [ISO 9000]
(A) The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements.
(B) The process of providing evidence that the software and its associated products satisfy system requirements allocated to software at the end of each life cycle activity, solve the right problem (e.g., correctly model physical laws, implement business rules, use the proper system assumptions), and satisfy intended use and user needs.
Validation answers the question: “Did we build the right thing?”
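The difference shows up when a product that passes verification still fails validation. A small sketch, with a sorting requirement and data invented purely for illustration:

```python
# Hypothetical spec: "sort customer names alphabetically".
# The team builds a plain sort -- exactly what the spec says.

def sort_names(names):
    return sorted(names)

# Verification: against the spec's own example, the product is built right.
assert sort_names(["alice", "bob", "carol"]) == ["alice", "bob", "carol"]

# Validation: place the product in its intended environment (real user data).
real_data = ["alice", "Bob", "carol"]
result = sort_names(real_data)
print(result)  # ['Bob', 'alice', 'carol'] -- uppercase sorts first in ASCII

# The user intended case-insensitive order; the spec was silent on case.
built_the_right_thing = result == sorted(real_data, key=str.lower)
print(built_the_right_thing)  # False: built right, but not the right thing
```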
How to implement V&V?
On LinkedIn you will find lively discussions on how to implement V&V. If I combine these discussions with what I have encountered in my career, I come up with six models that explain how to implement V&V.
I will explain these models using the V-model. You could say ‘the V&V for V’.
These models are:
- Validation = user acceptance
- Validation = testing GO/NOGO
- Validation = requirements GO/NOGO
- Validation = each lifecycle GO/NOGO
- Validation = testing
- Validation = testing + requirements GO/NOGO
The meaning of the individual blocks in the V model isn’t important for the discussion about V&V.
What is important to know is that the left leg of the V is all about requirements, specifications and reviewing (static testing); it is about the software system in its virtual stage.
The right leg is all about (dynamic) testing of the real, existing software system, and the bottom of the V is the build (coding and unit/component testing) of the software system.
A. Validation = User acceptance
At the beginning of the project the business/customer/user defines their needs, and from that moment on a development team takes over.
During all the levels in the V-model, the development team (for ease of explanation I include the test team here) verifies what it is building against these needs.
Once they are convinced that the product is ready for use, they will start a validation phase called User acceptance.
User acceptance is done by the customer/business/user with the support of the development team. In some cases, the user acceptance tests can even be carried out by the development team itself. User acceptance testing is documented with proof of execution and summarized in a test summary report.
Based on the test summary report and/or the personal experience of the user acceptance testers, a final decision is made whether the product is what the user/customer/business expected, or not.
This model is a direct consequence of what testing is considered to be in a waterfall methodology. After the V-model was defined as an enhancement of the waterfall model, the (final) validation remained the final phase, which of course has certain disadvantages. If a customer only sees the final product this late in the development life cycle, there is a considerable chance that the product is not (completely) what the customer expected. To mitigate this, some companies use the next model.
B. Validation = testing GO/NOGO
Instead of waiting until the last moment to validate the product, gateways are built in during the testing phases. The most common approach is to define entry and exit criteria for the different test levels. These criteria are discussed upfront and agreed on by the customer. If the organization is coming from model A, these criteria will at first be based purely on tests performed by the development team.
However, with increasing maturity, these gateways/criteria will become more than an assessment of measurements made by the development team. Intermediate steps towards what these gateways should finally become include guided demos by the development team for the customer and the integration of a customer team into the development team.
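Such a GO/NOGO gateway can be sketched as exit criteria agreed upfront with the customer and evaluated against measurements from a test level. The metric names and thresholds below are illustrative assumptions, not a standard:

```python
# Exit criteria agreed with the customer before testing starts (illustrative values).
EXIT_CRITERIA = {
    "min_pass_rate": 0.95,             # at least 95% of test cases passed
    "max_open_blockers": 0,            # no open blocking defects
    "min_requirements_covered": 0.90,  # at least 90% of requirements exercised
}

def gateway_decision(metrics: dict) -> str:
    """Return 'GO' when all agreed exit criteria are met, else 'NOGO'."""
    ok = (
        metrics["pass_rate"] >= EXIT_CRITERIA["min_pass_rate"]
        and metrics["open_blockers"] <= EXIT_CRITERIA["max_open_blockers"]
        and metrics["requirements_covered"] >= EXIT_CRITERIA["min_requirements_covered"]
    )
    return "GO" if ok else "NOGO"

system_test = {"pass_rate": 0.97, "open_blockers": 0, "requirements_covered": 0.92}
print(gateway_decision(system_test))  # GO

acceptance = {"pass_rate": 0.98, "open_blockers": 1, "requirements_covered": 0.95}
print(gateway_decision(acceptance))   # NOGO: one blocking defect still open
```

The point of the sketch is that the decision rule itself is visible to, and approved by, the customer, rather than being an internal judgment of the development team.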
Although better than model A, the validation still comes late in the development life cycle. On average, testing starts when two thirds of the time to market has already passed, which means that validation is done in the last third of the development life cycle.
C. Validation = requirements GO/NOGO
To start validation as early as possible, some companies believe that validating the requirements is enough, and much more efficient. Of course, building in formal approval gateways by the customer during the requirements phases is a good thing, but it remains a huge risk not to validate the product once it is implemented.
Companies that are mature enough counter this by using test-driven development and/or prototyping. A lot can be said for this approach, and certainly in an Agile context this setup can work.
Furthermore, if an incremental methodology is used, the result of the previous cycle can be validated (after the fact) and corrected or changed if needed, making validation during the test phases less necessary. In an Agile approach the test phases should become very light anyway.
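As an illustration of how test-driven development pulls validation forward: the approved requirement is encoded as a test before any code exists, so the customer can review the expected behaviour up front. The VAT rule below is an invented example:

```python
# Hypothetical requirement agreed with the customer:
# "a gross price is the net price plus 21% VAT, rounded to 2 decimals."

# Step 1 (test first): the test encodes the approved requirement.
def test_gross_price():
    assert gross_price(100.00) == 121.00
    assert gross_price(19.99) == 24.19

# Step 2: implement just enough to make the test pass.
def gross_price(net: float) -> float:
    return round(net * 1.21, 2)

test_gross_price()
print("requirement satisfied")
```

Because the test exists before the implementation, reviewing the test with the customer is itself an early validation step.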
D. Validation = each lifecycle GO/NOGO
At the very least, the validation takes the form of guided demos for the customer; at the most, the customer is involved in all the test levels.
Although this approach covers the complete development life cycle and lets validation start as early as possible, two questions have to be asked: how many companies can afford to involve the customer from the beginning of the development life cycle, and how many customers are willing to be involved during the whole of it?
E. Validation = testing
I used this approach in a series of enhancement projects: small projects in terms of impacted functionality, yet with a time to market of several months. At that company we had up to four concurrent projects for the same customer, with the involvement of several other departments. In these projects we didn’t have a dedicated test team. I was project manager as well as test manager, and 80% of my time went to project management.
In the remaining 20% of my time I trained the customer’s people to prepare and execute the tests. Because they were the customer’s people, the preparation was very basic. During test execution they would deviate from what was prepared and execute additional tests based on their experience (exploratory testing). My task was to make sure they didn’t forget to describe these tests.
In the end, I can’t say that we did verification: 90% of the tests executed were based on what the customer’s people expected to see, and that, in my book, is validation.
Some of these projects were of a technical nature: upgrades of web services, upgrades of hardware, etc. Things in which the customer’s people were not very interested, but which they nevertheless considered important. For these projects their involvement during the requirements phase was minimal (or non-existent), resulting in model E.
F. Validation = testing + requirements GO/NOGO
Here, instead of writing test scenarios and test cases, the customer’s people would write ‘testable’ requirements: requirements written from the point of view of a tester.
The danger in this was that, once they were in a testing mindset, the requirements would become more and more like pseudo code. This benefited the development team, but decreased the readability.
Because the customer was involved during the whole development life cycle, the latter didn’t cause a problem.
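A sketch of what such a ‘testable’ requirement can look like, with an invented account-lockout rule:

```python
# A readable, testable requirement (invented for illustration):
#   "Given a user with a valid password,
#    when the user enters a wrong password five times in a row,
#    then the account is locked."

def is_locked(failed_attempts: int, limit: int = 5) -> bool:
    """Hypothetical implementation of the lockout rule."""
    return failed_attempts >= limit

# Because the requirement names concrete conditions, it translates
# directly into checks:
assert is_locked(5) is True
assert is_locked(4) is False
print("requirement is testable as written")

# The drift the text warns about: the same requirement rewritten as
# pseudo code ("IF attempts >= 5 THEN state := LOCKED") helps developers,
# but is much harder for a business reader to review.
```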
In the end I have to conclude that there is no single way to implement V&V, and I see no reason to endorse or reject any of the above models in particular.
What I do see is that the involvement of the customer/user/business is imperative. Without them there is no formal validation of the product, although their involvement can be restricted to the bare minimum.