
Credibility is Everything


(Or How to Get the Most from Verification and Validation…)

5-minute read
Sinothile Baloyi - March 31st 2022


“I have not failed. I've just found 10,000 ways that won't work.”
- Thomas Edison

Sure, and well done for persisting, Edison, but how much did that cost in time and materials? Yikes!

Luckily, we have come a long way since, and nowadays we can simulate and perform those 10,000 iterations within much, much shorter time frames and without the cost of physical testing. Great for the company’s finances and the environment, not to mention the engineers – after all, who really wants to repeat the same physical test 10,000 times?

But – there’s always a but! – how can we trust the results of our simulation? This is where Verification and Validation (V&V) comes in, usually at the end of the simulation process. But what can we do from the outset to make sure that what we end up with is credible?

NAFEMS Americas and Linda Knudsen from Syncroness recently came together to present the webinar ‘Define and Deliver Credible Results’ and shed some more light on the subject. Specifically, Linda focused on the ASME V&V 40’s risk-based credibility assessment framework.

Linda’s aim was to demonstrate the value of establishing requirements for producing credible results before executing the simulation.


To start with, Linda describes the origins of the V&V 40 standard. She explains how an increase in the use of computational modelling and the reporting of results by the medical device community to regulators – primarily the FDA – meant that there needed to be agreement when it came to ways and methods of reporting those results.

Given the potentially significant consequences of decisions made within the medical device community, interested parties from device manufacturers, academic groups, consultants, software developers, and government agencies worked together to develop a standardized framework. Members of the V&V 40 subcommittee brought knowledge from solid mechanics, fluid dynamics, electromagnetics, kinematics, and many other types of physics-based modelling. Today:

‘The FDA recognizes V&V 40 as a consensus standard, it encourages computational modelling and has published guidance for how to report the computational modelling studies, not to impose restrictions but to create a common language for communication.’

While Linda acknowledges that the standard might not be all things to all people (it does have ‘medical devices’ in its title), the V&V 40 subcommittee does consider it general enough for application to other physics-based disciplines. Linda points to the foundational documents of the standard – Sandia’s Predictive Capability Maturity Model (PCMM) [1] and NASA Standard 7009 [2] for models and simulations – as a key reason for this belief:

‘Matrix frameworks were used as the starting point for the standard. The Sandia PCMM [1], for example, has different levels of maturity, but it doesn't link that maturity with how the computational model can be used to support a decision. And then the NASA Standard 7009 [2], which those in the aerospace industry will be very familiar with, prescribes a required level for each V&V activity for each risk level, which is appropriate for their industry. For medical devices, the V&V 40 committee thought that the individual organization should own the responsibility for determining and communicating risk.’

All this is worth taking into consideration before you dismiss the V&V 40 standard as the sole preserve of the medical devices industry.

A summary of the framework


Linda explains that ‘The entire standard is a framework that covers a process of risk-informed credibility assessment, and it is used to establish credibility goals for computational models for a specific context of use and for a specific model risk.’

Linda does stress what the framework is NOT:

  • It is not a step-by-step guide to modelling or VVUQ (Verification, Validation, and Uncertainty Quantification, to give V&V its full and official name)
  • It is not for beginners
  • And it will not specify the rigor required to assess the credibility of your computational model.

What the framework is designed to do is to ‘make practitioners think about the evidence needed and to have a good discussion about it.’ Linda explains that at its core the framework requires that model credibility be established in relation to model risk.

‘Model credibility is the trust in the predictive capability of the computational model for a context of use, established through the collection of evidence from VVUQ activities.’

Key framework steps:

  • Define the question of interest, i.e., what is the reason for the investigation? Defining this at the outset is a key factor in setting the parameters of your investigation and remaining within those boundaries.
  • Define the context of use of the computational model used to address the question of interest. This may involve characterizing or investigating some aspect of technical performance and will define the extent to which the computational model influences the decision, relative to other supporting data.
  • Assess the model risk. This is where you establish the significance of any adverse outcome resulting from an incorrect decision: is it low, medium, or high?
  • Establish level of rigor for each activity. Ensure that there is a sufficient level of rigor for the given model risk.

To illustrate how the framework can be used, Linda uses a practical example - a tibial tray component of an artificial knee implant - to demonstrate how these different steps come together to form the basis for defining and delivering credible simulation results.

The framework allows you, from the outset, to set goals for credibility. What better way is there to ensure success than planning for it? Without credible simulation results, we might as well go back to Edison’s way of doing things, and no one wants that!

Want to see how your own computational model could fit within the framework and the benefits to be had from using such a framework to convey the credibility of your results to stakeholders?

Click here to sign in / sign up and view the webinar.



[1] Sandia National Laboratories, "V&V Credibility: Predictive Capability Maturity Model (PCMM)," [Online]. Available: [Accessed 03 February 2022].

[2] National Aeronautics and Space Administration (NASA), "NASA-STD-7009, NASA Technical Standard: Standard for Models and Simulations," 11 July 2008. [Online]. Available: [Accessed 03 February 2022].