
BENCHmark January 2005

Data Management - the Theory in Practice

The increasingly widespread use of simulation tools such as Finite Element Analysis (FEA) and Computational Fluid Dynamics (CFD) is one of the engineering success stories of the last few decades. All those involved with the technology can – justifiably – claim to have developed products with improved quality, increased safety and/or reduced costs. Yet the most fundamental of questions – how much confidence can be placed in simulation results? – can still cause difficulty.

NAFEMS has always endeavoured to play a leading role in improving the confidence that can be placed in simulation results. In the early years, this focused on developing internationally recognised benchmarks that software developers could use to verify that their algorithms were correctly coded.

In more recent years, although new benchmarks are still being actively developed by the organisation, the emphasis has shifted more towards demonstrating confidence in the entire analysis process. For example, the Knowledge Base article in this issue refers to the SAFESA procedure with which NAFEMS was involved.

However, there are no easy – or complete – answers. It remains a highly topical subject. One of the earliest posts on the recently formed North American discussion group raised precisely this issue, and quickly generated a flurry of replies.

A considerable amount of time at meetings of the FENet project has been spent discussing “fitness for purpose” and related topics. For example, the meeting in Glasgow included a lengthy session on the definitions of Verification and Validation, as well as the responsibilities of both the software developer and the analyst. It was interesting to note that a reasonably representative selection of experts found it difficult even to reach a consensus on defining the terms Verification and Validation. So much so, in fact, that one of the first tasks the newly formed Analysis Management Working Group has set itself is to agree some standard definitions. (Contributions to this debate are welcome in the Analysis Management Working Group section of the website.)

Over the coming years, much time will be spent debating these topics at NAFEMS events, and developing material to help engineers answer the questions raised.

Tim Morris, Chief Operating Officer
January 2005