
BENCHmark January 2008

Statistically Speaking...



When the National Lottery was first introduced to the UK a decade ago, many were excited at what seemed like a very real chance of instant riches. Statisticians pointed out that we were 175 times more likely to be murdered in the street than to win the lottery. One even went so far as to point out that you are five times more likely to get knocked down by a bus on your way to get a ticket than you are to win!
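For those who like to see the numbers: the UK draw selects six numbers from 49, so the chance of any one ticket winning the jackpot works out as

\[
P(\text{jackpot}) = \frac{1}{\binom{49}{6}} = \frac{1}{13\,983\,816} \approx 7.2 \times 10^{-8}
\]

which is why even seemingly remote everyday risks can dwarf it.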

Facts such as these seem counter-intuitive, and perhaps they illustrate that we are not very good at understanding uncertainty and risk.

As engineers, we are traditionally taught to find the answer to a specific problem. But we all know that the real world is not deterministic, and that there is a degree of uncertainty associated with material properties, loads, boundary conditions and geometrical dimensions, to name but a few.
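To make this concrete, here is a minimal sketch of propagating input uncertainty through a simple deterministic model: a Monte Carlo simulation of the tip deflection of a cantilever beam, delta = P L³ / (3 E I), with the load P and Young's modulus E sampled from distributions rather than fixed. The distributions, parameter values and the tip_deflection helper are purely illustrative assumptions, not drawn from any particular code or project.

```python
import random
import statistics

def tip_deflection(P, E, L=2.0, I=8.0e-6):
    """Cantilever tip deflection: delta = P * L^3 / (3 * E * I)."""
    return P * L**3 / (3.0 * E * I)

random.seed(42)          # fixed seed so the sketch is repeatable
N = 100_000              # number of Monte Carlo samples

# Treat the load and Young's modulus as random inputs rather than
# single values; the means and scatter here are illustrative only.
samples = [
    tip_deflection(
        P=random.gauss(10_000.0, 1_000.0),  # load [N]: mean 10 kN, 10% st. dev.
        E=random.gauss(200.0e9, 10.0e9),    # modulus [Pa]: mean 200 GPa, 5% st. dev.
    )
    for _ in range(N)
]

print(f"mean deflection: {statistics.fmean(samples) * 1000:.2f} mm")
print(f"st. deviation:   {statistics.stdev(samples) * 1000:.2f} mm")
```

The output of such a study is not a single deflection but a distribution, whose scatter can then be compared against an allowable value with a stated probability of exceedance.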

In the world of engineering analysis, our difficulty in dealing with such complexity has historically been compounded by the restrictions of available computing power, which limited the extent to which statistical approaches could be applied routinely across a wide variety of applications.

Within the NAFEMS community, the recently formed Stochastics Working Group essentially comprises a group of campaigners for a new way of working: to encourage us all to look at the world in a more realistic, and statistical, manner. The lack of available computing power is not as much of a barrier as it once was, and the tools to allow us to exploit statistical techniques are emerging and maturing.

There is an argument to say that we don’t need to change what we do, that everything is fine as it is, and that the various safety factors inherent in most procedures adequately account for uncertainties. But blanket safety factors are blunt instruments: they can make a design unnecessarily conservative in one respect while leaving the true margin in another unquantified, whereas a statistical treatment makes those margins explicit.

Certainly, if we are to embrace these methods, we will need to develop entirely new ways of working, and of thinking. For example, we would need to move away from comparing a single analysis result with a single physical test to see whether they agree, and instead start to think in terms of a "cloud" of points (or a distribution) of results for both test and analysis, as sketched below.
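As a sketch of what that shift might look like in practice (the sample values, the summarise helper and the simple scatter ratio below are illustrative assumptions, not a prescribed procedure), one could summarise each "cloud" as a distribution and ask whether the two distributions are consistent, rather than whether two single numbers match:

```python
import statistics

def summarise(name, sample):
    """Reduce a cloud of results to a distribution summary."""
    mu = statistics.fmean(sample)
    sd = statistics.stdev(sample)
    print(f"{name}: mean = {mu:.1f} MPa, st. dev. = {sd:.1f} MPa")
    return mu, sd

# Illustrative peak stresses [MPa]: repeated physical tests, and
# analyses re-run with sampled (uncertain) inputs.
test_results     = [312.0, 298.5, 305.2, 321.7, 309.9, 301.4]
analysis_results = [307.1, 315.6, 299.8, 310.2, 304.5, 318.3, 302.9]

mu_t, sd_t = summarise("test    ", test_results)
mu_a, sd_a = summarise("analysis", analysis_results)

# Instead of asking "do two single numbers agree?", ask whether the
# difference in means is small relative to the combined scatter.
pooled = (sd_t**2 + sd_a**2) ** 0.5
print(f"mean difference / combined scatter = {abs(mu_t - mu_a) / pooled:.2f}")
```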

If we can successfully adapt in this way, there are further clear advantages: more reliable use of optimisation, and a better job of "designing to cost", both of which are discussed within this issue.

Tim Morris, Chief Executive, NAFEMS
January 2008