Gernot Boiger is a professor of Modeling Multiphysics Applications and head of Multiphysics Modeling and Imaging at the Zurich University of Applied Sciences (ZHAW). He teaches and develops multiphysics simulations for applied R&D and leads a team of around 20 researchers. He is also European vice president of the International Society of Multiphysics.
We discussed his upcoming keynote at the NAFEMS World Congress 2021. Gernot gave us an insight into his work and the role that he sees Multiphysics playing in the evolution of engineering simulation.
I'm a professor of modeling multiphysics here at the University of Applied Sciences. I also run a research area called Multiphysics Modeling and Imaging, with 15 to 20 engineers, mathematicians, physicists and natural scientists, and, being a University of Applied Sciences, we do a lot with local Swiss-based, European and international industrial development partners. Those partners come to us with their problems, and we do the best we can to provide high-level, simulation-based process and product development for them. Obviously, most of our application cases revolve around multiphysics: multiphysics analysis, multiphysics prediction, multiphysics simulation.
To return to your question: what brought me to the field of multiphysics? In retrospect, this multiphysics train of thought has probably always been in my genes, even though when I was little I didn't really realize it. I have always been someone who likes to view certain problems, certain events, from different angles, from different viewpoints, if you will. And that's a huge thing, if you ask me, within multiphysics: to view complex technical problems from multiple angles of perspective, and to incorporate not just one single physical aspect of these problems, but really many types of physical aspect.
I have been in this business since starting my Ph.D. back in 2005. At that time, I was already working with a university-based spin-off company that was doing essentially the same as what we do now in our group. Over the years, the kinds of demands being made of multiphysics have changed, and the volume of those demands is increasing dramatically. Back in 2005 it was, for many industrial development partners, the pinnacle, the height of sophistication, to do perhaps a CFD analysis, a simple flow analysis. As technology is progressing at such a fast pace, faster than ever before in human history, we see that the problems those industrial partners now face are getting more and more complex, and thus the demand for more complex, multifaceted analysis grows. And that is just what multiphysics analysis and simulation is all about.
First of all, I think it's important to get expectations right. When we look at our latest multiphysics simulation and analysis tools, as sophisticated as they are, they remain tools. So industrial partners must not have the expectation that there is something out there that, at the push of a button, gives you a 100 percent accurate reflection of the world we live in, with a 100 percent accurate estimation or prediction of what is going to happen with your complex multiphysical process. This is not how it works. A tool, by definition, is something that helps you and other people understand what the problem is all about. So, if you will, it's like wielding a highly sophisticated hammer, and even the best hammer is no good if the person wielding it does not understand how to use it.
Returning specifically to your question: it's extremely important to start out right when you want to use multiphysics analysis tools on a serious and broad basis for your problems. You first need to invest in human resources, or in good external advisors who really know what they are doing. You need to invest in hardware and in software. You need to look at cloud computing options, for instance, if you don't want to invest in local hardware. You need to take all of these things into account. And above all, I think, you need to establish a validated basis for your simulations, not just colourful images, because if you go down that route, you will go round in circles and five years later you will be back where you began.
Having said that, once you go through that process, once you have established that validated basis for a multiphysics analysis of your problems, in the mid- and long term it will pay you back tenfold, if not a hundredfold. This is because of the sheer amount of experimental work you will save and the sheer extension of the borders of your understanding of the technology. While this is hard to put in exact money terms, it will certainly be in the range of a factor of 10 to 100 of what you were previously able to do.
I would say very. For one, we live in the information age; the great age of connectivity is upon us. So, at the tip of a finger, so much information is available to anyone. High-quality information about the field of multiphysics is, at least in part, also available; everyone can do their internet research these days. However, it still needs, and will always need, organizations like NAFEMS or the International Society of Multiphysics to establish that solid core basis of know-how, of standards, of how it is done. Because otherwise, if those organizations did not exist, any field these days would be in danger of drifting away from the main scientific core into something mixed up with opinions and even politics. And I think that is the worst thing that could happen to a scientific field, especially when we are talking about physics.
One hundred percent.
I think they create an outstanding vessel for convincing people of ideas and trends. So, whether you are new to the field or an expert in it, it is interesting for both groups. If you are new to the field, what better opportunity to get a perspective on who the main players are, the bearers of knowledge, know-how and ideas, than to go to one of these events? Within an extremely short amount of time, you get so much more input than if you go out searching, even on the internet. Then there is also the aspect I talked about before: quality standards. If you go to these events, the people who are participating and speaking there know what they are doing.
Only in personal interaction do you get an impression of what these people can do, and perhaps also of where the limits are. Interacting in person, not just online, has always been the best way for humans to exchange information, thoughts and ideas. Online is getting more and more important, but it must also be in person where possible.
Absolutely. One very important aspect of what I do at the University of Applied Sciences, besides all the applied R&D, is teaching: teaching engineering students what multiphysics is all about, or at least giving them a taste of it. This probably goes back to before my time, but certainly in the 10 years I have been doing this I have seen that young students, not only undergraduates but also more advanced students at early Ph.D. level, share a similar problem. That is, they lack an overview of the different fields of multiphysics. Often young Ph.D. students, even if they come from outstanding technical universities, might be highly educated in, for instance, three-dimensional flow analysis, three-dimensional structural analysis, electrodynamics, thermodynamics or chemical thermodynamics, but they might lack the overview of how it all connects together.
So, over the past few years I, or rather we, have come up with a teaching scheme for providing, as simply and as easily as possible, that overview of the commonalities between the different fields comprised within multiphysics. I will be talking about that scheme. It all starts with Gibbs, one of my favourite scientists, the archetypal multiphysicist engineer/scientist, if you will, and his fundamental equation, which sums up all kinds of multiphysics effects in one single energy equation, describing which effects contribute to the global change of energy of any system. From that we have constructed a graphical modeling scheme. Originally meant for 1D dynamic processes, it can also be used to model any multiphysics system, as well as to visualize and understand the connectivity between those systems.
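For readers who want to see the equation being referred to, Gibbs' fundamental equation in its standard textbook (energetic) form, which may differ in detail from the formulation used in Gernot's teaching scheme, reads:

```latex
% Gibbs' fundamental equation, energetic form: each term is a conjugate pair,
% an intensive quantity times the change of its extensive partner.
\mathrm{d}U = T\,\mathrm{d}S \;-\; p\,\mathrm{d}V \;+\; \sum_i \mu_i\,\mathrm{d}N_i \;+\; \dots
```

Each product of an intensive variable (temperature, pressure, chemical potential) with the change of its extensive partner (entropy, volume, particle number) represents one physical contribution to the system's energy change, and further conjugate pairs (electrical, magnetic, surface work, and so on) can be appended in the same pattern. This is what makes the equation a natural starting point for an overview connecting the different fields of multiphysics.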
My hope is that even highly specialized colleagues who are extremely well versed in highly complex, three-dimensional analysis will take something from the presentation, in the sense that it offers a simpler, but also a higher-level, way of looking at things, one that gives you back the broad overview you may have lost sight of over time.
As I have already hinted, the field in general is growing rapidly and the challenges are evolving at an immense rate. The problems you are looking at today are much more complex and multifaceted than they used to be, and we need to adapt to that. We are working on the means to make that possible. One of the things we are currently doing is working with industrial partners from different fields, be it pharmaceuticals, medtech, classical engineering, you name it.
We are also working on technologies that will help us cope with the challenges and the potential of upcoming artificial intelligence. We do not perceive artificial intelligence as something that will replace multiphysics analysis, but there will certainly be a merger of those fields in the next few years. So, who is going to provide all the data it takes to train all those neural networks? Experiments alone can't do the job. So, what about highly validated, highly sophisticated simulation models? Why not use them to train artificial intelligence, for instance?
That's one big thing we're going after. Another big upcoming trend, and I have seen this coming for quite a number of years, comes from meteorology. We can all learn from the meteorologists; they have been thoroughly tested for decades by a very tough audience, basically everyone out there! They are used to thinking in ensembles, i.e. ensemble computing, because they know very well that even the better models rely upon parameters and boundary conditions that do not have 100 percent certainty to them.
So, whenever you do a model-based weather prediction, you do a number, potentially a vast number, of simulation runs, and then you get a spectral overview of all the possibilities. Then you talk about averages, about means, about deviations and about probabilities. What I see in the engineering field is that we have neglected this a bit. People still tend to do one single simulation of a complex process and try to draw all kinds of information and predictions out of that one single simulation, when in reality there is a lot of uncertainty in these predictions, and in our models as well. We should be honest enough to acknowledge that and respond to it. The best response is not to do single individual simulations, but to do, as standard, whole sets of simulations: spectra, angles of perspective and so on.
Now, how are we going to achieve that? Certainly, you do not want your high-level employees sitting in front of your cluster, or even in front of a cloud computing machine, doing simulation after simulation. To me, the key is what we call massive simultaneous cloud computing. That means that right now we are after tools (and have been for several years; I think this is a pretty hot topic) that enable us, on a user-friendly basis, at the push of a button, to conduct dozens or even hundreds of simulation runs, a spectrum of simulation runs, simultaneously in the cloud on as many cloud computers as you wish, and then retrieve statistically relevant information from those runs.
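The ensemble idea described above can be sketched in a few lines. This is a minimal illustration only, not Gernot's actual tooling: the "simulation" is a stand-in analytic function, and the parameter names and distributions are invented for the example. The point is the pattern: sample uncertain inputs, run many independent cases, and report statistics rather than a single answer.

```python
import random
import statistics

def toy_simulation(heat_coeff, inflow_temp):
    # Stand-in for one expensive multiphysics run; in practice this
    # would launch a solver job (e.g. on a cloud machine).
    return inflow_temp * (1.0 - 1.0 / (1.0 + heat_coeff))

def run_ensemble(n_runs, seed=42):
    """Run n_runs cases with uncertain parameters sampled from
    assumed distributions, and return the list of outputs."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_runs):
        h = rng.gauss(2.0, 0.3)    # uncertain heat-transfer parameter
        t = rng.gauss(350.0, 5.0)  # uncertain inflow temperature [K]
        results.append(toy_simulation(h, t))
    return results

outputs = run_ensemble(200)
ordered = sorted(outputs)
mean = statistics.mean(outputs)
stdev = statistics.stdev(outputs)
p05, p95 = ordered[10], ordered[189]  # rough 5% / 95% quantiles of 200 runs
print(f"mean={mean:.1f}, std={stdev:.1f}, 5%-95% band=[{p05:.1f}, {p95:.1f}]")
```

Because each run is independent of the others, the loop body is exactly the part that parallelizes trivially across as many cloud machines as you wish, which is the essence of the "massive simultaneous cloud computing" idea: the user asks one question, the tooling fans out the runs and fans the statistics back in.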