This paper was produced for the 2019 NAFEMS World Congress in Québec, Canada.
Science-based simulation tools, where physical laws are explicitly coded and solved using numerical techniques such as finite element analysis, are widely used. Recent advances in artificial intelligence and machine learning algorithms, together with the emergence of “big data”, have led to data-driven models that do not necessarily require domain knowledge. Unfortunately, these “theory-agnostic black-box data science” models have achieved limited success in engineering applications. Theory-guided machine learning (TGML) is an emerging approach that integrates domain knowledge with machine learning. To introduce physical consistency into model training, this approach incorporates the physics of the problem through proxies such as physics-based features, model architecture, activation functions, loss functions, and constrained response surfaces, achieving higher-fidelity models while requiring less training data.
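One of the proxies listed above, the loss function, can be illustrated with a minimal sketch. The example below is not from the paper; it assumes a hypothetical system governed by a first-order decay law (Newton's law of cooling, dy/dt = -k·y) and adds a penalty for predictions that violate that law, so the model is steered toward physically consistent behaviour even where data is sparse or noisy:

```python
import numpy as np

def physics_guided_loss(y_pred, y_obs, t, k=0.5, lam=1.0):
    """Illustrative composite loss: data misfit plus a penalty on the
    residual of an assumed physical law, dy/dt = -k * y.
    `k` and `lam` (the physics weight) are hypothetical parameters."""
    data_term = np.mean((y_pred - y_obs) ** 2)
    # Finite-difference estimate of dy/dt along the predicted curve
    dydt = np.gradient(y_pred, t)
    # Residual of the governing equation; zero for a physically
    # consistent prediction
    physics_term = np.mean((dydt + k * y_pred) ** 2)
    return data_term + lam * physics_term

t = np.linspace(0.0, 5.0, 100)
y_obs = np.exp(-0.5 * t)                 # observations obeying the law
loss_good = physics_guided_loss(y_obs, y_obs, t)      # consistent
loss_bad = physics_guided_loss(np.ones_like(t), y_obs, t)  # violates decay
```

A prediction that satisfies the governing equation incurs almost no physics penalty, while a physics-violating one is penalized regardless of how well it happens to fit the observations.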
Composites process modelling can be a computationally expensive task. Typical process simulations are multi-scale and multi-physics in nature and must capture real-world processes that occur over many hours. For example, a typical process simulation of the curing of a composite part might involve a thermal analysis of the part, its bagging material, its tooling, and the tooling support structure in an oven or autoclave. After the thermal simulation, which calculates the temperature and material properties everywhere, a saturated flow analysis would be performed, capturing the resin flow and fibre movement during the curing process. Then, building on the thermal and flow analyses, a mechanical simulation would be performed to determine the ultimate part deformation (spring-in) and residual stress state. Serially coupled analyses of parts at an industrially relevant scale may take several days or more to complete, even on HPC clusters. This computational expense severely limits the ability of engineers to run multiple simulations to explore the design space and optimize process parameters.
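The serial coupling described above can be sketched as a data-flow pipeline. The stage functions and fields below are placeholders, not any actual solver API; the point is only that each stage consumes the outputs of the previous one, so the chain cannot be parallelized across stages:

```python
from dataclasses import dataclass

@dataclass
class ThermalResult:
    temperature: float      # representative part temperature [deg C]
    degree_of_cure: float   # resin cure state, 0..1

@dataclass
class FlowResult:
    fibre_volume_fraction: float  # after resin flow / compaction

def thermal_analysis(cycle_peak_temp: float) -> ThermalResult:
    # Placeholder: a real solver integrates heat conduction + cure kinetics
    return ThermalResult(temperature=cycle_peak_temp,
                         degree_of_cure=min(1.0, cycle_peak_temp / 180.0))

def flow_analysis(thermal: ThermalResult) -> FlowResult:
    # Placeholder: resin viscosity, hence flow, depends on the thermal state
    return FlowResult(fibre_volume_fraction=0.55 + 0.05 * thermal.degree_of_cure)

def mechanical_analysis(thermal: ThermalResult, flow: FlowResult) -> float:
    # Placeholder: spring-in driven by cure shrinkage and fibre fraction
    return 0.5 * thermal.degree_of_cure * flow.fibre_volume_fraction

def process_chain(cycle_peak_temp: float) -> float:
    """Thermal -> flow -> mechanical, each stage feeding the next."""
    thermal = thermal_analysis(cycle_peak_temp)
    flow = flow_analysis(thermal)
    return mechanical_analysis(thermal, flow)

spring_in = process_chain(180.0)
```

In a real workflow each stage is itself an hours-long finite element solve, which is what motivates replacing parts of the chain with fast surrogates.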
TGML models can be trained to simulate simplified process models. For example, the manufacture of a single-layer composite part can be represented by 8 input parameters. In the case of thermal analyses, process key performance indicators (such as maximum exotherm temperature, minimum viscosity, etc.) may be identified. Using TGML, it is demonstrated that a small, manageable number of training simulations is sufficient to train a model that accurately captures the process response.
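The surrogate-training setup can be sketched as follows. Everything here is synthetic and hypothetical: the "simulation" is a stand-in smooth function of 8 normalized process parameters, and the surrogate is a plain least-squares fit (a TGML model would add physics-based features or constraints on top of such a baseline):

```python
import numpy as np

rng = np.random.default_rng(0)
N_PARAMS = 8  # e.g. ply thickness, tool thickness, heat-up rate, hold temp, ...

def run_process_simulation(x):
    """Stand-in for an expensive FE thermal simulation: maps 8 normalized
    process parameters to a KPI such as maximum exotherm temperature [deg C].
    Purely synthetic -- a smooth function with mild nonlinearity."""
    return (180.0 + 20.0 * x @ np.arange(1, N_PARAMS + 1) / N_PARAMS
            + 5.0 * np.sin(np.pi * x[0]))

# A small, manageable training set: a few dozen "simulations"
X_train = rng.uniform(0.0, 1.0, size=(40, N_PARAMS))
y_train = np.array([run_process_simulation(x) for x in X_train])

# Fit a simple linear surrogate (intercept appended) by least squares
A = np.hstack([X_train, np.ones((len(X_train), 1))])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

def surrogate(x):
    """Near-instant KPI prediction in place of an hours-long simulation."""
    return np.append(x, 1.0) @ coef

# Check the surrogate against held-out "simulations"
X_test = rng.uniform(0.0, 1.0, size=(100, N_PARAMS))
errors = [abs(surrogate(x) - run_process_simulation(x)) for x in X_test]
mae = sum(errors) / len(errors)
```

With only 40 training runs, the surrogate tracks the synthetic KPI to within a few degrees; the residual comes from the nonlinearity the linear model cannot represent, which is exactly where physics-based features help.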
With TGML models that simulate aspects of the full composites processing simulation chain, numerically efficient tools can be used to evaluate designs for manufacturability, even at the preliminary design stage. These tools exploit the speed and accuracy of the TGML models to perform many analyses that fully explore the response of a given part design across typical process parameters, in order to assess the likelihood that the part, as designed, can be manufactured within the required specifications. It is demonstrated that the speed-up achieved by using TGML models rather than finite element simulations makes such tools practical for industrial use.
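The manufacturability assessment described above amounts to propagating typical process variability through a fast surrogate. The sketch below assumes a hypothetical trained surrogate for maximum exotherm temperature, hypothetical nominal cycle parameters and scatter, and a hypothetical spec limit; the pattern (sample, evaluate, count in-spec fraction) is the point:

```python
import numpy as np

rng = np.random.default_rng(1)

def fast_surrogate(hold_temp, heatup_rate):
    """Hypothetical trained TGML surrogate for maximum exotherm
    temperature [deg C]; evaluates in microseconds instead of hours,
    which is what makes the Monte Carlo loop below affordable."""
    return hold_temp + 12.0 * heatup_rate

EXOTHERM_LIMIT = 205.0  # hypothetical spec: exotherm must stay below this

# Sample typical process variability around a nominal cure cycle
N = 10_000
hold_temps = rng.normal(180.0, 2.0, size=N)      # [deg C]
heatup_rates = rng.normal(1.5, 0.3, size=N)      # [deg C / min]

exotherms = fast_surrogate(hold_temps, heatup_rates)
prob_in_spec = np.mean(exotherms < EXOTHERM_LIMIT)
```

Ten thousand surrogate evaluations run in milliseconds; the same study with full finite element simulations would take years of compute, which is why the speed-up makes such tools practical.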