11 July 2013
Continuous Systems Validation - System Simulation Configurations
Ready for more system simulation?
Well, strap in. This is where things get complicated.
In the last post, we dove into the progressive means of representing and assessing designs in the mechanical, electrical and software domains. We found that in each discipline, the representations evolve over the course of development. We also found that the means by which each design representation is assessed progress right alongside them.
Now why is that important?
Simulating Systems by Leveraging Discipline Specific Assessments
When it comes to simulating systems, you can always create a system model that is representative. You can use that model to simulate a system's performance. However, and here's the catch, the holistic system model will always lack the fidelity of discipline specific representations and assessments. Essentially, I'm saying that the system model will never predict the behavior of the mechanical design more accurately than the mechanical model.
Instead, what some organizations are trying to accomplish is to assemble system models from the disparate discipline specific models. But there are two challenges that await such organizations.
Connecting Simulations for a Holistic Picture
How do you connect these various design representations and assessments to each other? They need to interact. Here are a few examples to drive the point home.
- The logic of the controller should affect the behavior of the mechanical simulation. And the opposite should be true as well. With sensors, the resulting behavior of the mechanical simulation should act as an input for the controller software.
- When the board or processor overheats, sensors communicating with thermal management software should kick a fan on or circulate cooling liquids.
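The kind of back-and-forth described in these bullets can be sketched as a tiny co-simulation loop. This is a toy illustration, not any particular tool's API: the thermal model and the fan controller are hypothetical stand-ins, and the numbers are made up. The point is simply that each domain's output feeds the other's input every time step.

```python
# Toy co-simulation sketch (all models and numbers are hypothetical):
# a thermal model of a processor and a fan controller exchange signals
# each step, so controller logic affects thermal behavior and vice versa.

def thermal_step(temp, fan_on, dt=1.0):
    """Advance processor temperature one step: heat input minus cooling."""
    heat_in = 2.0                      # degrees added per step by the processor
    cooling = 3.0 if fan_on else 0.5   # fan removes heat much faster
    return temp + (heat_in - cooling) * dt

def controller_step(temp, threshold=70.0):
    """Fan controller logic: switch the fan on above the threshold."""
    return temp > threshold

def cosimulate(steps=100, temp=25.0):
    fan_on = False
    history = []
    for _ in range(steps):
        temp = thermal_step(temp, fan_on)   # mechanical/thermal domain
        fan_on = controller_step(temp)      # software/control domain
        history.append((temp, fan_on))
    return history

trace = cosimulate()
peak = max(t for t, _ in trace)
print(f"peak temperature: {peak:.1f}")  # stays bounded near the threshold
```

Run either model in isolation and you get a very different answer: the thermal model alone overheats indefinitely, and the controller alone has nothing to react to. Only the coupled loop shows the real, oscillating system behavior.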
And the list of interactions can go on and on and on. At some point before the design gets released, those interactions need to be vetted. And if you're spending big bucks on building prototypes or test labs, you don't want the first glimpse of true system performance to occur in the test environment. You want testing to be a final validation.
Connecting Hardware for a More Complete Picture
Furthermore, it's not just about connecting all the simulations together. As mechanical or electrical hardware becomes available, you want to be able to plug that in and perform tests. That means you will have various combinations of software-in-the-loop and hardware-in-the-loop. Which leads to the most fundamental and important concept in this three-part blog series.
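One common way to make that software-to-hardware swap painless is to put both behind a single interface, so the system test loop never knows which it is talking to. The sketch below uses hypothetical names; a real hardware-in-the-loop implementation would read a DAQ channel or serial port where the stub returns a canned value.

```python
# Mixing software- and hardware-in-the-loop behind one interface
# (all names are hypothetical, for illustration only).

from abc import ABC, abstractmethod

class SensorSource(ABC):
    @abstractmethod
    def read_temperature(self) -> float: ...

class SimulatedSensor(SensorSource):
    """Software-in-the-loop: values come from a simulation model."""
    def __init__(self, model_output):
        self._values = iter(model_output)
    def read_temperature(self) -> float:
        return next(self._values)

class BenchSensor(SensorSource):
    """Hardware-in-the-loop stand-in: in practice this would poll real
    test-bench instrumentation; here it just returns a fixed reading."""
    def read_temperature(self) -> float:
        return 68.0

def run_system_test(sensor: SensorSource, steps: int) -> list:
    """The system test loop is identical for simulated and real hardware."""
    return [sensor.read_temperature() for _ in range(steps)]

# Early in the program, drive the test from the simulation model...
print(run_system_test(SimulatedSensor([65.0, 66.5, 68.0]), 3))
# ...later, swap in bench hardware without touching the test loop.
print(run_system_test(BenchSensor(), 3))
```

The design choice here is the point: because `run_system_test` depends only on the interface, each component can graduate from simulation to hardware on its own schedule.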
Advancing with the Design Progression
Connect these concepts together and you realize that there is a massive and dynamic configuration management problem at the heart of progressively validating system performance. Here, let me show you a few examples to make my point.
- Early on in development, you need to connect a 2D sketch of the mechanical design with kinematics and digital calculations to a 2D logic diagram of a board to a UML software model.
- In detailed design, you need to connect a 3D model with flexible bodies to a 3D board assembly that has a thermal cooling simulation with partially completed software code.
- Close to testing, you want to hook up a mechanical physical prototype to a prototype board and compiled software code.
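To see why this becomes a configuration management problem, it helps to count. The toy record below (hypothetical names, representations taken from the examples above) shows that even three disciplines with three representations each yield 27 possible system simulation configurations, and a team has to track which one was actually used for every run.

```python
# Toy illustration of the configuration explosion (hypothetical names).
# Each discipline advances its representation independently; a system
# simulation configuration is one selection across all of them.

from itertools import product

representations = {
    "mechanical": ["2D sketch + kinematics", "3D model + flexible bodies",
                   "physical prototype"],
    "electrical": ["2D logic diagram", "3D board assembly + thermal sim",
                   "prototype board"],
    "software":   ["UML model", "partial code", "compiled code"],
}

# Every combination is a potential system simulation configuration...
all_configs = list(product(*representations.values()))
print(len(all_configs))  # 27 combinations from just three disciplines

# ...and each simulation run must record which configuration it used,
# or its results cannot be interpreted later.
run_log = {
    "concept_review":  ("2D sketch + kinematics", "2D logic diagram",
                        "UML model"),
    "detailed_design": ("3D model + flexible bodies",
                        "3D board assembly + thermal sim", "partial code"),
}
```

Real programs have far more than three disciplines and three fidelity levels, and the representations change continuously rather than in lockstep, which is exactly what makes the bookkeeping hard.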
All that is complex enough, but it gets more difficult. Organizations don't want to simulate system performance at just these three points, but continuously as designs progress from one representation to the next. What does that mean? Well, when the electrical engineering team goes from a 2D logic diagram to a 2D layout of the board, the system engineering team doesn't wait around for everyone else to increment their design representations. They want to incorporate that new representation into the early conceptual system simulation model right away.
Summary and Questions
Time to recap.
- Organizations increasingly want to simulate system performance continuously instead of doing it incrementally.
- Discipline specific simulation models provide greater fidelity than abstracted system models. Connecting these discipline specific simulation models provides the most accurate picture possible into system performance.
- Organizations also want to replace digital simulations with real operating hardware as it becomes available.
- This combination of trends presents a huge configuration challenge for system simulations, especially as different engineering disciplines incrementally improve the representations and assessments of their designs.
Those are my thoughts. What are yours? How continuously does your organization assess system performance today?
Take care. Talk soon. And thanks for reading.