Digital Engineering: We Need More than Just Sensor Data to Model the Real World

    Dave Dungate

    Topics:

    Digital Transformation, Data Driven Engineering

    When digitising products and assets, data science needs more scientists

    As every tech company will tell you, the future of engineering is digital. IIoT, AR, and analytics bring new capabilities to the design, manufacturing, operation and maintenance of everything from batteries to substations, production lines and oil rigs.

    At the heart of this is, of course, data. Sensors give us unprecedented insight into what’s happening in our products, allowing us to improve designs and predict problems. The ultimate goal for many is a digital twin.

    But there is a trap. Data is often sold as having all the answers, yet your sensor data is an incomplete approximation of the real world, not a perfect representation of it. Instead of simply collecting as much data as you can, approach these problems as a scientist would.

    Take a car battery. A data view would look at things like energy in, energy out, and temperature profile. It could then see whether temperature changes affect efficiency.

    A scientific view would start with a physics-based model of the battery, considering, for instance, how fluid dynamics affect the way heat propagates through the system. It would then collect the most valuable data points to understand what is really happening.
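
    To make the contrast concrete, here is a minimal sketch of the physics-first view: a lumped-capacitance thermal model of a single cell. It is a deliberately crude stand-in for the fluid-dynamics analysis described above, and every parameter value is an assumed placeholder rather than data from a real battery.

```python
# Minimal sketch of the physics-first approach: a lumped-capacitance
# thermal model of a battery cell. All parameter values are hypothetical
# placeholders, not measurements from a real cell.
import numpy as np

def simulate_cell_temperature(current, t_ambient, dt=1.0,
                              r_internal=0.05,  # ohms, assumed
                              mass=1.0,         # kg, assumed
                              c_p=900.0,        # J/(kg*K), assumed
                              h=0.5):           # W/K, assumed convective loss
    """Integrate dT/dt = (I^2*R - h*(T - T_amb)) / (m*c_p) with forward Euler."""
    temps = [t_ambient[0]]
    for i, amps in enumerate(current):
        heat_in = amps ** 2 * r_internal           # Joule heating, W
        heat_out = h * (temps[-1] - t_ambient[i])  # convective loss, W
        temps.append(temps[-1] + dt * (heat_in - heat_out) / (mass * c_p))
    return np.array(temps)

# Usage: a one-hour, 10 A discharge in a 25 C room.
current = np.full(3600, 10.0)
ambient = np.full(3600, 25.0)
temps = simulate_cell_temperature(current, ambient)
print(f"Predicted end-of-discharge temperature: {temps[-1]:.1f} C")
```

    Even a model this simple constrains the data problem: it tells you which measurements matter (current, ambient temperature) and what relationship between them the physics permits.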

    This has two advantages.

    Firstly, your models are closer to reality, so you understand your asset better, and can spot more useful insights. A pure data model will only get you so far, but combining data with a physics model allows you to infer much more about the system.

    Secondly, if the physics-based model is constrained by reality, that makes the data management task simpler. You immediately rule out correlations that may seem worth exploring in a database but are irrelevant to the actual problem or don’t fit the laws of physics. This saves time, eliminates bad data, and focusses minds.

    A salient warning comes from the infamous 2008 Wired article that predicted ‘the end of theory’, claiming that ‘with enough data, the numbers speak for themselves’. But the real world turned out to be messy, and relying on data without context led to big errors (remember Google Flu Trends?). Data is valuable, but it needs to be built into models by people who understand real-world constraints.


    Lessons from space missions on real-world modelling

    For an example of where this is done rigorously, we can look at the space industry, which needs to accurately model how spacecraft will behave before sending them into space, where changes become tricky.

    When Tessella was developing the control algorithms for ESA’s Solar Orbiter mission, we broke down the overall spacecraft, its sensors and actuators, and its environment, into separate elements. For each element, we’d start with a basic model of what that part of the craft looked like. We’d look at mass, dimensions, the physics of how thrusters and actuators work, and so on, to build models based on knowledge of dynamics in space.
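
    As a toy sketch of that decomposition, the snippet below gives each element its own simple model built from basic physical properties. The class names and numbers are invented for illustration, not Solar Orbiter values.

```python
# Hedged sketch of per-element modelling: each part of the craft gets a
# simple physics model; the spacecraft composes them. Values are made up.
from dataclasses import dataclass

@dataclass
class Thruster:
    max_thrust: float   # N
    lever_arm: float    # m, signed distance from the centre of mass

    def torque(self, throttle: float) -> float:
        """Torque about the centre of mass for a throttle in [0, 1]."""
        return self.max_thrust * throttle * self.lever_arm

@dataclass
class Spacecraft:
    mass: float         # kg
    inertia: float      # kg*m^2, single axis for simplicity
    thrusters: list

    def angular_accel(self, throttles) -> float:
        """Newton's second law for rotation: alpha = sum(torques) / J."""
        total = sum(t.torque(u) for t, u in zip(self.thrusters, throttles))
        return total / self.inertia

craft = Spacecraft(mass=1800.0, inertia=1000.0,
                   thrusters=[Thruster(10.0, 1.2), Thruster(10.0, -1.2)])
print(craft.angular_accel([0.5, 0.0]))  # rad/s^2 from firing one thruster
```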

    The first model would be based on initial data about the mission design, which suffers from many uncertainties at that point. These uncertainties are incorporated into our models and used as the basis for designing our measurement and control algorithms, so that performance is robust to them. As the spacecraft design matures, further data arrives from the design team, which we feed into our model, reducing uncertainty and allowing the algorithms to be iterated and optimised.
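
    One common way to make a design robust to such uncertainty is to sample the uncertain parameters and check performance across the samples. The sketch below illustrates the idea with a hypothetical slew-time requirement; the parameter ranges and the performance formula are assumptions, not mission figures.

```python
# Illustrative Monte Carlo robustness check (not Solar Orbiter code).
# Early design data only bounds the parameters, so we sample the bounds
# and confirm the worst sampled case still meets the requirement.
import random

def slew_time(inertia, torque, angle_rad=0.5):
    """Rest-to-rest slew under bang-bang torque: t = 2*sqrt(theta*J/tau)."""
    return 2.0 * (angle_rad * inertia / torque) ** 0.5

random.seed(0)
times = []
for _ in range(10_000):
    inertia = random.uniform(900.0, 1100.0)  # kg*m^2, assumed +/-10%
    torque = random.uniform(0.08, 0.12)      # N*m, assumed +/-20%
    times.append(slew_time(inertia, torque))

print(f"Nominal slew: {slew_time(1000.0, 0.1):.0f} s, "
      f"worst sampled: {max(times):.0f} s")
# If the worst case violates the requirement, redesign the control law
# now, rather than letting flight data reveal the problem later.
```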

    Gradually, we’d add sophistication to the model to capture second-order effects. A thruster model may use simple Newtonian physics to project how actuations translate into thrust. But in reality, some of the plume may interact with elements of the spacecraft, causing real-world disturbances that we have to take into account in our algorithm design.
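
    In code, layering in such a second-order effect might look like the sketch below, where an assumed plume-impingement loss is added on top of the first-order Newtonian thrust model. The 3% figure is a placeholder, not a measured value.

```python
# Sketch of refining a first-order thruster model with a second-order
# effect. The impingement fraction is a hypothetical placeholder.
def thrust_force(mass_flow_rate, exhaust_velocity):
    """First-order Newtonian model: F = mdot * v_e."""
    return mass_flow_rate * exhaust_velocity

def effective_thrust(mass_flow_rate, exhaust_velocity, plume_loss=0.03):
    """Second-order refinement: a small fraction of the plume impinges on
    spacecraft structure, reducing net thrust and introducing disturbances
    the control design must absorb."""
    return thrust_force(mass_flow_rate, exhaust_velocity) * (1.0 - plume_loss)

print(effective_thrust(1e-5, 2000.0))  # ~0.0194 N against an ideal 0.02 N
```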

    Once we have an initial design, we’d test it through simulations, checking whether it performs as expected. These almost always throw up things that have been missed, but a good understanding of the underlying physics allows you to spot and address the problem quickly: you can see where your assumptions may be wrong and how to evolve the design.
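
    A minimal version of such a test might look like the following sketch: a simple single-axis controller simulated against rigid-body dynamics, with a pass/fail check against a pointing requirement. The gains, inertia and tolerance are illustrative assumptions.

```python
# Toy closed-loop simulation test: a PD attitude controller on a
# single-axis rigid body. Gains and parameters are illustrative only.
def simulate(kp=2.0, kd=100.0, inertia=1000.0, dt=0.1, steps=5000,
             theta0=0.5):
    theta, omega = theta0, 0.0             # rad, rad/s: start off-pointing
    for _ in range(steps):
        torque = -kp * theta - kd * omega  # control law under test
        omega += (torque / inertia) * dt   # rigid-body dynamics
        theta += omega * dt                # semi-implicit Euler step
    return theta

final_error = simulate()
assert abs(final_error) < 1e-3, "design misses the pointing requirement"
print(f"Final pointing error: {final_error:.2e} rad")
```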

    This approach is already widely deployed for designing cutting-edge products where small improvements have big benefits, such as spacecraft or turbine blades. These same physics-based models can now play an ongoing role in monitoring complex systems and modelling changes to them.

    Applying lessons of systems modelling to digital engineering

    Such projects involve looking at the system before us and asking, ‘what do we want to know?’. Then it’s a case of breaking the requirements down across the system, designing a suite of algorithms that work both in isolation and together, and then integrating and testing them.

    If we are building towards a digital twin, we will need to combine many of these models so we can simulate how changes propagate through the system. The more accurately they reflect the real world, the more likely the digital version is to provide trustworthy predictions.
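
    As a toy illustration, the sketch below chains two hypothetical component models so that a change to one propagates through to the other, which is the behaviour a digital twin needs at much larger scale. Both models and all numbers are invented.

```python
# Sketch of chaining component models so a change propagates through a
# simplified "twin". Models and coefficients are hypothetical stand-ins.
def pump(speed_rpm):
    """Pump model: flow rises with speed (assumed linear coefficient)."""
    return 0.002 * speed_rpm  # m^3/s

def heat_exchanger(flow, t_in=80.0, t_cool=20.0):
    """Cooling improves with flow (assumed first-order effectiveness)."""
    effectiveness = flow / (flow + 0.5)
    return t_in - effectiveness * (t_in - t_cool)  # outlet temperature, C

def twin(pump_speed):
    return heat_exchanger(pump(pump_speed))

# Ask the twin a 'what if': raise pump speed 10% and watch the outlet
# temperature respond through the chained physics models.
print(twin(1500.0), twin(1650.0))
```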

    Sensors will be critical for gathering the data. But building the models themselves, as well as identifying what sensors are needed, requires an understanding of both the data and the underlying science.

    Those trained in computer science often look for correlations, which they use to learn about the physics of the system. Those trained in scientific disciplines already know the physics and can build the models up from first principles. Both can add value, but in our rush to digitalisation, it’s important not to overlook scientists and engineers who understand how to model the world before them.
