Why the Built Environment Needs Innovations in Data-Centric Engineering

Author: Dr Ramaseshan Kannan - Head of Computational Science at Arup

Date published

23 May 2024

This blog offers informed opinions and perspectives relating to nascent technologies in data-centric engineering. Dr Ramaseshan Kannan (Head of Computational Science at Arup) discusses how, despite first appearances, the built environment requires rapid developments in Data-Centric Engineering.

It is no secret that the built environment faces enormous challenges globally. On the one hand, the sector is responsible for up to 40% of all carbon emissions; on the other, there is increasing demand on our built infrastructure — from rapid urbanisation and from natural calamities such as earthquakes, floods, and hurricanes. The result is a two-pronged challenge of decarbonisation and resilience. It is here that Data-Centric Engineering — which I see as an interdisciplinary field spanning computational science, data science, computational statistics, optimisation, engineering and more — has a crucial role to play, as I argue in this blog.

The built environment is often seen as a field that doesn’t need to be on the cutting edge of science. But this perception couldn’t be further from reality! The sector presents a variety of complex problems that demand a range of computational techniques. A number of these are unsolved, or sit at the research frontier at low Technology Readiness Levels; as such, the techniques are effectively non-existent or unavailable to the industry. For their part, research funding bodies and industry must realise these are not Instagram-style problems — rather they need ‘deep tech’ innovation and investment.

So, what kinds of AI and science does the built environment need to tackle emerging challenges? Through the lens of my work at Arup and our collaborators from academia and industry, here are a few examples.

In each case, I present a set of broad problems in the application domain, and hypothesise an (incomplete!) set of innovations we might need to leverage.

We have ageing assets whose design, condition, or degradation is unknown. Our products are one-off prototypes, and the consequences of ‘getting it wrong’ are highly adverse.

  • Scientific problems: new ways to blend physics and data; new techniques to predict and prognose; alternative ways to reconstruct system behaviour from noisy, disparate data sources; techniques to simulate and understand physical behaviour — especially in safety-critical scenarios.
  • Innovations: new forms of uncertainty quantification; scientific machine learning; statistics; computational mechanics; novel material models; multi-modal learning.
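To make “blending physics and data” concrete, here is a minimal sketch (all numbers invented) that estimates an unknown stiffness from noisy load/deflection measurements by combining a known physics model, Hooke’s law, with a least-squares fit and a crude residual-based uncertainty estimate. Real problems use far richer models, but the pattern is the same: a physical model as the backbone, data to fill in the unknowns, and a measure of how confident we can be.

```python
import numpy as np

# Illustrative sketch only: recover the stiffness of a structural element from
# noisy load/deflection data by combining physics (Hooke's law, F = k * x)
# with a least-squares fit. All values are invented for illustration.
rng = np.random.default_rng(0)

k_true = 2.5e3                                    # N/m, "unknown" stiffness
loads = np.linspace(100, 1000, 20)                # applied loads (N)
defl = loads / k_true + rng.normal(0, 2e-3, 20)   # noisy deflections (m)

# Least-squares estimate of k from the physics model F = k * x
A = defl[:, None]
k_hat = float(np.linalg.lstsq(A, loads, rcond=None)[0][0])

# Residual-based standard error on the estimate: a crude uncertainty proxy
resid = loads - k_hat * defl
se = float(np.sqrt(resid @ resid / (len(loads) - 1) / (defl @ defl)))
print(f"estimated stiffness: {k_hat:.0f} +/- {se:.0f} N/m")
```

The estimate lands close to the true 2,500 N/m; scientific machine learning generalises this idea to nonlinear models, partial differential equations, and principled Bayesian uncertainty.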
We want to prototype designs faster to aid better exploration of the design space — especially in the early stages of a project — to reduce embodied carbon by using materials more efficiently in Pareto-optimal settings.
  • Scientific problems: new ways to rapidly simulate and optimise across multiple disciplines, drawing on high-fidelity, machine-learning-enabled simulations.
  • Innovations: new ways to reduce computational requirements through surrogates or reduced order models; probabilistic machine learning to quantify uncertainties; techniques to solve combinatorial optimisation; data-efficient learning in high dimensional spaces.
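As a toy illustration of the surrogate idea, the sketch below replaces an “expensive” solver (here a stand-in closed-form beam formula; in practice a finite-element run taking minutes or hours) with a polynomial fit trained on a handful of runs, then sweeps thousands of candidate designs in milliseconds. Every parameter is assumed for illustration.

```python
import numpy as np

# Illustrative sketch only: train a cheap surrogate on a few "expensive"
# solver runs, then use it to sweep the design space rapidly.
def expensive_solver(depth_m: float) -> float:
    """Midspan deflection of a simply supported beam vs section depth.
    A closed-form stand-in for a long-running simulation (values assumed)."""
    E, L, w, udl = 30e9, 10.0, 0.3, 20e3   # modulus, span, width, load
    I = w * depth_m**3 / 12.0
    return 5 * udl * L**4 / (384 * E * I)  # classic 5wL^4 / 384EI

# Run the solver at only a handful of training designs...
train_x = np.linspace(0.3, 0.9, 6)
train_y = np.array([expensive_solver(d) for d in train_x])

# ...fit a cubic surrogate in log space...
coeffs = np.polyfit(np.log(train_x), np.log(train_y), 3)

def surrogate(depth_m):
    return np.exp(np.polyval(coeffs, np.log(depth_m)))

# ...and sweep 10,000 candidate designs cheaply, picking the shallowest
# section that just meets a hypothetical 20 mm deflection limit.
candidates = np.linspace(0.3, 0.9, 10_000)
best = candidates[np.argmin(np.abs(surrogate(candidates) - 0.02))]
```

Probabilistic surrogates (e.g. Gaussian processes) extend this by reporting how uncertain each prediction is, which tells the engineer where the cheap model can be trusted and where another solver run is needed.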
We have sparse data, and our designs arise from a creative process that owners don’t want to share. Each building, bridge, or city masterplan is unique.
  • Scientific problems: learning from spatial and geometric data such as 3D models; leveraging trained models across projects, and doing so while respecting intellectual property.
  • Innovations: new forms of transfer learning, learning on graphs, differential privacy, generative modelling.
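To hint at what “learning on graphs” means here, the sketch below performs a single message-passing step over a small invented structural graph (nodes as joints, edges as members). This averaging of neighbour features is the core operation inside the graph neural networks that learn from 3D models.

```python
import numpy as np

# Illustrative sketch only: one message-passing step on a tiny structural
# graph. Topology and features are invented; real models stack many such
# steps with learned weights.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # a small braced frame, say
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
A += np.eye(n)                        # self-loops so a node keeps its own signal
D_inv = np.diag(1.0 / A.sum(axis=1))  # degree normalisation

x = np.array([[1.0], [0.0], [0.0], [0.0]])  # per-node feature, e.g. a load flag
h = D_inv @ A @ x                     # each node now averages its neighbourhood
```

After one step, information about the loaded joint has diffused to its neighbours; stacking steps propagates it across the whole structure, which is what lets such models reason about geometry and connectivity.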
Our assets are often systems of systems, ingesting and producing vast amounts of disparate data. We must use this information to make better decisions.
  • Scientific problems: ways to mix many kinds of data — sensors, images, video, meteorological, anthropic, geospatial, geotechnical, environmental — to support decision-making for growing urbanisation and infrastructure, and to become resilient to climate events.
  • Innovations: multi-modal learning, multi-fidelity, multi-hierarchical modelling, belief networks and much more!
The infrastructure we design and maintain faces ever-changing demands, whether it be a century-old bridge carrying far heavier traffic than its original specification, or an operator looking to provision offshore wind farms. We must deal with unpredictability in how assets are utilised.
  • Scientific problems: better ways to predict and account for uncertainty in demand.
  • Innovations: better techniques for demand forecasting, uncertainty quantification, agent-based modelling, stochastic optimisation, and sequential decision problems.
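As one minimal form of uncertainty-aware demand forecasting, the sketch below builds a bootstrap prediction interval for future traffic demand from an invented historic series, producing a distribution of forecasts rather than a single point estimate.

```python
import numpy as np

# Illustrative sketch only: residual-bootstrap interval for 2030 demand on a
# hypothetical asset. The historic counts are invented for illustration.
rng = np.random.default_rng(1)
years = np.arange(2000, 2024)
demand = 10_000 + 250 * (years - 2000) + rng.normal(0, 400, years.size)

# Fit a simple linear trend and collect its residuals
slope, intercept = np.polyfit(years, demand, 1)
fitted = slope * years + intercept
resid = demand - fitted

# Refit on many resampled series to get a distribution of 2030 forecasts
forecasts = []
for _ in range(2000):
    s, c = np.polyfit(years, fitted + rng.choice(resid, years.size), 1)
    forecasts.append(s * 2030 + c)
forecasts = np.array(forecasts)

lo, hi = np.percentile(forecasts, [5, 95])
print(f"2030 demand: {forecasts.mean():.0f} (90% interval {lo:.0f} to {hi:.0f})")
```

The interval widens the further we extrapolate, which is exactly the honesty an asset owner needs when deciding between intervention now and monitoring for longer.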

I’ve covered only a small fraction of the possible areas.

We need a fundamental shift in the technology stack for dealing with data and computation at scale. As an example, a typical parametric study to solve an inverse problem (or to train a surrogate) might involve several thousand runs of a simulation solver and a large accompanying dataset. Traditional supercomputing and HPC facilities, whilst popular within academia, are uncommon in the built environment industry. The flexibility and agility afforded by public cloud services appear to be the best answer to this problem. The advantages of high-performance computing on the cloud are hard to ignore; however, realising them needs modernisation, upskilling, and capability building. As an example, Arup works closely with AWS to solve some of the above problems at scale.
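The parametric-study pattern itself is easy to sketch: map a solver over thousands of cases and gather the results. Below, a trivial stand-in “solver” is fanned out over a local thread pool; the same map-over-cases shape is what scales up to process pools and cloud batch services when each case is a long-running simulation job.

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative sketch only: fan a parametric study out across workers. The
# "solver" here is a toy closed-form formula; a real study would dispatch
# thousands of long-running simulation jobs to cloud batch infrastructure.
def run_simulation(case):
    span, load = case
    deflection = 5 * load * span**4 / (384 * 30e9 * 1e-3)  # toy stand-in
    return {"span": span, "load": load, "deflection": deflection}

# 60 design cases: spans of 5..24 m under three hypothetical load levels
cases = [(span, load) for span in range(5, 25) for load in (10e3, 20e3, 30e3)]

with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(run_simulation, cases))

worst = max(results, key=lambda r: r["deflection"])
```

Swapping the thread pool for a process pool, or for a cloud job queue, changes the scheduling but not the shape of the workflow, which is why the modernisation effort is largely about plumbing and skills rather than new algorithms.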

So, with a rich tapestry of applied and fundamental problems to be solved, work is needed at both an application level as well as methodological level.

Dr Ramaseshan Kannan is Head of Computational Science at Arup.

Additional information

Format:
Blog
Publisher:
IStructE

Tags

Software, Blog, Sustainability & Environment, Conceptual Design, Buildability
