Author: C. Rogers (CREA Consultants Ltd)
1 March 2016
The underlying questions are: how do we know that the results are reasonably correct; and who is responsible if there is a failure? The answer to the second question is that the Chartered Engineer leading the design project is responsible; the answer to the first is quality assurance. For engineering analysis, the major tools for quality assurance are the twin processes of verification and validation (V&V). In very simple terms, verification is the demonstration that the mathematics and numerics are correct; validation is the demonstration that the idealisation of the physics is correct.
As V&V underpin any quality assurance system, this paper discusses the need for V&V and indicates who is primarily responsible for the two processes: software developer or design engineer.
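Verification in this sense can be made concrete with a mesh-refinement (convergence) study against a known analytical solution. The sketch below is illustrative only and not drawn from the paper: it assembles a fixed-free axial bar under uniform load (unit length, stiffness and load are all assumed for the example) with linear finite elements, and checks that the discretisation error in the support stress halves with each refinement, as first-order theory predicts.

```python
import numpy as np

def support_stress(n_elem, L=1.0, EA=1.0, q=1.0):
    """Axial stress resultant at the fixed end of a fixed-free bar under a
    uniform axial load q, modelled with n_elem linear finite elements."""
    h = L / n_elem
    n = n_elem + 1
    K = np.zeros((n, n))
    f = np.zeros(n)
    ke = EA / h * np.array([[1.0, -1.0], [-1.0, 1.0]])  # element stiffness
    fe = q * h / 2.0 * np.array([1.0, 1.0])             # consistent load vector
    for e in range(n_elem):
        K[e:e + 2, e:e + 2] += ke
        f[e:e + 2] += fe
    u = np.zeros(n)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])  # node 0 fixed
    return EA * (u[1] - u[0]) / h              # constant stress in element 1

exact = 1.0  # q * L at the support, with unit properties
for n in (2, 4, 8, 16):
    err = abs(support_stress(n) - exact)
    print(f"{n:3d} elements: error = {err:.4f}")  # error halves each refinement
```

If the error failed to shrink at the expected rate, that would point to a coding or formulation error in the element — precisely the class of defect that verification is meant to catch, independently of whether the physical idealisation (validation) is sound.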
Not so long ago, a journalist asked me an interesting question: “Do you believe the work of the structural engineer can ever be replaced by artificial intelligence?” I think she was somewhat taken aback when I answered “Yes”. But before the esteemed readership of this magazine floods Verulam with missives of indignation, let me explain that I qualified my answer. I postulated that while almost all the technical work undertaken by structural engineers at every level could, in theory, be overtaken by artificial intelligence (and that it would be highly complacent of us as a profession to assume our more “left-brained” tendencies were irreplaceable), the art of the structural engineer would always remain.

Which begs the question: as structural engineers, what do we really mean by design? When I was at university over 30 years ago, much of our coursework was taken up learning the hard, number-crunching ways of analysing structures, while “design” lessons generally involved practising the use of codes and standards to select and detail structural elements. For the 21st-century structural engineer, these are processes which can now be almost entirely automated. Our real value comes in understanding when and how to apply the increasingly complex tools at our disposal to deliver value and creativity to our clients and stakeholders.

So in this special issue of The Structural Engineer, we set out to describe how far our profession has come, and where it might be going, in the development of digital design tools, and what this might mean for structural engineers of the future.
Finite-element analysis involves inherent approximations and numerical errors. In addition to these, the increasing size of structural models and the use of automated workflows for creating them can lead to hidden user errors in these models. In order for an engineer to have confidence in the analysis results, it is necessary to be aware of how these errors manifest themselves in models, what impact they have on analysis results and, most importantly, how they can be detected. We present novel numerical techniques that the analyst can use to “debug” their models and verify the accuracy of their analysis results. These techniques have been implemented in software and have been successfully used by practising engineers working on real-life projects.
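The paper's own debugging techniques are not reproduced here, but one widely used check of this general kind can be sketched: an eigenvalue analysis of the assembled stiffness matrix, which exposes hidden modelling errors such as a forgotten support or a disconnected member as near-zero eigenvalues (mechanism modes). The function name, tolerance and the toy two-spring model below are all illustrative assumptions.

```python
import numpy as np

def mechanism_modes(K, tol=1e-8):
    """Return the eigenvalues of K and the indices of the near-zero ones.
    Each near-zero eigenvalue signals an unrestrained rigid-body or
    mechanism mode -- typically a missing support or broken connectivity."""
    w = np.linalg.eigvalsh(K)                 # K is symmetric
    scale = max(np.abs(w).max(), 1.0)         # relative zero threshold
    return w, np.where(np.abs(w) < tol * scale)[0]

# Two unit springs in series: three axial degrees of freedom.
K_free = np.array([[ 1.0, -1.0,  0.0],
                   [-1.0,  2.0, -1.0],
                   [ 0.0, -1.0,  1.0]])

# Forgotten support: the whole chain can translate rigidly.
w, bad = mechanism_modes(K_free)
print("unrestrained model:", len(bad), "mechanism mode(s)")  # 1

# Support applied at DOF 0 (row/column removed): no mechanisms remain.
w, bad = mechanism_modes(K_free[1:, 1:])
print("restrained model:  ", len(bad), "mechanism mode(s)")  # 0
```

A check like this is cheap on small models and, because it interrogates the assembled matrix rather than the input data, it catches errors introduced anywhere in an automated model-building workflow.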