Civil Engineer blog

The construction sector's interoperability challenge

09 April 2018
David Owens, Senior Consultant Digital Transformation at Costain, explores the challenges around data exchange in infrastructure and what we can learn from the past.
Interoperability is defined as 'the ability of computer systems or software to exchange and make use of information' and is an emotive subject for many of the people I've met in 'digital' circles.

Most of us as engineers can relate immediately to the challenges of managing asset data, and there is a view that a single interoperable file could be the holy grail for our industry's digital problems.

The solution for the exchange of data in the future won’t be the solution we’re working on today. We’re on a journey and I’m certain that greater understanding of computer science at all levels is needed in construction – it needn’t be a specialist subject.

Collaboration from within and from outside of the industry will be key to the development of interoperable asset data we can all benefit from.

I was recently part of an industry roundtable on this topic, convened by ICE and Bluebeam, which debated the challenges and possible solutions. As a follow-up, this blog is aimed at those engaged in shaping the future of the industry I love and am equally frustrated by.
 

Roundtable video: interoperability

As the industry continues to grapple with this challenge, it's worth taking a look back at the evolution of the internet for lessons in adopting a 'data-centric, system-agnostic' approach.

My own philosophy has been shaped by a number of milestones and visions. They are worth highlighting as we take a series of baby steps on the journey towards interoperable asset data…
 

World Wide Web Consortium (W3C)

The history of the internet is fascinating. The launch of the W3C happened in my 'recent' memory: I was just finishing secondary school in the mid-1990s when it came along, changed everything and became the cornerstone of our modern world.

Its foundation was interoperability, built on fundamentals created by Sir Tim Berners-Lee. Several browsers were born which used these principles to interpret code (HTML) and present it to the reader. I continue to use the wonderfully simple W3Schools resource to help my understanding of the workings of the World Wide Web.
 

Open Data Institute

Nearly 25 years later, the internet continues to impact all our lives, from Google and Amazon to Facebook and Twitter. The ability to code is now taught in secondary school (typically in Python; I was lucky to do a little C# at university).

The Open Data Institute is the champion of interoperable data across the world, advocating its benefits for both commerce and democracy.

Its vision speaks of the good that data can do, but also the harm that can be caused: "Our vision is for people, organisations and communities to use data to make better decisions and be protected from the harmful impacts of its inappropriate use and distribution."
 

BuildingSMART

In the same 25-year period, BuildingSMART has been the champion of interoperability in our own specific domain.

BuildingSMART maintains (on our behalf) a 'data standard' (ISO 16739) and seeks to provide a definition and schema for data in construction. You may already have heard of and seen 3D graphical models in '.ifc' (Industry Foundation Classes) format, but what is most interesting and least exploited are the construction processes the schema defines.
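To make the idea of a shared, tool-agnostic schema concrete, here is a minimal sketch of querying an IFC model in Python, assuming the open-source ifcopenshell library and a hypothetical model.ifc file:

```python
# Minimal sketch: reading an IFC (ISO 16739) model with the open-source
# ifcopenshell library. "model.ifc" is a hypothetical example file.
import ifcopenshell

model = ifcopenshell.open("model.ifc")

# The schema identifier tells us which version of the IFC standard the file uses.
print(model.schema)  # e.g. "IFC4"

# Entities are queried by their IFC type, independently of the authoring tool.
for wall in model.by_type("IfcWall"):
    print(wall.GlobalId, wall.Name)
```

Because the entity types and attributes come from the IFC schema rather than from any one authoring tool, the same query works on a model exported from any compliant package.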

My personal favourites are Risk and Warranty: given these two, why couldn't that information be passed easily between parties, as we are obliged to do under the HSE Construction (Design and Management) Regulations?
 

Data for the public good

The National Infrastructure Commission's recent report speaks of the virtues of data and the productivity improvements that can be won, stating that new data-driven technologies currently contribute significant benefits to the UK economy of up to £50bn.

But simply having the data is not enough. It needs to be shared across the public and private sectors with the appropriate levels of secure access to enable its value to be fully leveraged for public benefit.
 

What I have learned and hope you will too

  • Databases are very good at holding… data
  • The data schema or data model is more important than its container
  • Extract, Transform, Load (ETL) tools are very, very useful (a minimal sketch follows this list)
  • Data interoperability may eventually come in the shape of an Application Programming Interface (API) (well, it worked for Jeff Bezos at Amazon)
  • Be grateful to all of the collaborators out there
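To ground the ETL and schema points above, here is a minimal sketch; the file name, column names and SQLite target are hypothetical examples, not any particular tool's format:

```python
# Minimal ETL sketch: extract asset records from a CSV export, transform them
# to a shared schema, and load them into SQLite. The file name, column names
# and target table are hypothetical examples.
import csv
import sqlite3

# Extract: read rows from a CSV export produced by one proprietary tool.
with open("assets_export.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Transform: map tool-specific column names onto a common, agreed schema.
records = [
    (row["Asset ID"], row["Description"], float(row["Install Cost"]))
    for row in rows
]

# Load: store the records in a container; the schema matters more than the database.
conn = sqlite3.connect("assets.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS asset (asset_id TEXT PRIMARY KEY, description TEXT, install_cost REAL)"
)
conn.executemany("INSERT OR REPLACE INTO asset VALUES (?, ?, ?)", records)
conn.commit()
conn.close()
```

The interesting part is the transform step: the mapping onto an agreed schema, not the choice of container.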
Why then, when something as complicated as the internet dominates our lives, can't we develop a solution to interoperability? What are the market conditions we need to break the silos of proprietary file types? This is the challenge to industry.


Further information

W3C
Open Data Institute
BuildingSMART
National Infrastructure Commission
Jeff Bezos Amazon
ICE digital transformation campaign
Bluebeam: Strxur