Reports of OT’s takeover by IT are greatly exaggerated

Posted on 8 Jan 2019 by Jonny Williamson

The increasing influence of IT in what were traditionally operational areas of manufacturing has led some to question whether the distinction between OT and IT no longer exists.

Inevitably, your view on that will depend on circumstance, but it might also explain some of the reluctance among traditional manufacturers to embrace the new digital age, perhaps resisting the IT tanks that they see parked on their lawn.

Dave Laurello, CEO of Stratus Technologies, is in a perfect position to judge. His company, based in Massachusetts, focuses on using IT products to protect manufacturers from points of failure in their systems, particularly through ‘Edge’ computing that provides real-time intelligence on machine performance.

The Manufacturer’s Editorial Director, Nick Peters, met up with Laurello on a recent visit to London.

Has IT finally made OT redundant?

Dave Laurello: No. Much like consumers, OT operatives are becoming increasingly IT literate, and future generations of OT will be digital natives, expecting the equipment they use to be IT enabled for operational and maintenance purposes.

I think the traditional lines between OT and IT will become blurred over time, not overnight. And I really don’t think it’s a reluctance on the part of OT to accept IT, I think it’s that the OT folks have a set of priorities that are around their business.

They’re judged and held accountable for what they produce, and sometimes they think bringing in new technologies can be a risk to that. They want IT enablers for OT applications to be proven and they want to see the benefit of changes immediately.

Of course, IT is no stranger on factory floors; its use has been growing for years across a range of OT activities. It’s now more a question of harnessing its full potential.

When you look at a lot of manufacturing facilities today at the plant level, they have a lot of server sprawl. They’re running programmes, they’re looking at the data, but it’s a very human-involved process: they monitor a process then a human being will go take action.

What we’re finding is that OT folk are sensitive to having many applications running on different servers and on PCs. This creates a management and upgrade challenge, and of course a patching and security challenge.

Clearly there is an opportunity for them to virtualise these functions because it makes their operation more efficient. An OT person can relate to such improvements, since overall efficiency is one of the ways they are judged.

This article first appeared in the Dec/Jan 2019 issue of The Manufacturer magazine.

Virtualising on highly available or fault-tolerant servers provides a reliable and scalable platform for bringing in newer applications, such as real-time analytics and real-time artificial intelligence – virtualisation and simplification at the computing level give you a base platform that you can build upon.

A lot of our customers are going to that first stage very willingly because they see it as an operational efficiency, not as some revolutionary technology or changing the process that they have.

We in the UK are a bit down on ourselves for not being as fast at adopting new technologies. That was certainly true in the third industrial revolution – we have something like 70 robots per 10,000 workers, while Germany has around 350. But do you think the world is racing ahead on industrial digitalisation and leaving Britain in the dust, or are we not alone in our slight reluctance and hanging back?

My sense is that it’s a cautious adoption among smaller companies while some of the larger companies are doing a lot of proof of concepts (POCs).

A lot of companies are struggling with ‘what do you do with the data that you collect?’ The hard part is not collecting the data; it’s determining what to do with it, which needs a lot of data scientists to come in and start looking for patterns.
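As an editorial aside, the kind of pattern-finding Laurello alludes to can start very simply – for instance, clustering machine readings to see whether they fall into distinct operating regimes. The sketch below is illustrative only; the readings, cluster count and column choices are invented, not drawn from any Stratus deployment.

```python
# A minimal sketch of "looking for patterns": clustering sensor snapshots
# into operating regimes. All figures are invented for illustration.
from sklearn.cluster import KMeans

# Each row is one snapshot of a machine: [temperature (C), vibration (mm/s)]
readings = [
    [62.1, 0.9], [63.0, 1.0], [61.8, 0.8],   # looks like normal running
    [79.5, 2.4], [80.2, 2.6], [78.9, 2.5],   # looks like a hotter, rougher regime
]

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(readings)
print("regime per snapshot:", model.labels_)
print("regime centres:", model.cluster_centers_)
```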

I still think that a lot of these technologies are in a very early phase. Maybe some countries are more aggressive with POCs than others, but I do think this caution is widespread.

The time will come when we hit an adoption curve that will ramp up very quickly. I think the key is for people to be ready when that adoption curve hits by doing their POCs now.

Perhaps we are being unfair to companies that are reluctant to adopt 4.0 technologies. Perhaps they are simply fed up with all the jargon being thrown at them.

There are a lot of technologies coming at the OT people. We’re talking to them about Edge computing, cloud computing, hybrid models – for OT folks in some smaller firms, that’s not easy to take in all at once. I think that also drives some of the cautiousness.

Stratus is really making a thing about Edge computing, while others talk about the Cloud. The Edge is computers gathering immediately relevant data from machines; the Cloud is where deeper data analysis is done. With bandwidths widening thanks to 5G, won’t everything soon be done in the Cloud?

We look at it this way. There are all these smart devices out there collecting data. Right now, there are more smart devices than there are people in the world, and by 2020 the number is estimated to be somewhere between 30 and 50 billion. Pick the number, but it’s a lot. In 2020, those 30 billion devices will generate 11 zettabytes of data. (A zettabyte is a billion terabytes.) That’s just 2020.
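As a rough back-of-the-envelope illustration – using only the round figures quoted above, not any particular forecast – those numbers imply roughly a gigabyte of data per device per day:

```python
# Back-of-the-envelope arithmetic using the round figures quoted above.
devices = 30e9                          # ~30 billion connected devices in 2020
zettabyte = 1e21                        # bytes: a billion terabytes (1e9 * 1e12)
total_data = 11 * zettabyte             # ~11 ZB generated over the year

per_device_year = total_data / devices  # bytes per device per year
per_device_day = per_device_year / 365  # bytes per device per day

print(f"{per_device_year / 1e9:.0f} GB per device per year")  # ~367 GB
print(f"{per_device_day / 1e9:.1f} GB per device per day")    # ~1.0 GB
```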

When you start talking about those magnitudes of data, there are a few things you have to think about. The first is that there’s not enough bandwidth in the world to bring all that data to the Cloud, especially as volumes ramp up even further.

The second is latency. If you’re trying to do things in real time, sometimes the round trip to the Cloud takes too long. The third is that it’s expensive to compute in the Cloud, especially if you’re being charged per transaction. Some studies show computing at the Edge is about 30–35% less expensive than computing in the Cloud.

That’s the basic situation we have now: real-time processing at the Edge, deep analysis in the Cloud. What will ultimately happen is that there will be a proliferation of Edge computers in plants. How do you manage all these Edge computers? We’re now seeing a lot of Cloud vendors saying, we can do that. We can provide that single pane of glass that allows you to manage all these devices.
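As an editorial aside, a minimal sketch of that division of labour might look like the following. The function names, threshold and readings are hypothetical; the point is simply that the raw stream is handled locally and only a compact summary, plus anything anomalous, makes the round trip to the Cloud.

```python
import statistics

ANOMALY_THRESHOLD = 2.0   # hypothetical: flag readings 2+ standard deviations out

def process_at_edge(readings):
    """Run the real-time check locally; return only what is worth sending upstream."""
    mean = statistics.mean(readings)
    stdev = statistics.stdev(readings)
    anomalies = [r for r in readings if abs(r - mean) > ANOMALY_THRESHOLD * stdev]
    # A compact summary plus any anomalies, instead of the full raw stream
    return {"count": len(readings), "mean": mean, "stdev": stdev, "anomalies": anomalies}

def send_to_cloud(summary):
    """Placeholder for whatever transport the plant actually uses (MQTT, HTTPS, ...)."""
    print("forwarding to cloud:", summary)

# One machine's vibration readings over a short window (made-up values)
raw = [0.51, 0.49, 0.50, 0.52, 0.48, 1.73, 0.50, 0.51]
send_to_cloud(process_at_edge(raw))
```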

To what extent will AI turbocharge Edge computing and assist manufacturers in developing competitiveness from real-time analytics?

AI is certainly one of the drivers for the adoption of Edge computing solutions. In fact, IDC forecasts that spending on AI and machine learning will grow from $12bn in 2017 to $57.6bn by 2021, and a good portion of this growth will be associated with Edge computing solutions.

I don’t think it’s a revolution. I think it’s an evolutionary step for our whole manufacturing process, and supply chain process. It’s a matter of pulling all those things together and making decisions based on data versus experience.

Importantly, AI and analytics at the Edge are not just for larger organisations. They can be employed by smaller firms regardless of whether they use, or plan to use, the Cloud or big data.

To effectively utilise the power of AI at this level – and indeed most IIoT technologies – the starting point must be to bring infrastructure up to speed. This typically involves upgrading networking so that information can flow easily to the systems processing it at the Edge. Then organisations can deploy sensors to collect data and the analytics to make sense of it.

From there, data scientists and, increasingly, software solutions can be used to implement optimisations that help drive productivity improvement. As the direct benefits of AI at the Edge and in the Cloud become better understood – and as more case studies demonstrate those benefits – more organisations will deploy Edge and Cloud computing to fully leverage the gains available.

Final point about skills. You mentioned the need for a new cohort of data scientists to come in and manage this data processing – and they’re just not there. Certainly in this country we’re just not turning out enough of them. Might the 4.0 revolution be a victim of its own success, in that there just aren’t enough people to manage what it produces?

No, I think we’ll work through this. I also think this is being seen now as the hot spot, the spot to be in. There’s a lot of start-up companies out there looking for talent. So, once that cycle starts, it builds its own sense of momentum.