Manufacturers agree that data collection from sub-components will drive incremental performance improvement.
A gathering of senior manufacturing executives – including from JLR, JCB, Nissan, Unipart, Yamazaki Mazak, and Coty – has highlighted the role that the Industrial Internet is set to play in eking out incremental performance improvement in their operations.
Executives – sharing under the Chatham House Rule – noted that manufacturing’s fundamentals were broadly unchanged over the past couple of decades, and that more attention needed to be focused on manufacturing process innovation, not just product innovation. Data collection from manufacturing equipment, down to the level of sub-components, created both opportunity and challenge.
The opportunity centred on the ability to build more accurate physics models and drive manufacturing optimisation through incremental improvements. There was agreement that a global best-practice performance benchmark of 85% still left a lot of room for value creation – as fractional improvements delivered compounded margin gains.
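To make the compounding point concrete, the sketch below uses purely hypothetical figures (not numbers from the discussion) to show how a small improvement, captured repeatedly, accumulates:

```python
# Illustrative only: how small, repeated process improvements compound
# over successive improvement cycles. The figures are hypothetical.

def compounded_gain(per_cycle_improvement: float, cycles: int) -> float:
    """Cumulative gain from the same fractional improvement applied each cycle."""
    return (1 + per_cycle_improvement) ** cycles - 1

# e.g. a 0.5% improvement captured every month for a year
print(f"{compounded_gain(0.005, 12):.1%}")  # roughly 6.2%
```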
The challenge was identified as being whether data collected from connected manufacturing equipment was fit for purpose. Manufacturing creates – and then discards – more data than any other industry. A company that does not value its data will not be able to take advantage of the step change in performance promised by 4IR (the Fourth Industrial Revolution).
Big data vs better data?
There were differing opinions on whether the right approach was to focus on managing less data better, or whether the advent of lower-cost computing resources would enable ever more efficient analysis of companies’ data lakes. Likewise, there was disagreement over whether speed of analysis should be traded off against the comprehensiveness of the data set being analysed. It was reckoned that for every 10% increase in the volume of data being analysed, results improved by 3%.
What was accepted around the table was that the Industrial Internet of Things (IIoT) had created a unique opportunity to ‘embed intelligence at the edge’ – in other words, first using cheap sensors to collect data, and then using distributed computing power to prioritise which data was passed on for higher-level analysis.
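A minimal sketch of that idea, assuming an illustrative sensor feed and threshold (not a real device API): routine readings stay local, and only those that deviate from the expected baseline are escalated for higher-level analysis.

```python
# Hypothetical edge-filtering sketch: sensor names, baseline and tolerance
# are illustrative assumptions, not a specific vendor's interface.

from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float

def should_forward(reading: Reading, baseline: float, tolerance: float) -> bool:
    """Forward only readings that deviate meaningfully from the expected baseline."""
    return abs(reading.value - baseline) > tolerance

# Edge loop: keep routine data local, pass exceptions upstream
readings = [Reading("spindle_temp", v) for v in (61.8, 62.1, 71.4, 62.0)]
forwarded = [r for r in readings if should_forward(r, baseline=62.0, tolerance=5.0)]
print(forwarded)  # only the 71.4 reading is escalated
```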
The language of credit card transaction monitoring was used by one manufacturer to highlight the fresh mindset required: pool data, look for anomalous transactions to understand what is happening in operations in real time, and then build probabilistic models for what could happen in future.
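One simple way to picture that mindset, as a sketch rather than any specific monitoring system, is a pooled set of readings scored against a basic statistical model of ‘normal’, with outliers flagged for investigation – the z-score model and cut-off below are illustrative assumptions only.

```python
# Hypothetical anomaly-flagging sketch in the 'credit card monitoring' spirit:
# pool values, model normal behaviour with mean/standard deviation, flag outliers.

from statistics import mean, stdev

def anomalies(values: list[float], cutoff: float = 2.0) -> list[float]:
    """Flag values more than `cutoff` standard deviations from the pooled mean."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if sigma and abs(v - mu) / sigma > cutoff]

cycle_times = [41.9, 42.1, 42.0, 41.8, 42.2, 55.7, 42.0, 41.9]  # illustrative data
print(anomalies(cycle_times))  # [55.7] would warrant investigation
```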
In summary – while there was apprehension about how manufacturers should best manage rising data volumes, there was still an appetite for achieving a more granular view of operations. Whether this came from a more collaborative approach to data-sharing across the supply chain, or a deliberate search for ‘unknown unknowns’ – the desire for faster, better decision cycles was the underlying theme of the evening.