How are manufacturers using Big Data to drive decision-making? A dozen manufacturing directors shared their perspectives at a dinner organised by The Manufacturer alongside analytics firm FICO.
Of all industries, manufacturing is reckoned to generate the most data – and waste the most data too.
This was well reflected in a wide-ranging discussion among senior automotive manufacturers in Birmingham who, while mostly speaking under the Chatham House Rule (comments not attributed to individuals), were forthright about the challenges of creating a data-driven culture for decision-making, and the necessity of doing so.
Executives from FICO, a leading analytics software company with six decades of expertise in decision management, were on hand to talk through some of the technology implications of embracing a more data-driven approach to manufacturing operations.
“There seems to be a perfect storm hitting manufacturing right now,” noted Richard Muratore, Manufacturing Practice Lead at FICO. “Political uncertainty, favourable currency conditions and disruptive technologies are all coming together – and I’m excited to help you get value from these circumstances. Where you start will really affect how you move forward.”
Muratore then introduced the evening’s guest speaker from Jaguar Land Rover, a FICO customer, who shared some of the learning points from their own data-driven journey.
Peter Domeney, Manufacturing Engineering Director at Jaguar Land Rover, said that while he fully understood the importance of good data and good information, he felt the proliferation of terms was confusing and obscured the real opportunities for manufacturers.
“Industrial Internet, Industry 4.0, Big Data, Fourth Industrial Revolution – there are more technology tags and straplines than ever before, but I’m sometimes not sure what the difference between them is,” admitted Domeney.
He went on to share an anecdote that encapsulated the enormous technology shifts taking place, and how these shifts challenged executives to think differently.
Executive Summary
- Business not as Usual: Why do you have a record collection? Embracing new business models.
- You Cannot Manage what You Cannot See
- Exploiting the Unknown: Probability & Performance Management
- Collecting and Analysing Data from the Edge
- Marginal Gains Drive Value Creation
- Thinking Outside The ‘That’s-How-We-Do-It-Round-Here’ Box
- Which Report Would You Prefer: End of Shift vs Week Ahead
- Levelling Up: Gamification & Incentives
- Fitbit Your Team: Transparency & Performance Management
- Big Data: can you have too much of a good thing?
- Automation vs People – Do we need to take people out to be competitive?
- Risky Business: If you never fail, are you trying hard enough?
“I was busy converting my LP collection to audio files on my computer and my son asked me, ‘Why have you got a record collection?’ His point was – why was I limiting myself to my collection of records, when he was using Spotify to access millions of songs? Why did I feel the need to own the music when I only wanted to listen to it?
“I was using modern technology to convert my LP collection, but I was doing so with a traditional mindset. I was only using 1% of the capacity of technology.”
Domeney then switched focus to an initiative within his team at JLR – where a graduate engineer (‘who knew nothing about making engines!’) proposed that they launch a project to ‘exploit the unknown’.
“He came to me and said, ‘I’ve got a really good idea: let’s connect all of our databases and find problems that we don’t know exist.’ He started linking together seven of JLR’s databases, and we found hundreds of ‘unknowns’ – that is, data correlations previously hidden to the company,” explained Domeney.
FICO solutions were at the heart of the JLR team’s search for ‘unknown unknowns’ – helping to uncover hidden correlations and causations within the business, and pointing the way to significant financial savings.
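To make the idea concrete, the sketch below shows the kind of brute-force correlation scan such a project might run once previously separate databases have been joined. The table names, columns and join keys are invented for illustration; the article does not describe the actual tooling used at JLR or by FICO.

```python
# Illustrative brute-force scan for unexpected correlations across previously
# separate datasets. Table names, columns and join keys are hypothetical.
import pandas as pd

# Two hypothetical exports: machine telemetry and quality records
telemetry = pd.read_csv("telemetry.csv")  # machine_id, shift, energy_kwh, cycle_time_s, ...
quality = pd.read_csv("quality.csv")      # machine_id, shift, scrap_rate, rework_hours, ...

# 'Connect the databases': link the datasets on their shared keys
merged = telemetry.merge(quality, on=["machine_id", "shift"])

# Compute every pairwise correlation between numeric columns
corr = merged.select_dtypes("number").corr()

# Flatten to (column_a, column_b, r) pairs and drop self/duplicate pairs
pairs = corr.stack().rename("r").reset_index()
pairs = pairs[pairs["level_0"] < pairs["level_1"]]

# The strongest relationships nobody asked about are candidate 'unknowns'
print(pairs.sort_values("r", key=abs, ascending=False).head(20))
```

In practice the strongest pairs would then be triaged by engineers, since many will turn out to be trivial or spurious.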
One such correlation helped identify a faulty accumulator in a set of 96 machines making cylinder blocks. As a group their performance was in the normal range, but closer analysis of energy consumption versus productivity suggested there were outliers, ‘unusual transactions’, and the team discovered that an accumulator was working inefficiently, drawing excess energy.
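The accumulator example boils down to normalising energy use by output and looking for machines that sit far outside the fleet’s normal range. The sketch below, with invented data and a simple three-sigma rule, illustrates the principle; it is not the model FICO actually deployed.

```python
# Sketch of the energy-vs-productivity check: normalise energy by output and
# flag machines far outside the fleet's normal range. The data and the
# 3-sigma rule are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
parts_made = rng.normal(500, 20, size=96)               # parts per machine per shift
energy_kwh = parts_made * 1.2 + rng.normal(0, 10, 96)   # energy roughly tracks output
energy_kwh[42] += 80                                     # one machine quietly drawing excess energy

# Energy per part normalises out ordinary differences in throughput
kwh_per_part = energy_kwh / parts_made

# Flag machines whose consumption sits well outside the group's normal range
z = (kwh_per_part - kwh_per_part.mean()) / kwh_per_part.std()
print("Machines to inspect:", np.where(np.abs(z) > 3)[0])   # points at the faulty unit
```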
Another correlation revealed that JLR had more staff arriving late to work when Wolverhampton Wanderers, the local football team, was playing: “It’s not rocket science, but the marginal gains we get from texting people to warn them of congestion on the roads because there is a football match are critical.”
One example where correlation did not equal causation was the link between plant efficiency and the canteen menu: workers underperformed when Cumberland sausages were served, yet overperformed when there was fish and chips. Was it the food being served – or when it was being served, i.e. the ‘Friday Factor’?
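One way to probe a correlation like this is to hold the suspected confounder, the day of the week, constant before comparing menus. The sketch below uses entirely invented shift data to show why the question could not be settled from the canteen data alone: menu and day are perfectly confounded.

```python
# Illustrative check of correlation vs causation: does the menu matter once
# the day of the week is held constant? The data here is entirely invented.
import pandas as pd

shifts = pd.DataFrame({
    "day":    ["Mon", "Tue", "Wed", "Thu", "Fri"] * 4,
    "menu":   (["sausages"] * 4 + ["fish and chips"]) * 4,   # fish and chips only on Fridays
    "output": [96, 97, 95, 96, 103] * 4,                     # Fridays run hotter regardless
})

# Naive view: the menu looks predictive of output
print(shifts.groupby("menu")["output"].mean())

# Conditioned view: compare menus within the same day. Here that is impossible,
# because menu and day are perfectly confounded - the 'Friday Factor'.
print(shifts.groupby(["day", "menu"])["output"].mean())
```

Separating the two effects would require variation the data did not contain, for example serving fish and chips on a Monday.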
In the above instances the answers were readily available, but only once the right questions were asked. Domeney believed the next step to enable better interrogation was to deploy connectivity and sensors ‘to the edge’ – at the level of sub-components. As this would generate far too much data to send to the cloud, he suggested reviewing data only when there was an anomaly – something which, he admitted, required a different approach to designing machines.
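A minimal sketch of ‘review data only when there is an anomaly’ might look like the following: an edge device keeps a rolling window of recent readings and forwards a reading upstream only when it falls outside the expected band. The window size, threshold and function names are illustrative assumptions, not a description of JLR’s machines.

```python
# Minimal sketch of anomaly-only reporting at the edge: buffer normal readings
# locally and forward a reading upstream only when it falls outside the
# expected band. Window size and threshold are illustrative.
from collections import deque
from statistics import mean, stdev

WINDOW = 200          # recent readings kept on the edge device
THRESHOLD = 4.0       # how many standard deviations counts as an anomaly

history = deque(maxlen=WINDOW)

def on_sensor_reading(value, send_upstream):
    """Keep normal readings on the device; forward only anomalies."""
    if len(history) >= 30:                       # wait for a baseline to form
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(value - mu) > THRESHOLD * sigma:
            send_upstream({"value": value, "baseline": mu, "sigma": sigma})
    history.append(value)

# Example: feed simulated readings; only the spike at the end is transmitted
for v in [1.01, 0.99, 1.02, 1.00] * 10 + [1.75]:
    on_sensor_reading(v, send_upstream=print)
```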
“Ultimately businesses fail because their implementation is wrong, not because of strategy,” he concluded. “If an organisation does not value data, its data analysis will always fail. The language of ‘end of shift reports’ points to how backwards-looking we are when it comes to our data analysis. We need to be more agile, asking ourselves, ‘What is our predicted performance tomorrow?’”
FICO’s Muratore acknowledged that embarking on a radical transformation project can be daunting, even for a company as established as JLR: to create the future, you have to do things that haven’t been done before. He added that it was encouraging to hear from the JLR team that FICO is regarded as a trusted partner with the ability, technology, analytic expertise and flexibility to think differently about the problems that need to be solved.
Senior automotive manufacturing executives – including representatives from Nissan, JCB, Unipart and Yamazaki Mazak, in addition to JLR’s Domeney – went on to highlight the role that the Industrial Internet could play in eking out incremental performance improvement in their operations.
The 16 executives present agreed that manufacturing’s fundamentals were broadly unchanged over the past couple of decades, and that more attention needed to be focused on manufacturing process innovation, and not just product innovation. However, data collection from manufacturing equipment, down to the level of sub-components, created both opportunity and challenge.
The opportunity centred on the ability to build more accurate physics models and drive manufacturing optimisation through incremental improvements. There was agreement that a global best-practice performance benchmark of 85% efficiency still left a lot of room for value creation, as fractional improvements delivered compounded margin gains.
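As a back-of-envelope illustration of that compounding (with invented numbers), a 0.5% relative output gain each month adds up to roughly 6% over a year, equivalent to lifting an 85% efficiency baseline above 90%:

```python
# Back-of-envelope illustration (numbers invented) of compounded marginal gains:
# repeated small improvements in effective output from an 85% efficiency baseline.
baseline_efficiency = 0.85
monthly_improvement = 0.005          # a 0.5% relative gain in output each month

output = 1.0                         # normalised output at the baseline
for month in range(12):
    output *= 1 + monthly_improvement

print(f"Output after a year: {output:.3f}x the baseline")            # ~1.062x
print(f"Equivalent efficiency: {baseline_efficiency * output:.1%}")  # ~90.2%
```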
The challenge was whether data collected from connected manufacturing equipment was fit for purpose. As an industry, manufacturing creates – and then discards – more data than any other. If a company does not value its data, it will not be able to take advantage of the step change in performance promised by 4IR (the fourth industrial revolution).
Big Data vs Better Data?
There were differing opinions on whether the right approach was to focus on managing less data better, or whether the advent of lower-cost computing resources would enable ever more efficient analysis of companies’ data lakes.
Likewise, there was disagreement over whether speed of analysis should be traded for comprehensiveness of the data set being analysed. It was reckoned that for every 10% increase in the volume of data being analysed, results improved by 3%.
What was accepted around the table was that the Internet of Things (IoT) had created a unique opportunity to ‘embed intelligence at the edge’ – in other words, to use cheap sensors to collect data, and then use distributed computing power to prioritise which data was passed on for higher-level analysis.
One manufacturer borrowed the language of credit card transaction monitoring to highlight the fresh mindset required: pool data and look for anomalous transactions in order to understand what is happening in operations in real time, then build probabilistic models for what could happen in future.
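In the spirit of the card-fraud analogy, a probabilistic score can be attached to every new reading by fitting a model to historical behaviour and flagging low-likelihood events. The sketch below uses a simple Gaussian model and an arbitrary alert threshold as illustrative assumptions; it is not the approach any attendee described in detail.

```python
# Transaction-style scoring for operations data: fit a probability model to
# historical behaviour, score each new reading, and alert on rare events.
# The Gaussian model and the alert threshold are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
historical = rng.normal(72.0, 1.5, size=5_000)   # e.g. a year of spindle temperatures

mu, sigma = historical.mean(), historical.std()

def score(reading, alert_probability=1e-4):
    """Two-sided tail probability of a reading under the fitted model."""
    p = 2 * stats.norm.sf(abs(reading - mu), scale=sigma)
    return p, p < alert_probability

for reading in [72.4, 74.8, 81.3]:
    p, alert = score(reading)
    print(f"reading={reading:5.1f}  p={p:.2e}  alert={alert}")
```

The same fitted baseline also supports the forward-looking question Domeney raised: given normal behaviour to date, what range of performance should tomorrow fall within?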
In summary, while there was apprehension about how manufacturers should best manage rising data volumes, there was a clear appetite for a more granular view of operations. Whether that came from a more collaborative approach to data-sharing across the supply chain or from a deliberate search for ‘unknown unknowns’, the desire for faster, better decision cycles was the underlying theme of the evening.