by Dan Somers
Manufacturing began with the Industrial Revolution, i.e. “1.0,” characterised by a focus on the plant and the product; quality and after-service were not priorities. In the mid-20th century, Manufacturing 2.0 was driven by improving processes through tools such as “Lean,” Total Quality Management, and Six Sigma.
Manufacturing 3.0 describes the convergence of data right across the business and its application to all aspects of the modern manufacturing enterprise: design, innovation, improving quality, and reducing cost and environmental impact across the supply chain, internally and in after-service, i.e. the warranty, maintenance and through-life support of the product. No longer does data sit in a silo for a specific task.
Manufacturing stores more data than any other sector – close to 2 exabytes of new data were stored in 2010 (McKinsey 2011). Gartner backs this up, reporting that more than 23 percent of manufacturing firms have already invested in technology for big data and a further 27 percent plan to invest during the next two years.
Big Data in the world of manufacturing is drawn from a multitude of sources: from instrumented production machinery (process control), to supply chain management systems, to systems that monitor the performance of products that have already been sold.
McKinsey suggests that through the use of Big Data, manufacturers could reduce product development and assembly costs by up to 50 percent and working capital by up to 7 percent.
Manufacturers are beginning to combine data from different IT systems such as computer-aided design, computer-aided engineering, computer-aided manufacturing, collaborative product development management, and digital manufacturing. Some of the most powerful impacts of big data will come from the sharing of data across entire manufacturing lifecycles and supply chains.
Is this just a Utopia? No. Although levels of maturity vary between sectors, and even between companies in the same sector, there are indeed pioneering companies and applications of big data already in existence in manufacturing industries.
Toyota, Fiat, and Nissan have all cut new-model development time by 30 to 50 percent through the collaborative use of data and modelling techniques; Toyota claims to have eliminated 80 percent of defects prior to building the first physical prototype.1
But why do we need Manufacturing 4.0? Even leading companies with ostensibly optimised individual systems, using the most sophisticated quality techniques, may still be losing up to 30 percent of their product sales value to the Cost of Poor Quality (“COPQ”), a figure substantially larger than their profits. We have reached a glass ceiling when it comes to optimising individual processes; now we must harness the power within big data to optimise holistically across the silos and take us to Manufacturing 4.0. This may be easier said than done, but it is a new corporate goal and vision.
Imagine getting it right the first time, every time: significantly improved quality, reduced warranty costs, a reduced environmental footprint, and no product recalls that could damage a brand or, worse, risk customer safety. These are all things we have been striving for through the introduction of Lean Manufacturing and Six Sigma processes. Now technology is arriving that will truly deliver zero tolerance for failures.
So what is holding us back? Well, the top five recorded big data challenges for manufacturers are: the level of trust between a data manager and a production manager, determining which data to use for which business decision, handling the volume and complexity of the data available, getting different departments and functions to share data, and finding an optimal way to organise big data activities.2
I’m involved with an exciting new piece of software called SigmaGuardian by Warwick Analytics – a perfect example of a technology that is enabling a data-driven culture of manufacturing. The software automatically locates the root causes of faults, and recommends the most beneficial actions without the need for hypotheses. It can work with incomplete and inaccurate data, and even resolve No Fault Found problems.
The algorithms within the software were developed following a decade of academic research in the UK. They automatically analyse any and all data available from process and testing equipment, MES, ERP and any other siloed systems, then identify the root cause of the faults and provide recommendations based on the most economical fix. Because the approach is based on information theory and is “non-statistical,” it can do this without any hypotheses, even when data is missing or dirty, and it has been proven to work on No Fault Found issues. It complements and integrates with all major vendors of process control equipment and can run quietly in the background, checking and recommending root-cause fixes along the way.
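To make the idea of hypothesis-free, information-theoretic root-cause analysis concrete, here is a minimal toy sketch. It is not Warwick Analytics' actual algorithm; it simply ranks candidate process parameters by the mutual information each shares with the pass/fail outcome, tolerating missing values by skipping them. All data and field names below are hypothetical.

```python
# Toy sketch (NOT the vendor's algorithm): rank candidate root causes of
# test failures by mutual information between each process parameter and
# the pass/fail outcome. Records with a missing (None) value for a
# parameter are skipped for that parameter only.
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """I(X;Y) in bits over paired observations, ignoring missing xs."""
    pairs = [(x, y) for x, y in zip(xs, ys) if x is not None]
    n = len(pairs)
    if n == 0:
        return 0.0
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def rank_root_causes(records, outcome_key="fail"):
    """Rank parameters by how much information they carry about the outcome."""
    ys = [r[outcome_key] for r in records]
    params = {k for r in records for k in r} - {outcome_key}
    scores = {p: mutual_information([r.get(p) for r in records], ys)
              for p in params}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical test records: the "oven" setting correlates with failure.
records = [
    {"oven": "A", "line": 1, "fail": True},
    {"oven": "A", "line": 2, "fail": True},
    {"oven": "B", "line": 1, "fail": False},
    {"oven": "B", "line": 2, "fail": False},
    {"oven": "A", "line": 1, "fail": True},
    {"oven": "B", "line": 2, "fail": False},
]
ranking = rank_root_causes(records)
print(ranking[0][0])  # → oven (it carries the most information about failure)
```

No hypothesis about which parameter matters is supplied up front; the ranking falls out of the data itself, which is the essence of the approach described above.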
The technology has already been used to solve issues at global firms including Motorola (the birthplace of Six Sigma) and was recently named a DEMO God at DEMO Fall 2013. These are exciting times for both Warwick Analytics and manufacturing.
Plant managers and engineers should keep a clear lookout for technological advancements that could benefit their company and its customers. By adopting the right applications at the right time, firms will gain a significant competitive advantage and truly become Factories of the Future.
1 Big data: The next frontier for innovation, competition, and productivity, The McKinsey Global Institute, 2011
2 Manufacturing: Big Data Benefits and Challenges, Tata Consultancy Services, 2013