How big data can improve manufacturing

In the past 20 years or so, manufacturers have been able to reduce waste and variability in their production processes and dramatically improve product quality and yield (the amount of output per unit of input) by implementing lean and Six Sigma programs. However, in certain processing environments—pharmaceuticals, chemicals, and mining, for instance—extreme swings in variability are a fact of life, sometimes even after lean techniques have been applied. Given the sheer number and complexity of production activities that influence yield in these and other industries, manufacturers need a more granular approach to diagnosing and correcting process flaws. Advanced analytics provides just such an approach.

Advanced analytics refers to the application of statistics and other mathematical tools to business data in order to assess and improve practices (exhibit). In manufacturing, operations managers can use advanced analytics to take a deep dive into historical process data, identify patterns and relationships among discrete process steps and inputs, and then optimize the factors that prove to have the greatest effect on yield. Many global manufacturers in a range of industries and geographies now have an abundance of real-time shop-floor data and the capability to conduct such sophisticated statistical assessments. They are taking previously isolated data sets, aggregating them, and analyzing them to reveal important insights.
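To make this aggregate-then-analyze pattern concrete, here is a minimal sketch in Python, assuming hypothetical CSV exports from three separate shop-floor systems joined on a shared batch identifier; the file names, column names, and the batch_id key are illustrative, not drawn from any of the companies discussed here.

```python
# A minimal sketch of aggregating previously isolated data sets and scanning
# them for relationships with yield. All file and column names are hypothetical.
import pandas as pd

# Previously isolated data sets, one per shop-floor system
process = pd.read_csv("process_steps.csv")   # e.g., temperatures, pressures per batch
materials = pd.read_csv("materials.csv")     # e.g., supplier, lot age per batch
quality = pd.read_csv("quality.csv")         # measured yield per batch

# Aggregate into a single table keyed on the production batch
data = process.merge(materials, on="batch_id").merge(quality, on="batch_id")

# First-pass analysis: rank process inputs by correlation with yield
correlations = (
    data.drop(columns=["batch_id"])
        .corr(numeric_only=True)["yield"]
        .drop("yield")
        .abs()
        .sort_values(ascending=False)
)
print(correlations.head(10))  # candidate yield drivers to investigate further
```

A correlation scan like this is only a starting point; the case studies below rely on richer models to untangle interdependencies among parameters.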


Consider the production of biopharmaceuticals, a category of healthcare products that includes vaccines, hormones, and blood components. They are manufactured using live, genetically engineered cells, and production teams must often monitor more than 200 variables within the production flow to ensure the purity of the ingredients as well as the substances being made. Two batches of a particular substance, produced using an identical process, can still exhibit a variation in yield of between 50 and 100 percent. This huge unexplained variability can create issues with capacity and product quality and can draw increased regulatory scrutiny.

One top-five biopharmaceuticals maker used advanced analytics to significantly increase its yield in vaccine production while incurring no additional capital expenditures. The company segmented its entire process into clusters of closely related production activities; for each cluster, it gathered far-flung data about process steps and the materials used into a central database.

A project team then applied various forms of statistical analysis to the data to determine interdependencies among the different process parameters (upstream and downstream) and their impact on yield. Nine parameters proved to be most influential, especially time to inoculate cells and conductivity measures associated with one of the chromatography steps. The manufacturer made targeted process changes to account for these nine parameters and was able to increase its vaccine yield by more than 50 percent—worth between $5 million and $10 million in yearly savings for a single substance, one of hundreds it produces.
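The article does not name the exact statistical techniques the project team applied; the sketch below shows one common way to rank process parameters by their influence on yield, using a random-forest importance ranking over a hypothetical batch-history table. The data file, columns, and the assumption that all parameters are numeric are illustrative.

```python
# A hedged illustration of ranking process parameters by influence on yield.
# The data set and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

data = pd.read_csv("batch_history.csv")   # hypothetical: one row per batch
X = data.drop(columns=["yield"])          # upstream and downstream parameters (numeric)
y = data["yield"]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(X_train, y_train)
print(f"held-out R^2: {model.score(X_test, y_test):.2f}")

# Rank parameters by importance; in the case described above, nine parameters
# (inoculation time, a chromatography conductivity measure, and so on) stood out
importances = pd.Series(model.feature_importances_, index=X.columns)
print(importances.sort_values(ascending=False).head(9))
```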

Developing unexpected insights

Even within manufacturing operations that are considered best in class, the use of advanced analytics may reveal further opportunities to increase yield. This was the case at one established European maker of functional and specialty chemicals for a number of industries, including paper, detergents, and metalworking. It boasted a strong history of process improvements since the 1960s, and its average yield was consistently higher than industry benchmarks. In fact, staffers were skeptical that there was much room for improvement. “This is the plant that everybody uses as a reference,” one engineer pointed out.

However, several unexpected insights emerged when the company used neural-network techniques (a form of advanced analytics based on the way the human brain processes information) to measure and compare the relative impact of different production inputs on yield. Among the factors it examined were coolant pressures, temperatures, quantity, and carbon dioxide flow. The analysis revealed a number of previously unseen sensitivities—for instance, levels of variability in carbon dioxide flow prompted significant reductions in yield. By resetting its parameters accordingly, the chemical company was able to reduce its waste of raw materials by 20 percent and its energy costs by around 15 percent, thereby improving overall yield. It is now implementing advanced process controls to complement its basic systems and steer production automatically.
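As a hedged illustration of this kind of analysis, the sketch below trains a small feed-forward neural network on hypothetical plant data and uses permutation importance to compare the relative impact of the inputs mentioned above. The data set, column names, and network architecture are assumptions; the article does not describe the company's actual model.

```python
# A minimal sketch: fit a small neural network to predict yield from
# production inputs, then compare the inputs' relative impact.
# Data and column names are hypothetical.
import pandas as pd
from sklearn.inspection import permutation_importance
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

data = pd.read_csv("chemical_plant.csv")
inputs = ["coolant_pressure", "coolant_temperature", "coolant_quantity", "co2_flow"]
X, y = data[inputs], data["yield"]

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
).fit(X, y)

# Permutation importance: how much does prediction quality degrade when each
# input is shuffled? Large drops flag sensitive inputs such as CO2 flow.
result = permutation_importance(model, X, y, n_repeats=20, random_state=0)
for name, score in sorted(zip(inputs, result.importances_mean), key=lambda p: -p[1]):
    print(f"{name}: {score:.3f}")
```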

Meanwhile, a precious-metals mine was able to increase its yield and profitability by rigorously assessing production data that were less than complete. The mine was going through a period in which the grade of its ore was declining; one of the only ways it could maintain production levels was to try to speed up or otherwise optimize its extraction and refining processes. The recovery of precious metals from ore is incredibly complex, typically involving between 10 and 15 variables and more than 15 pieces of machinery; extraction treatments may include cyanidation, oxidation, grinding, and leaching.

The production and process data that the operations team at the mine was working with were extremely fragmented, so the first step for the analytics team was to clean them up, using mathematical approaches to reconcile inconsistencies and account for information gaps. The team then examined the data on a number of process parameters—reagents, flow rates, density, and so on—before recognizing that variability in levels of dissolved oxygen (a key parameter in the leaching process) seemed to have the biggest impact on yield. Specifically, the team spotted fluctuations in oxygen concentration, which indicated that there were challenges in process control. The analysis also showed that the best demonstrated performance at the mine occurred on days in which oxygen levels were highest.
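A minimal sketch of that clean-up step might look like the following, assuming a hypothetical time-stamped export from the leach circuit. The reconciliation rules shown (dropping physically impossible readings, interpolating short gaps on a regular time grid) are illustrative stand-ins for whatever mathematical approaches the team actually used.

```python
# A hedged sketch of cleaning fragmented process data before analysis.
# File, columns, and cleaning rules are illustrative assumptions.
import pandas as pd

raw = pd.read_csv("leach_circuit.csv", parse_dates=["timestamp"])
raw = raw.set_index("timestamp").sort_index()

# Reconcile inconsistencies: discard physically impossible sensor readings
raw.loc[raw["dissolved_oxygen"] < 0, "dissolved_oxygen"] = float("nan")

# Account for information gaps: resample onto a regular hourly grid and
# interpolate short gaps in slow-moving process variables (cap fill at 6 hours)
clean = raw.resample("1h").mean().interpolate(method="time", limit=6)
print(clean[["dissolved_oxygen", "reagent_flow", "slurry_density"]].describe())
```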

As a result of these findings, the mine made minor changes to its leach-recovery processes and increased its average yield by 3.7 percent within three months—a significant gain in a period during which ore grade had declined by some 20 percent. The increase in yield translated into a sustainable $10 million to $20 million annual profit impact for the mine, without it having to make additional capital investments or implement major change initiatives.

Capitalizing on big data

The critical first step for manufacturers that want to use advanced analytics to improve yield is to consider how much data the company has at its disposal. Most companies collect vast troves of process data but typically use them only for tracking purposes, not as a basis for improving operations. For these players, the challenge is to invest in the systems and skill sets that will allow them to optimize their use of existing process information—for instance, centralizing or indexing data from multiple sources so they can be analyzed more easily and hiring data analysts who are trained in spotting patterns and drawing actionable insights from information.
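As one illustration of what centralizing and indexing data from multiple sources can mean in practice, the sketch below appends hypothetical sensor exports into a single SQLite store and indexes the column analysts filter on most often. The file names, shared schema, and store are all assumptions, not a prescription.

```python
# A minimal sketch of centralizing process data so it can be analyzed easily.
# Sources are assumed to share a common schema: timestamp, tag, value.
import sqlite3

import pandas as pd

store = sqlite3.connect("process_history.db")  # the central database

for source in ["line1_sensors.csv", "line2_sensors.csv", "lab_results.csv"]:
    frame = pd.read_csv(source, parse_dates=["timestamp"])
    frame.to_sql("measurements", store, if_exists="append", index=False)

# Index the column analysts query most so lookups stay fast as data grows
store.execute("CREATE INDEX IF NOT EXISTS idx_time ON measurements(timestamp)")
store.commit()

recent = pd.read_sql(
    "SELECT * FROM measurements WHERE timestamp >= '2014-01-01'",
    store, parse_dates=["timestamp"],
)
```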

Some companies, particularly those with months- and sometimes years-long production cycles, have too little data to yield statistically meaningful results when put under an analyst’s lens. The challenge for senior leaders at these companies will be taking a long-term focus and investing in systems and practices to collect more data. They can invest incrementally—for instance, gathering information about one particularly important or particularly complex process step within the larger chain of activities, and then applying sophisticated analysis to that part of the process.

The big data era has only just emerged, but the practice of advanced analytics is grounded in years of mathematical research and scientific application. It can be a critical tool for realizing improvements in yield, particularly in any manufacturing environment in which process complexity, process variability, and capacity constraints are present. Indeed, companies that successfully build up their capabilities in conducting quantitative assessments can set themselves far apart from competitors.

About the authors

Eric Auschitzky is a consultant in McKinsey’s Lyon office, Markus Hammer is a senior expert in the Lisbon office, and Agesan Rajagopaul is an associate principal in the Johannesburg office.

The authors would like to thank Stewart Goodman, Jean-Baptiste Pelletier, Paul Rutten, Alberto Santagostino, Christoph Schmitz, and Ken Somers for their contributions to this article.
