
Humans, Technology, and Supply Chains

Jui Han Lin 2021


Introduction

In my previous work, I have witnessed how supply chain management, integrated with technology, can empower smarter decision-making for greater resilience and labor well-being. Today, businesses seek more integrated supply chains that span diverse teams and people. Technologies such as big data analysis, AI, IoT, and 3D printing have the potential to connect people, illuminate problems, and even suggest solutions. This report explores how we can use these technologies to transform, improve, and strategically steer businesses toward success.


Table of Contents

Leaning on Data
1. Parametric Optimization for Screen Print Stencil Production
2. Raw Material Input Suggestion for the VOD Process in Stainless Steelmaking

Rethinking Processes
3. Material Preparation Transparency
4. Material QR Code Pre-Inspection

Forecasting the Future
5. SCM as a Revenue-Gaining Competitive Advantage
6. The Factory Next Door

Leaning on Data

In this series, I explore topics outside of work in which I apply my supply chain knowledge and analytical skills to solve problems in new fields, and occasionally discover some managerial wisdom that can be leveraged across domains.


1. Parametric Optimization for Screen Print Stencil Production

Photo from Printful


As part of an AI case competition I am currently participating in, my team was challenged to find the most appropriate production parameters for a 40-year-old screen print stencil manufacturer in Taiwan. Here is my brainstorming process, along with our proposed solution:

1. Abstract

The stencil manufacturer (hereafter referred to as “LW”) hoped to research the optimal parameters for producing stencils, with a focus on how each fabric property (including fabric type, weight, composition, etc.) uniquely contributes to the selection. Our team proposed using “Bayesian optimization,” a mathematical method proven effective across a wide range of industries, to identify the optimal parameter set in just a few experiments.

2. Introduction

Stencil making is a key process within the screen printing supply chain and a determining factor in final print quality. The craft originated in China during the Song dynasty (960–1279 AD) and has matured over time, developing into an industry where tight-knit supplier-customer relationships abound, rooted in tacit compliance with one another’s customs and restrictions. As a result, today’s stencil manufacturers like LW rely predominantly on experienced workers, who are familiar with the industry’s “best practices,” to make crucial production decisions, even though their know-how has never been scientifically validated. The fact that so many factors collectively influence product quality (Image 1) shows how burdensome it is for humans alone to manage every detail with perfect accuracy. Hence, many opportunities for improvement remain unrecognized or untackled.


Image 1 [1]: Factors that influence screen printing quality

The question LW hopes to answer is how different fabric properties can inform the selection of the most suitable mesh count for the screen. The problem reflects the pain points of the conventional trial-and-error approach to testing parameters, especially in the face of never-before-seen fabric types: not only does it demand back-and-forth negotiation with up- and downstream partners, but a wrong parameter choice can trigger a costly rework. A rework costs roughly $375 ($75 per screen × 5 colors, equivalent to about one-third of the average monthly salary in Taiwan) plus 5 extra workdays.

3. Tailoring to Customer Needs

3-1. Problem Analysis

First, we broke down LW’s problem into two parts: fabric properties (supposed “cause”) and mesh counts (supposed “effect”). In screen printing, fabric properties determine the level of ink absorbency, surface smoothness, and other textile characteristics that lead to distinctive print results; mesh counts affect the amount of ink that can flow through the mesh openings, contributing to the visual effects of the end product, such as the resolution level and evenness of color.


Image 2: Screens with lower mesh counts work well for larger spot color designs and allow larger particle inks to pass through; screens with higher mesh counts are more suitable for images that use finer detail.

Here, we discovered that both the fabric properties and the mesh counts play a role in deciding the end product’s quality. That is, rather than solving the problem with the “cause” and “effect” originally implied, we should set the “effect” to be the quality of the prints and the “causes” to be the two factors. However, we found that LW currently uses a combination of experience and visual inspection for quality checks, rather than an objective set of measurements that we could run an analysis on. That meant we would have to spend time with the company later on to define and implement new standards. But for the moment, before rushing into any solution, we had clarified the problem’s inputs and output.

Next, we mapped the problem back to Image 1 and found that many other factors, besides fabric properties and mesh counts, also influence print quality (Image 3). We suspected that LW left the other factors (which it also controls) out of the inquiry because adjusting them was assumed to be inconsequential to quality. Yet, considering that new standards would be put in place without any historic data measured against them, what would most benefit LW was a thorough review of all potential parameters. Blindly trusting a narrowly framed problem can cut us off from opportunities to surpass the client’s past performance. Only by comprehensively evaluating all related parameters could we avoid the local optimization trap and reach a result LW can depend on for a long time.


Image 3: Fishbone diagram with mesh count and fabric properties (material) highlighted in yellow. Parameters that LW can adjust are highlighted in red. Other evaluation-worthy parameters are pending further confirmation with LW.

After realizing we could better serve our client’s needs by re-framing the problem, I immediately brought the idea to LW’s leadership team and got their approval to carry the plan forward. LW’s representative was impressed with our team’s prudence and expressed a strong willingness to support us. The experience shows how sincere, active stakeholder engagement can go a long way toward cultivating trust and commitment to change.

3-2. Idea Formulation

LW currently has no usable data for analysis, and collecting a large volume of data from square one would be costly and risky. Human judgment already performs at a decent success rate, so the company may have a limited appetite for large investments in data collection to further improve its operations. We therefore faced the challenge of finding an efficient analytical approach.

Past research on parametric optimization for stencil production, including [2–4], mainly used methods such as Taguchi and design of experiments (DOE). Such methodologies require a large number of trials to cover all possible parameter combinations, with each trial fully independent of the others. I learned of an alternative method called Bayesian optimization, which banks insights from previous trials to inform the next one and can therefore find the optimum in very few trials. The method, which has regained recognition in recent years thanks to rising computing power, has proven effective at speeding up the search for global optima across disciplines, including hyperparameter tuning for deep-learning models [5], grinding machines [6], and robotics [7]. I believed Bayesian optimization was a strong fit for LW’s cost-sensitive manufacturing environment. Moreover, since the method had yet to be applied in the field of screen printing, success would give the company a competitive edge over other manufacturers and a brand-new way of thinking about processes. Finally, the method would serve as a good entry point if LW ever decided to introduce more data-demanding models into its operations.

4. Theories

4-1. Bayesian Optimization

Bayesian optimization combines Gaussian process regression and Bayesian inference, using the former to build a surrogate model of the unknown function and the latter to update the model as new observations become available. Calculating the acquisition function of the surrogate model then recommends the parameter set to use in the next experiment. These steps repeat until the convergence criterion is met, at which point we obtain the global optimum of the objective function (Image 4 [8]).


Image 4: Bayesian optimization over three iterations. The plot shows how the objective function in purple (unknown in practice) and the acquisition function in green update as additional experiments are done.

4-2. Bayesian Inference

Bayesian inference, based upon Bayes’ theorem, updates the probability of a hypothesis as more observed data becomes available. If we assume a linear function y = wx + ε, the Bayesian approach specifies a prior distribution p(w) on the parameter w; when new data (X, y) is observed, we can update the prior using Bayes’ rule [9]:

p(w | y, X) = p(y | X, w) p(w) / p(y | X)

where p(y | X, w) is the likelihood of the observed data and p(y | X) is the marginal likelihood.
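To make the update concrete, here is a small numpy sketch (my own illustration, not taken from [9]) of the closed-form posterior for the linear model above, assuming a zero-mean Gaussian prior with known precision alpha and a known noise level sigma:

```python
import numpy as np

rng = np.random.default_rng(0)
true_w, sigma, alpha = 2.5, 0.3, 1.0  # hypothetical truth, noise std, prior precision

def posterior(X, y, alpha, sigma):
    """Closed-form posterior N(m, S) over w, for the prior w ~ N(0, alpha^-1 I)."""
    S_inv = alpha * np.eye(X.shape[1]) + (X.T @ X) / sigma**2
    S = np.linalg.inv(S_inv)
    m = (S @ X.T @ y) / sigma**2
    return m, S

# The posterior mean approaches the true w, and its spread shrinks, as data grows.
for n in (1, 5, 50):
    X = rng.uniform(-1.0, 1.0, size=(n, 1))
    y = true_w * X[:, 0] + rng.normal(0.0, sigma, size=n)
    m, S = posterior(X, y, alpha, sigma)
    print(f"n={n:3d}  posterior mean={m[0]:+.3f}  posterior std={np.sqrt(S[0, 0]):.3f}")
```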

4-3. Gaussian Process Regression (GPR)

A Gaussian process is a collection of random variables, any finite number of which have a joint Gaussian distribution [10]. Any Gaussian process can be uniquely defined by a mean function and a covariance (also referred to as a kernel) function. Gaussian process regression follows the idea of Bayesian inference, incorporating prior knowledge with newly observed data to find a distribution over predictions (Image 5 [11]).


Image 5: Gaussian process regression
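In the spirit of the scikit-learn example cited in [11], the short snippet below fits a GPR model to a handful of toy points and reports the posterior mean and uncertainty at new inputs; the kernel choice and data are illustrative assumptions, not LW’s settings:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

X_train = np.array([[1.0], [3.0], [5.0], [6.0]])  # observed inputs
y_train = np.sin(X_train).ravel()                 # observed outputs (toy function)

gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(X_train, y_train)

X_new = np.linspace(0.0, 7.0, 5).reshape(-1, 1)
mean, std = gpr.predict(X_new, return_std=True)   # posterior mean and std deviation
for x, m, s in zip(X_new.ravel(), mean, std):
    print(f"x={x:.2f}  y ≈ {m:+.3f} ± {1.96 * s:.3f}")  # 95% band, as in Image 5
```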

4-4. Acquisition Function

Acquisition functions are mathematical techniques that guide how the parameter space should be explored during Bayesian optimization, by balancing exploration against exploitation. In this project, the Expected Improvement (EI) acquisition function will be used:

EI(x) = E[max(g(x) − g(x⁺), 0)]

The goal of the EI acquisition function is to find the g(x) that is expected to improve upon the known maximum g(x⁺) the most [12].
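Putting Sections 4-1 through 4-4 together, here is a compact Python sketch of the loop we have in mind. The objective function below is a stand-in for a physical print-quality measurement (unknown in practice), and the parameter name and range are hypothetical, not LW’s actual process window:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def run_experiment(mesh_count):
    """Placeholder for a real stencil test; returns a quality score to maximize."""
    return -((mesh_count - 230.0) / 100.0) ** 2

def expected_improvement(X_cand, gpr, y_best, xi=0.01):
    """EI(x) = E[max(g(x) - g(x+), 0)] under the GPR surrogate."""
    mu, sigma = gpr.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - y_best - xi) / sigma
    return (mu - y_best - xi) * norm.cdf(z) + sigma * norm.pdf(z)

candidates = np.linspace(80, 400, 321).reshape(-1, 1)  # mesh counts to consider
X = np.array([[100.0], [350.0]])                       # two seed trials
y = np.array([run_experiment(x[0]) for x in X])

for trial in range(8):  # each iteration costs one physical experiment
    gpr = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    ei = expected_improvement(candidates, gpr, y.max())
    x_next = candidates[np.argmax(ei)]                 # most promising next trial
    X = np.vstack([X, [x_next]])
    y = np.append(y, run_experiment(x_next[0]))

print(f"best mesh count found: {X[np.argmax(y)][0]:.0f}")
```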

4-5. K-Nearest Neighbors Algorithm (KNN)

KNN is a non-parametric method that classifies a new data point based on its similarity to the available data. Given a user-assigned value of k, the algorithm finds the k data points nearest to the new point in the feature space. These nearest neighbors then essentially “vote” on which class the new data point should belong to. For instance, in Image 6, with a k of 3, the new data point is categorized into the class of orange circles.


Image 6: KNN algorithm

KNN has proven effective on smaller-scale datasets. In LW’s case, we recommend implementing KNN once the database has grown enough for the classification results to be reliable and explainable.
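For reference, a minimal KNN classifier takes only a few lines with scikit-learn. The fabric features and mesh-count classes below are invented stand-ins for the kind of data LW would accumulate, not real records:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical features: [fabric weight (g/m^2), fabric thickness (mm)]
X = np.array([[140, 0.30], [150, 0.32], [210, 0.45], [220, 0.48], [300, 0.60]])
y = ["high-mesh", "high-mesh", "mid-mesh", "mid-mesh", "low-mesh"]

knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)
new_fabric = np.array([[160, 0.33]])
print(knn.predict(new_fabric))  # the 3 nearest neighbors vote on the class
```

In practice the features would need scaling first, since KNN is distance-based and the raw columns sit on very different ranges.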


5. Methodology

Image 7: Planned process flow

The proposed solution includes the following processes (Image 7):

1. Determine measurable objective(s): Guide LW to define a scalable and easily applicable set of standards for measuring the end product’s quality. Potential indices include surface roughness (Rz), print thickness (EOM), fine-line resolution (via microscopic examination), and so on. Quantitative measurements ensure the dataset truthfully reflects print quality and keep the analysis free of “garbage in, garbage out.”

2. Manually select influential parameters: LW has accumulated nearly four decades of production know-how, which should not be neglected but rather incorporated into the research process. We will therefore count on experienced workers to filter out negligible parameters before an overall, in-depth analysis of the meaningful ones. Doing so prevents duplicated work and avoids heavy computational loads.


3. Run the Bayesian optimization analysis: Given that LW lacks historic data and that naively collecting large amounts of data from zero would be overly risky for the company, we suggested using Bayesian optimization to efficiently pinpoint the global optimum and deliver a quick payback on the investment. Here, we follow the steps described in Section 4, “Theories.”

4. Run the KNN analysis: Once a meaningful amount of data has been collected, we can implement KNN to dig deeper into relationships between parameters that cannot easily be identified through human judgment.

6. Foreseeable Risks

• Optimized parameters are inapplicable in practice: If the recommended parameter set cannot be realized (for instance, no screen exists for a particular mesh count and mesh thickness combination), we should work down the ranked list to the next-best feasible parameter sets.

• No significant cost improvement over the traditional trial-and-error approach: In theory, given the diversity of each parameter, the proposed method should offer a significant cost advantage over traditional methods. But practical constraints may restrict the diversity of the variables and thus limit the savings Bayesian optimization can bring. If the Bayesian approach does not show significantly more benefit, I suggest skipping it and going straight to the KNN analysis once sufficient data has been gathered.

• Significant computation cost: Bayesian optimization requires significant computing power, although we expect the computation cost to fall within a reasonable range because many parameters will have been manually filtered out. Otherwise, we suggest combining Bayesian optimization with evolutionary algorithms to accelerate the calculation [13].

7. Expected Outcomes

7-1. Goals (over the 3-month project period)

• Complete research on 3 types of fabric.
• Achieve a minimum of 95% accuracy on parameter selection, compared with human judgment.
• Successfully implement a database and database UI for the client’s operations.

7-2. Additional Benefits

• Minimized parametric optimization cost: Reduce the number of experiments by 50% while fulfilling the quality requirements.

• Seamless digital transformation: Set up essential software/hardware structures for transitioning to a data-driven culture.

• Enhanced supplier-customer collaboration: Provide a low-barrier way for different parties in the industry to join forces to test innovative ideas or examine current processes.

Works Cited

[1] Pan, J., Tonkay, G. L., & Quintero, A. (1999). Screen Printing Process Design of Experiments for Fine Line Printing of Thick Film Ceramic Substrates. Journal of Electronics Manufacturing, 09(03), 203–213.

[2] Yen, Y. T., Fang, T. H., & Lin, Y. C. (2011). Optimization of screen-printing parameters of SN9000 ink for pinholes using Taguchi method in chip on film packaging. Robotics and Computer-Integrated Manufacturing, 27(3), 531–537.

[3] Chiu, C. (2016). The Study of Parameter Optimization for Screen Printing using Consistent Fuzzy Preference Relations and Taguchi Methods.

[4] Cazac, V., Cîrja, J., Balan, E., & Mohora, C. (2018). The study of the screen printing quality depending on the surface to be printed. MATEC Web of Conferences, 178, 03015.

[5] Victoria, A. H., & Maragatham, G. (2020). Automatic tuning of hyperparameters using Bayesian optimization. Evolving Systems, 12(1), 217–223.

[6] Maier, M., Rupenyan, A., Bobst, C., & Wegener, K. (2020). Self-optimizing grinding machines using Gaussian process models and constrained Bayesian optimization. The International Journal of Advanced Manufacturing Technology, 108(1–2), 539–552.

[7] Lizotte, D. J., Wang, T., Bowling, M. H., & Schuurmans, D. (2007). Automatic gait optimization with Gaussian process regression. IJCAI, 7, 944–949.

[8] Shahriari, B., Swersky, K., Wang, Z., Adams, R. P., & de Freitas, N. (2016). Taking the Human Out of the Loop: A Review of Bayesian Optimization. Proceedings of the IEEE, 104(1), 148–175.


[9] Sit, H. (2020, June 12). Quick Start to Gaussian Process Regression — Towards Data Science. Medium. https://towardsdatascience.com/quick-start-to-gaussian-process-regression-36d838810319

[10] Shi, Y. (2020, October 31). Gaussian Processes, not quite for dummies. The Gradient. https://thegradient.pub/gaussian-process-not-quite-for-dummies/

[11] Gaussian Processes regression: basic introductory example — scikit-learn 0.17.1 documentation. (2014). Scikit-Learn. https://scikit-learn.org/0.17/auto_examples/gaussian_process/plot_gp_regression.html

[12] Krasser, M. (2018, March 21). Bayesian optimization. Martin Krasser’s Blog. http://krasserm.github.io/2018/03/21/bayesian-optimization/

[13] Lan, G., Tomczak, J. M., Roijers, D. M., & Eiben, A. E. (2020). Time Efficiency in Optimization with a Bayesian-Evolutionary Algorithm. https://arxiv.org/pdf/2005.04166.pdf


2. Raw Material Input Suggestion for the VOD Process in Stainless Steelmaking

Photo from Reuters


I am currently leading a team of three to design an AI solution for the largest stainless steel manufacturer in Taiwan, and the second largest in Asia (hereafter referred to as “YL” or “the client”). The project is a rare opportunity for me to polish my data analysis skills in an industry that is completely new to me. While the project is ongoing, I am recording my exploration and brainstorming toward the best analysis approach here:

Stainless Steelmaking

Stainless steelmaking is a century-old craft that has grown into a USD 119.3 billion industry. This capacity-driven, capital-intensive industry is dominated by a few big players; as of 2021, China holds over half of the global market share.

The manufacturing process involves a series of steps:

Stainless steel manufacturing process; Image from Walsin Lihwa

• Raw Material: Stainless steel is composed of iron (Fe), chromium (Cr), silicon (Si), nickel (Ni), carbon (C), nitrogen (N), and so on. The ratio of raw materials determines the characteristics of the alloy and its designated “grade”.

• EAF (Electric Arc Furnace): The EAF melts solid iron scrap into liquid form at over 1800 degrees Celsius.

• LHF (Ladle Heating Furnace): The LHF heats the steel to allow adjustments to the molten metal’s chemical composition.

• MRP (Metal Refining Process): The MRP carries out decarburization using oxygen and inert gases.

• VOD (Vacuum Oxygen Decarburization): The VOD process lowers the carbon content without compromising the Cr yield, by injecting argon from the bottom and oxygen from the top under vacuum. The alternative is the AOD method, which relies on argon dilution rather than pressure (vacuum) adjustment to refine the molten steel.

• CC (Continuous Casting): CC solidifies the molten steel into a semi-finished shape by pouring it through a mold.

We can break down the cost of manufacturing into roughly two parts: raw materials and fabrication. The former accounts for the majority of production costs (around 70–80% according to this research, excluding the procedures that bring the steel to its final form). The post-pandemic raw-material cost spike only underscores the urgency of making the most efficient use of every resource.

The Problem

Based on the above, it is evident why the client needs our help to search for the best material combination to input, with a focus on the VOD process, to reduce current production costs.

Studying the Data

We have been given production records from the past four years for the analysis. Each record consists of 6,000–7,000 rows of data. The data includes:

• Production profile: date, shift, steel grade, furnace number…
• Chemical composition (C, Mn, Si, Cr, Ni, Mo, Cu, Al, etc.): initial C %, stage(n) C %, final C %, initial Mn %, stage(n) Mn %…
• Additional components: kgs of (x) alloy added at stage(n), kgs of (y) alloy added at stage(n), total amount of argon added, total amount of nitrogen added…
• Mechanical properties: mixing time, initial temperature, stage(n) temperature, oxygen blowing rate, vacuum degree, slag basicity…

From studying the datasets, we can observe some follow-up actions:

• There are significant gaps between column value ranges. For instance, “stage(n) C %” usually falls around 0.08–0.12, while “stage(n) Cr %” usually falls around 10–15. Hence, the data needs to be normalized before being used for training (see the sketch after this list).

• We need to clarify with YL’s engineers whether any columns duplicate one another and whether all data is accurately recorded. Some exploratory analysis will also help to spot problems.


• There may be a demand for more detailed information down the road. For instance, we may need information on the amount of argon induced at each time interval, in addition to the “total amount of argon added”. Whether we can get additional data depends on the client’s willingness to share it, as well as the availability of such information.
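As a concrete example of the first point, z-score normalization puts columns with very different ranges on a comparable footing. The column names and values below are illustrative, not YL’s actual records:

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler

df = pd.DataFrame({
    "stage_C_pct":  [0.09, 0.12, 0.08, 0.11],  # carbon, roughly 0.08-0.12
    "stage_Cr_pct": [11.2, 14.8, 10.5, 13.1],  # chromium, roughly 10-15
})
scaled = pd.DataFrame(StandardScaler().fit_transform(df), columns=df.columns)
print(scaled.round(2))  # both columns now have mean 0 and unit variance
```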

Crafting the Best Approach

After taking a comprehensive view of the datasets, the next step is selecting the best ANN model for the analysis. I considered multiple aspects of the problem to inform the decision:

1. What solution will generate the most business value?

As mentioned before, steels are categorized into various “steel grades.” Some grades, such as SAE 304, which is widely used in household and industrial applications, produce more sales than others. YL’s top-selling products likely contribute the majority of its revenue and profit, so it is reasonable to use those grades as the entry point for composition engineering.

Now, back up a little. Couldn’t we just build a model that incorporates data from all product types?


The answer is “Yes,” but the tradeoff would be a less precise model, because pooling all the data averages out the unique reactions of any specific grade. In an industry this mature and rigid, an overly generalized model probably won’t add much value for the client.

Therefore, creating multiple models, each targeted at a specific steel grade, is the better way to go. The next step is to work with the client to determine the product type that is most critical to the business or holds the most potential for improvement, and to explain to stakeholders why we think the “less efficient” method is more appropriate for this case.

2. What constraints (or opportunities) does the domain knowledge imply?

The VOD process contains three stages: oxygen blowing, degassing, and reducing. Each stage has its respective input and output chemical compositions, and the output composition of one stage becomes the input composition of the next.

3 stages of the VOD process; Image from Thermo-Calc

Now, it may seem intuitive to create three models that run consecutively. But is that the right approach?


The answer is “No.” Building three models would imply that there is no correlation between the stages and that only the chemical composition is passed down. However, from consulting the domain experts, we learned that the mechanical properties have a lasting impact that carries over into subsequent stages.

This key insight, which makes all the difference in the model design, would have been out of reach for laymen without the help of experts. The example shows the importance of asking the right questions of those who understand the processes deeply, and of shining a light on their perspectives.

§ Key stakeholder: Engineers or domain experts

3. How will the analysis be applied to the client’s operations?

If we look at the problem as proposed by the client, we might straightforwardly build a model that maps the input parameters to the final chemical composition. But if we go on to ask, “What insights does such a model offer?” we will find that a model with the final composition as its output cannot directly tell us what input parameters should be used to achieve the minimum cost.

Instead, we can take the costs and/or quality associated with each component or process into account to create a function that calculates the total cost or a health index of the production, then use the backpropagation machinery of a neural network (or non-ANN methods, such as a genetic algorithm) to search for an optimum over the inputs, as sketched below. The model output can then be applied by the client out of the box, without additional interpretation.
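Here is a rough PyTorch sketch of that idea: freeze the weights of a (hypothetically trained) cost model and let the gradients update the inputs instead. Every name, shape, and bound below is an illustrative assumption, not YL’s actual setup:

```python
import torch

torch.manual_seed(0)
n_inputs = 12                                 # e.g., alloy additions, O2 rate, ...
cost_model = torch.nn.Sequential(             # placeholder for the trained network
    torch.nn.Linear(n_inputs, 32), torch.nn.ReLU(), torch.nn.Linear(32, 1)
)
for p in cost_model.parameters():
    p.requires_grad_(False)                   # freeze weights; only the inputs move

x = torch.rand(1, n_inputs, requires_grad=True)  # candidate input recipe
opt = torch.optim.Adam([x], lr=0.05)

for step in range(200):
    opt.zero_grad()
    cost = cost_model(x).sum()                # predicted total production cost
    cost.backward()                           # gradients flow back to the inputs
    opt.step()
    with torch.no_grad():
        x.clamp_(0.0, 1.0)                    # keep the recipe physically feasible

print("suggested (normalized) inputs:", x.detach().squeeze())
```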

Today, AI and machine learning are seen by many businesses as a magic trick that can solve all kinds of problems. Such a misconception often leads to ambiguous problem definitions. So it is vital to guide the client, rather than be misguided by them, to find the true pain points in their organization and tailor a solution accordingly.

§ Key stakeholder: Project manager


My Learnings

When solving problems with AI/ML, focus on the project’s ROI rather than on the technology itself. Ask smart questions that usher in a solution creating value for the client along as many dimensions as possible. Some recommended checkpoints and considerations for your AI project can be found here:

• 4 Ways That You Can Prove ROI From AI • Solving AI’s ROI problem. It’s not that easy.


Rethinking Processes

In this series, I reflect on projects I’ve led in my previous jobs, in the hope of distilling some principles and trains of thought for brainstorming, designing, and implementing improved processes for smarter supply chains.


3. Material Preparation Transparency

Photo by Philippe Roy


In this project, I applied data analysis to quantify previously unmeasurable efficiencies in a material preparation process that spanned multiple functions in the factory. Along the way, I realized that finding a cohesive database to start from is a challenge in itself. Here is how I tackled the problem:

Background

Every morning from 8:30 until the final SKU hit the line, our factory office was like a war room, with project managers bustling about and urgently phoning counterparts to track down the status of materials. Part of supervising the iPhone’s rigorous product development was ensuring that all 200+ SKUs in the BOM were ready for assembly by the start of every dayshift. Even short delays could leave hundreds of assembly workers idling and impose exorbitant costs on the factory.

The reality was brutal. On a typical day, we needed to track down around 20 SKUs that were late to the line. The number doubled or even tripled as we approached the end of a build or project, when leftover inventory was scarce. Sometimes that meant hours of stalled production.

Chasing down a part was fundamentally complicated: the SKU could be in any of the six main processes, each branching out into many subprocesses managed by different functions:


An SKU received from the supplier is first sorted in a dedicated warehouse, then sent to one of 13 IQC rooms for a quality check (which may involve passing through an indefinite number of workstations, depending on customer requests). The acceptable parts are then stored in the warehouse and assigned for use. Finally, designated SKUs are passed to the kitting rooms for allocation across 3–5 production lines.

Given the large number of parties and processes, the variability of special customer requests, and the tight time constraints, the efficiency of this long supply chain was always considered unmanageable, and the root causes of delays could never be clearly identified or resolved. As a result, the team was stuck in a vicious cycle of resolving rather than preventing delays case by case, as our frantic daily telephone calls attested.

The poorly managed process also invited functions to misstate their UPH (units per hour), claiming throughput rates below full capacity. With no evidence to prove them wrong, the PM team found itself in a constant tug of war with the stakeholders whenever schedules fell behind.

I was inspired to devise a better solution to these black-box operations.

Problem

I realized that if we could develop transparency in the material preparation process, we could then truly control operations. What we needed was a management framework to easily calculate efficiencies and spot bottlenecks to hold the lagging functions accountable.

Observation

Based on my experience applying analytics to abstract problems, I knew data analysis was the only solution; however, at the time we lacked an integrated database for tracking efficiency. In large-scale corporations such as ours, management power is often dispersed. Without unified management, individual functions can operate under performance goals misaligned with the values or overall good of the company, and they are unlikely to have built a database for that purpose.

During my frequent visits to the supply chain, I serendipitously uncovered an SAP database maintained specifically for security purposes. Our client cared immensely about confidentiality and required that every key part in the form factor be scanned and traced throughout the production process, leaving timestamped entrance and departure records for most SKUs at the various links in the supply chain. Reflecting on this, I realized the massive potential to re-interpret and re-imagine this data to creatively address a major operational issue.


Each yellow dot represents a point where an entrance or departure record is generated as SKUs travel through the different functions. An SKU transferring from one location to another adds two new rows to the SAP database (ex: SKU#999 Warehouse#1 −10 pcs; SKU#999 IQC#3 +10 pcs). Correctly identifying the routes each batch of material took from a time-appended database was a big challenge (a simplified sketch of the pairing logic follows below).
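To illustrate the pairing logic, here is a simplified pandas sketch built on the SKU#999 example above. Real SAP extracts carry many more fields and edge cases; the columns and matching rule are illustrative assumptions:

```python
import pandas as pd

log = pd.DataFrame([  # time-appended ledger, as in the SKU#999 example
    {"time": "08:01", "sku": "SKU#999", "loc": "Warehouse#1", "qty": -10},
    {"time": "08:01", "sku": "SKU#999", "loc": "IQC#3",       "qty": +10},
    {"time": "09:40", "sku": "SKU#999", "loc": "IQC#3",       "qty": -10},
    {"time": "09:40", "sku": "SKU#999", "loc": "Warehouse#1", "qty": +10},
])

# Match each debit (-) with the credit (+) that shares its SKU and timestamp
# to recover one physical movement: from -> to.
debits, credits = log[log.qty < 0], log[log.qty > 0]
moves = debits.merge(credits, on=["time", "sku"], suffixes=("_from", "_to"))
moves = moves[moves.qty_from == -moves.qty_to]
print(moves[["time", "sku", "loc_from", "loc_to", "qty_to"]])
```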

It is worth noting that to utilize the database, I had to assume that account transfers are equivalent to material movements. It is possible that some physical goods are moved to fulfill an urgent request in the morning, while the virtual accounts are not updated until the afternoon when work slows down. I checked with the system’s users and found that this special circumstance accounted for only a small proportion of cases, whose impact could be ignored.


Idea Validation

In order to learn the definitions and use cases, I visited the database’s key users individually. Every function has its unique perspective and approach to the account recording procedure, and great patience was required to piece together a comprehensive picture. However, gaining an in-depth understanding of the real practices proved to be invaluable later on, as I was able to explain the distinctive data patterns generated by each party’s unique approach to the same scanning task (which would have been confusing had I not studied the initial user behaviors.) This experience stressed the importance of marrying domain knowledge to a pure numerical mindset.

Another communication challenge arose when I explained the project’s purpose. Skepticism inevitably surfaced over whether my project might create extra work or unwanted KPIs. I thus quickly learned to communicate the long-term benefit to the stakeholders, take care to not provoke negative feelings, and understand their individual concerns. Stepping out of the cubicles to interact with each person was essential to breaking down barriers. Until that point, tackling individual efficiencies was unprecedented at the company; however, after winning buy-in from all of the stakeholders, we had an opportunity for real change.

Data Analysis

Eventually, I was able to collect three years’ worth of records (roughly 36M) from the MRP software for analysis. Distilling information from the dataset represented the bulk of the work. The analysis consisted of the following steps:

1. Data cleansing: I excluded data beyond the relevant range (ex: material movements after entering the assembly line) and adjusted process timeframes that spanned holidays and weekends.

2. Pattern extraction: Because the database was designed to be appended by time, I used VBA to extract material movement patterns and restructure the data rows at the SKU level.

3. UPH calculation: The unit processing lead time could be calculated in Excel by subtracting the input time from the output time, then dividing by the batch size.

4. Data interpretation: Grouping the data points by category and taking the mode helped reveal procedural efficiencies. (The mode was the most suitable statistic for this scenario because it trims the extreme outliers in each category, which are usually generated by special customer requests.) I also identified the top performers in each category to use as baselines in the future. (A Python sketch of steps 2–4 follows.)
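For illustration, here is a minimal pandas rendering of those steps; the original analysis used VBA and Excel, and the movement table below is invented:

```python
import pandas as pd

moves = pd.DataFrame({  # one row per reconstructed batch movement
    "function":   ["IQC#3", "IQC#3", "IQC#3", "Kitting#1"],
    "sku":        ["A", "A", "B", "A"],
    "in_time":    pd.to_datetime(["08:00", "09:00", "08:30", "13:00"]),
    "out_time":   pd.to_datetime(["09:00", "10:00", "10:30", "13:30"]),
    "batch_size": [60, 60, 240, 120],
})

# Unit processing lead time = (output time - input time) / batch size,
# then take the mode per function to trim outliers from special requests.
moves["sec_per_unit"] = (
    (moves.out_time - moves.in_time).dt.total_seconds() / moves.batch_size
)
print(moves.groupby("function")["sec_per_unit"].agg(lambda s: s.mode().iloc[0]))
```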

Result

The results shed light on bottlenecks that the PM team’s previous intuition-based management had obscured. The analysis revealed not only the procedures falling short of expectations, but also areas of the supply chain where minor alterations could improve efficiency. For instance, the poor efficiency index of the Display IQC led us to investigate its operations and discover that a simple upgrade to its old barcode scanners could double the team’s overall efficiency. The need had long been overlooked because frontline workers had a limited voice. I immediately escalated it to the manager, and the issue was resolved within days. Ultimately, with these experiences as a template, we became more skilled at pinpointing urgent problems and prioritizing resources to address their root causes.

Tips for Tomorrow

• Be aware of how your company’s governing structure creates gaps that prevent the organization from achieving truly optimized supply chains. Check for cohesiveness between the goals of each department. See this interesting article on how no perfect organizational archetype exists, but how end-to-end coordination can help.

• Always remember to tie individual value to the organization’s value. Take some time out from the daily hustle to consider, “how can my role bring value to my company’s strategic goals?” For example, a project manager working for an OEM company should never lose sight of efficiency as the “ultimate goal” while making day-to-day decisions.

• Participate and observe frontline actions carefully. Understanding real practices informs better supply chain decision-making.


4. Material QR Code Pre-Inspection

Photo by Tony Law


In this project, I utilized a “hidden resource” in the organization — underutilized labor hours — to implement a new procedure that prevents production shutdowns by pre-inspecting materials’ validity, while keeping an eye on any additional costs the new practice might incur. Here is how I tackled the problem:

Background

In the previous article, I discussed how re-designing the management framework for the material preparation process helped the organization prevent production disruptions caused by materials reaching the assembly line late. However, as project managers, our responsibility to control production shutdowns doesn’t end there. We continue to monitor the materials’ status as they flow through hundreds of workstations. Key components arrive at the factory with a QR code sticker on each piece. The code is scanned before the component is assembled onto the WIP, and the scanned information is compared with the WO (work order) and BOM stored in the shop floor system. If the component specification and software information do not match, production halts until further action is taken. Such errors break down into three types of situations, which account for the majority of production disruption cases:

Problems that emerge early in the assembly line are easily solved. In cases where problems do not appear until farther down the line, the solution may go beyond simply swapping SKUs or reissuing WOs. If no replacement SKU can be found and the PO has to be redone, then all of the previous production data must be wiped and the WIPs must flow back through tens of workstations to recollect information, which means hours or even days are wasted.


The PM team has implemented solutions in response to every situation type. Nonetheless, most practices remain susceptible to manual error and fail to truly eliminate shutdowns.

Problem

By studying the previous solutions, I realized that they all shared the problem of being executed too far from the action. While the solutions enforced repeated checks on the WOs and the components, many relied on second-hand sources (ex: checking the WO against MLB information provided by the vendor) that might or might not be accurate, leaving a huge opening for problems to occur. Also, no single solution kept problematic materials from reaching the line.


We needed an alternative solution that would allow WO and component information to be checked as close to the assembly line as possible, while posing minimal risk of change and having a negligible impact on the system’s overall efficiency.

Idea Formulation

Implementing a new procedure on top of an existing operation can compromise the existing efficiency and thus trigger objections from stakeholders. By reviewing the kitting room’s material preparation schedule, I pinpointed the least congested timeframe — after most materials are kitted and ready to be allocated for production — in which to set up the procedure. By then, the kitting room also has the WO and BOM information loaded into the shop floor software, allowing a last-minute cross-validation between the sources.

Another consideration was how the inspection should be performed. After multiple consultations with the shop floor system managers, I realized that we could develop a new module in the existing shop floor software for the kitting room to scan and check one piece of each SKU batch against its corresponding WO, similar in concept to a virtual workstation (a sketch of the check follows below).
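Conceptually, the module’s job boils down to comparing the scanned QR payload against the WO/BOM record already loaded in the shop floor system. The sketch below is my simplified reconstruction; the field names and error labels are assumptions, not the actual software:

```python
from dataclasses import dataclass

@dataclass
class ScanRecord:
    sku: str
    spec: str        # component specification encoded in the QR code
    fw_version: str  # software/firmware version encoded in the QR code

def pre_inspect(scan: ScanRecord, wo_bom: dict) -> str:
    """Return 'OK' or a mismatch reason, before the batch reaches the line."""
    expected = wo_bom.get(scan.sku)
    if expected is None:
        return f"{scan.sku}: not on this work order's BOM"
    if scan.spec != expected["spec"]:
        return f"{scan.sku}: spec mismatch ({scan.spec} != {expected['spec']})"
    if scan.fw_version != expected["fw_version"]:
        return f"{scan.sku}: firmware mismatch"
    return "OK"

wo_bom = {"MLB-01": {"spec": "rev-C", "fw_version": "2.1.0"}}
print(pre_inspect(ScanRecord("MLB-01", "rev-B", "2.1.0"), wo_bom))
# flags the bad batch hours before the 9 a.m. dayshift needs it
```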


This way, if an error is identified, the kitting room can immediately notify the PM team, leaving sufficient buffer time for resolution before the dayshift starts at 9 a.m., or by the time the WIPs flow to the designated workstation (ex: usually in the afternoon for MLB installations).

Idea Execution

Engaging each stakeholder in every project development decision is key to overcoming their initial skepticism about the new idea, giving them a sense of ownership and a willingness to spare resources. For instance, when designing the module’s UI, I sought suggestions from the frontline workers so that once they were introduced to the final product, it looked appealing and familiar. When communicating, I tried to connect the project’s potential benefits with their pain points, so that they shared our sense of urgency.

At one point during the development phase, progress on the new module stalled because the software engineers were backlogged. I communicated tirelessly with the engineers to understand their individual concerns, and I leveraged my data management knowledge to articulate my vision and needs precisely. Eventually, we cleared all the obstacles and got the job done.

Also worth noting: when I first began introducing the procedure, I negotiated with the kitting room managers, who were wary of extra work. I gained their support by offering them an automation tool I had programmed that cut the effort required for one of their routine tasks in half. This experience embodies the idea that by leveraging the unique value we can bring to others, we can attain returns beyond expectations.


Result

Eventually, the pre-inspection workstation helped us reduce shutdowns by over 80%, freeing up employees’ time for more intellectually demanding, human-centered tasks. The idea was then adopted in five other projects, helping the company save $94,500 annually.

This project speaks to the value of continuous improvement in supply chains. Even when we think that all potential measures have been taken, there is always room for new and better ideas.

Tips for Tomorrow

• We can sometimes be tied up by suboptimal solutions for an extended period. Hence, we should frequently examine the solutions to determine whether they directly address the problem’s root cause and whether better options are available.

• An in-depth understanding of the process flow brings clarity about how to adjust portions of the operations to achieve a better outcome. Drawing out the process flow helps.

• Gaining trust from stakeholders is essential to winning their support for your project (even if it benefits you more than it benefits them).


Forecasting the Future

In this series, I explore how companies can use supply chain management as a sharp weapon to get ahead of the competition, and offer some plausible visions of tomorrow’s industrial world.


5. SCM as a Revenue-Gaining Competitive Advantage

Photo from www.reddit.com/user/soil_nerd


Within corporations, supply chain management is often positioned as a cost-saving vehicle for operations rather than as a revenue-gaining competitive advantage. Although critical to how companies deliver value, strong supply chain performance is often viewed as having little impact on the organization’s top-line growth (which is what investors and high-level executives pay the most attention to). It is therefore critical for supply chain practitioners to think about how to use our supply chain knowledge to support the business’s core value and reshape conventional thinking about the field.

Business Value from Supply Chains

To find ways to tackle the problem, we first list the different business values that supply chains can generate:

Adapted from William P. King’s Digital Manufacturing and Design Innovation Institute presentation

After eliminating the initiatives that play only a supporting role in product success or that contribute mostly to cost competitiveness, “reduce time to market” surfaces as the top way for corporations to use their supply chains to gain a competitive advantage. No book better explains how to accelerate time to market than George Stalk’s Competing Against Time (and the brief version), published in the 1990s, whose concept of “time-based competition” became a standard by which many companies, including Apple, steered their organizations toward success.


Accelerating Time to Market

According to Stalk, shortening time to market can provide companies with several competitive advantages beyond simply reducing costs:

• Broader product lines • Increased market breadth • Greater technological sophistication of products

Yet most companies today are still stuck in the traditional planning loop, in which operational decisions are made from forecasts of the distant future. The longer the process lead time, the poorer the forecast. Hence, breaking the loop requires shortening the time needed for every step across the system. The author identifies three sources of power for better time management:

Source 1: Production

Unlike traditional factories, which specialize in either scale or variety, flexible factories are designed to capture the advantages of both. Variety is achieved by shortening production runs as much as possible, which in turn is accomplished by reducing process complexity to shave off changeover and setup time. Flexible factories differ from traditional factories on three dimensions:

From Chapter 2

Faster changeover makes smaller-batch production possible, so companies can offer more diverse products, each with a reasonably low level of inventory. The product-centered layout collocates the different functions involved in the manufacturing process to compress material handling and moving time. Local scheduling lets employees close to the shop floor make on-the-spot production control decisions without passing suggestions up to management for approval or going through sophisticated MRP systems.

Source 2: Sales & Distribution


Compressing the sales and distribution process is just as crucial as addressing factory run time. The value delivery system usually includes multiple steps, as in the image below. Streamlining means examining each step’s capacity, organizational design, location, KPIs, and so on, to simplify the process.

From Chapter 2

Source 3: Product Development & Introduction

From observing companies that have succeeded in rolling out new products more quickly, the author notes that improvements made in smaller increments at a higher frequency drive greater long-term benefits. In addition, much like the flexible factory setup, companies should organize their new product development incubators around collocated, cross-functional teams.

From Chapter 4

The Cutting Edge

Looking back at the evolution of business strategies, we find that the best competitors are the ones that keep adapting and stay on the cutting edge. Each transition offers companies an opportunity to leave their rivals far behind. We know that time is today’s cutting edge, which raises the follow-up question: “What is the next cutting edge?”

This article by BCG suggests that competing on time will continue to be the leading principle behind company success for years to come. The only difference is that companies now have more technological tools, including big data analysis, 3D printing, automation, and so on, to help further reduce cycle times.

Since everyone has data and many companies have already jumped on the digital transformation bandwagon, I believe those who ask the smartest questions and apply the technologies to answer them will be the ones to succeed. By finding truly innovative ways to frame problems and create solutions, companies in the modern business environment can lift themselves above the competition.


6. The Factory Next Door

Photo from Siemens AG


Richard D’Aveni’s book, The Pan-Industrial Revolution, describes an industrial realm that the author sees as the pinnacle of the convergence of today’s rapidly advancing technologies, namely robotics, AI, big data analytics, cloud computing, and IoT. At the core of the revolution is additive manufacturing (AM, aka 3D printing), which overcomes the material, product-type, and purpose constraints of traditional manufacturing.

Before we dive deeper into the promise of the pan-industrials and how they would reshape the current economic landscape, I must admit that, alluring as it is, the author’s prediction doesn’t seem too improbable. For starters, the concept of multiple technological progressions converging to create possibilities far beyond what any one of them could bring independently is not uncommon in the history books; just look at the evolution of machine learning and AI. Moreover, aren’t we due for a big leap in supply chain management, and for a shift from our long-standing focus on optimization (i.e., designing the next last-mile strategy to save cost) to the exploration of completely new alternatives (i.e., building a brilliant factory right at your doorstep to produce all that you need)?

Align Technology, the manufacturer of Invisalign clear aligners, began adopting 3D printing to produce aligners for its 10.2-million-patient base as early as 2009. By 2018, its production lines could output 320,000 unique products per day. The company is without doubt a 3D printing success story and a benchmark for many other companies.

Now, why are we talking about 3D printing when the traditional methods are serving just fine?

The reason is that additive manufacturing offers many advantages that traditional manufacturing cannot attain. For instance, additive manufacturing allows:

• Greater design complexity • One-step production • Lighter, stronger, simpler products • Multiple materials options • Inexpensive customization

Unlike traditional manufacturing, which specializes in economies of scale (producing identical things in large quantities), AM benefits from economies of scope, meaning companies can make almost anything, anywhere. Flexibility in production is typically thought of as contrary to cost efficiency, which leads us to question the validity of 3D printing as a business model beyond the maker community.

Are claims of 3D printing being “expensive” accurate?

Not really, because such claims fail to account for the indirect cost impact AM has on the supply chain and other business operations. For instance, if we no longer need complicated production processes that require gigantic facilities, we can build smarter, more compact factories closer to the actual market. Local factories reduce shipping and inventory costs by locating themselves close to demand. With the indirect impacts considered, additive manufacturing can likely match the cost-efficiency of traditional manufacturing.

Adapted from Chapter 2

The book also notes several other technology-specific opportunities that will further reduce the cost of AM, including:

• Development of new AM technologies
• Increased printer size: larger chambers, resin vats, and powder beds
• Breakthroughs in production quality: reduced post-processing time
• Development of hybrid fabrication systems: combining the benefits of multiple technologies, such as using robotic arms to automatically feed materials to 3D printers

As AM and other high technologies are integrated, we can expect the emergence of so-called multimodal factories that transcend geographic constraints to serve a variety of customer demands at affordable prices.

Then who will monitor the complex, dispersed factories behind the scenes?

Industrial platforms use data from internal and external sources to help make operational decisions, such as how to adjust the supply chain when a vendor’s production plant in California is hit by wildfires. The platforms will be supported by domain experts for strategic decisions in special circumstances, but otherwise they are expected to make more comprehensive, ecosystem-wide optimized decisions than any human can.

Many companies, such as IBM, GE, and Jabil Inc., have already entered the race, creating platforms that each come with a distinctive set of competitive advantages based on the company’s core competencies. The author further predicts that we will eventually arrive at an era of superconvergence in which platforms fight one another for users. More users create a stronger network effect that benefits platform owners and users alike, much as has occurred with Google and Amazon. By then, the market will likely be dominated by a few big players, each with its own ecosystem, thriving and competing on reputation, speed and flexibility, innovation, and deep pockets.

In the end, what reality should we expect from the revolution?

The book lists several economic shifts:

• A shift toward dramatically greater efficiency: from centralized, capital-intensive manufacturing facilities to much more efficient production units coordinated by digital industrial platforms

• A shift toward more intense real-time competition: from long, complex supply chains to simpler, highly responsive ones

• A shift toward competition for spheres of influence in an economy without clear industry boundaries: from defined market and industry segments to coverage of whole industries linked by shared manufacturing materials and methods

• A shift toward digital business ecosystems: from traditional supply chains to interlocked businesses that share intelligence

• A shift toward collective competition: from competition between individual companies to competition among a handful of giant pan-industrials

The book covers many more details of the Pan-Industrial Revolution and of the ways companies can gain a competitive edge in the new era, which I will not discuss here. Reading it, I feel a sense of excitement about what I may witness and participate in over the course of my career. I believe the change is real and the results will be fruitful, but whether the author’s vision becomes reality depends on whether we strive for such change.
