
Page 1: 11.docx

1. Using resources, regulation and reputation as a basis, what are the sources of pressure on firms such as Frito-Lay to reduce their environmental footprint?

Industrialization and a growing human population are the reasons natural resources are being overconsumed, outstripping the resource base on an unprecedented scale. Resource use, and the reduction in operating costs that comes from cutting it, is the primary reason several companies are pursuing environmental footprint reduction. Companies like Frito-Lay, Walmart, Subaru, Pepsi and the Ritz-Carlton would otherwise see operating expenses rise in parallel with expansion. Spending on sustainable business practices is also increasing rapidly because it is seen as a catalyst for productivity improvements. Reducing resource use thus lowers cost while remaining sustainable: a win for the company, for humanity and for the environment.

Government agencies are also becoming stricter as more and more studies show that human consumption is at an alarming level. As science advances, there are now several ways to measure how humanity as a whole affects the environment, and more progressive economies are setting stricter laws regulating transportation, waste and noise.

Lastly, company reputation is a big factor in why companies want to reduce their environmental footprint, as investors are becoming increasingly interested in the environmental performance of the companies they put their money into. Because of customer awareness, large manufacturers and retailers are also using their buying power to push suppliers toward more environmentally friendly practices. Green is now in, and it is even becoming a lifestyle for many consumers, who make healthier choices and favor companies that opt for cleaner production as well.

Identify the specific techniques that Frito-Lay is using to become a "green manufacturer."


Frito-Lay, the multi-billion-dollar snack food giant, requires vast amounts of water, electricity, natural gas,

and fuel to produce its 41 well-known brands. In keeping with growing environmental concerns, Frito-Lay

has initiated ambitious plans to produce environmentally friendly snacks. But even environmentally

friendly snacks require resources. Recognizing the environmental impact, the firm is an aggressive "green

manufacturer," with major initiatives in resource reduction and sustainability.

For instance, the company's energy management program includes a variety of elements designed to

engage employees in reducing energy consumption. These elements include scorecards and customized

action plans that empower employees and recognize their achievements.

At Frito-Lay's factory in Casa Grande, Arizona, more than 500,000 pounds of potatoes arrive every day to

be washed, sliced, fried, seasoned, and portioned into bags of Lay's and Ruffles chips. The process

consumes enormous amounts of energy and creates vast amounts of wastewater, starch, and potato

peelings. Frito-Lay plans to take the plant off the power grid and run it almost entirely on renewable fuels

and recycled water. The managers at the Casa Grande plant have also installed skylights in conference

rooms, offices, and a finished goods warehouse to reduce the need for artificial light. More fuel-efficient

ovens recapture heat from exhaust stacks. Vacuum hoses that pull moisture from potato slices are

also being used; they recapture the water and reduce the amount of heat needed to cook the

potato chips.

Frito-Lay has also built over 50 acres of solar concentrators behind its Modesto, California, plant to

generate solar power. The solar power is being converted into heat and used to cook Sun Chips. A

biomass boiler, which will burn agricultural waste, is also planned to provide additional renewable fuel.

Frito-Lay is installing high-tech filters that recycle most of the water used to rinse and wash potatoes. It

also recycles corn byproducts to make Doritos and other snacks; starch is reclaimed and sold, primarily

as animal feed, and leftover sludge is burned to create methane gas to run the plant boiler.

There are benefits besides the potential energy savings. Like many other large corporations, Frito-Lay is

striving to establish its green credentials as consumers become more focused on environmental issues.


There are marketing opportunities, too. The company, for example, advertises that its popular Sun Chips

snacks are made using solar energy.

At Frito-Lay's Florida plant, only 3½% of the waste goes to landfills, but that is still 1.5 million pounds

annually. The goal is zero waste to landfills. The snack food maker earned its spot in the National

Environmental Performance Track program by maintaining a sustained environmental compliance record

and making new commitments to reduce, reuse, and recycle at this facility.

Substantial resource reductions have been made in the production process, with an energy reduction of

21% across Frito-Lay's 34 U.S. plants. But the battle for resource reduction continues. The

company is also moving toward biodegradable packaging and pursuing initiatives in areas such as office

paper, packaging material, seasoning bags, and cans and bottles. While these multiyear initiatives are

expensive, they have the backing at the highest levels of Frito-Lay as well as corporate executives at

PepsiCo, the parent company.

Discussion Questions

1. Using resources, regulation, and reputation as a basis, what are the sources of pressure on firms such

as Frito-Lay to reduce their environmental footprint?

2. Identify the specific techniques that Frito-Lay is using to become a "green manufacturer."

3. Select another company and compare its green policies to those of Frito-Lay.

How does higher quality lead to lower costs?

Posted on June 12, 2012

The answer to how higher quality can lead to lower costs may seem fairly

obvious. To explain this idea, we should look at the details

behind quality. Heizer & Render (2009) outline the costs of quality, or rather the

costs that can occur if you have poor quality, as follows:

Prevention costs – training your staff to perform their tasks better and

having programs to educate staff on improving quality.

Appraisal costs – the costs of “quality testing” the products produced.

These appraisals can include, but aren’t limited to: product testing,

product/service inspectors, quality assurance labs etc.

Internal failure costs – incurred when a product or service is produced and fails or is

defective. The "internal" aspect is that the defective/failed product is detected before

delivery to the customer. The product/service then needs to be scrapped or reworked.


External failure costs – similar to the above, but the defect is discovered after the sale

of the product or service. These are the costs involved in recalling, refunding and/or

replacing the product or service.

Heizer & Render (2009) give the example of General Electric's recall of

3.1 million dishwashers due to a defective part; the recall ended up costing GE

more than the value of all the dishwashers themselves. Another example provided by

Heizer & Render (2009) shows how Mercedes-Benz's lack of focus on quality led to

a $600 million cost to the company, spent on warranties for faulty parts in their

vehicles in a single year.

Fortunately for the consumer, and perhaps less fortunately for the organisation,

products and services generally can carry, and are often required to carry, some sort of

warranty or guarantee (such as under the consumer protection acts of many different

countries – Wikipedia, 2011). If products are faulty or defective, then consumers

will return the products, which need to be repaired or replaced. In cases where a

warranty/guarantee is not available, the lower quality will potentially damage the

reputation of the organisation, which can result in lost return and future

customers. Heizer & Render also point out that poor-quality delivery can

result in injuries, lawsuits, and increasing government legislation (which can be

costly if processes are mandated by law).

One of the popular methodologies behind quality management is Six Sigma,

which has the goal of “flawless performance” (TechRepublic, 2003). Six Sigma

was developed by Motorola in the 1980s to deal with consumer complaints and

increasing competition. To outline the processes behind Six Sigma, very broadly,

we can consider them as follows (Heizer & Render, 2009):

Define the purpose, scope, outputs and required processes, maintaining the customer's definition of quality.

Measure the processes and collect data on them.

Analyse the collected data and ensure the results are repeatable as well as reproducible.

Improve the processes by modifying and/or redesigning them.

Control the new processes to maintain performance and quality levels.

–        DMAIC

Another popular quality assurance protocol is HACCP (Hazard Analysis & Critical

Control Points) used in food safety (FDA, 2011) in countries like the US, UK and

for certain food stores here in South Africa (e.g. Woolworths). HACCP can be

summarised as food quality management “through the analysis and control of

biological, chemical and physical hazards from raw material production” (FDA,

2011). My personal experiences dealing with food stores using HACCP and food

stores that don’t use HACCP are like night and day. The quality is unrivalled and

I feel quite safe that I’m practically guaranteed (term is used lightly) good

quality, unspoiled and unmarked foods when using a HACCP controlled food

store. The benefits of this can be equated to those mentioned above – lower

returns (of items), lawsuits and health hazards.

What Philip Crosby meant when he said "Quality Is Free"

in 1979 is that a quality program can save a company more money than it costs to implement. Whenever we think of the total cost of quality, it includes prevention, appraisal, and failure costs:

1. Prevention costs include design reviews, product qualification, drawing checking, engineering quality orientation, quality improvement programs, supplier evaluations, supplier training, specification reviews, process capability studies, tool control, operator training, quality orientations, acceptance planning, quality audits, and preventive maintenance.

2. Appraisal costs include prototype inspection and testing, production specification conformance analysis, supplier surveillance, receiving inspection and testing, product acceptance, process control acceptance, packaging inspection, and status measurement and reporting.

3. Failure costs include consumer affairs, redesign, engineering change orders, purchasing change orders, corrective action costs, rework, scrap, warranties, after-sales service, and product liability (product recalls like those of Toyota and, more recently, GM).

By incorporating these, we should reduce the overall cost of quality for the organization.
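Crosby's arithmetic can be sketched with hypothetical figures: a quality program shifts spending toward prevention and appraisal, and the drop in failure costs more than pays for it. The numbers below are invented for illustration only.

```python
def total_cost_of_quality(prevention, appraisal, internal_failure, external_failure):
    """Total cost of quality = prevention + appraisal + internal and external failure costs."""
    return prevention + appraisal + internal_failure + external_failure

# Before a quality program: little prevention, heavy failure costs (hypothetical figures).
before = total_cost_of_quality(prevention=10_000, appraisal=20_000,
                               internal_failure=150_000, external_failure=300_000)

# After: more is spent on prevention and appraisal, far less on failures.
after = total_cost_of_quality(prevention=60_000, appraisal=40_000,
                              internal_failure=30_000, external_failure=50_000)

print(before)  # 480000
print(after)   # 180000 -- the program more than pays for itself
```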

Taguchi methods

Taguchi methods (Japanese: タグチメソッド) are statistical methods developed by Genichi

Taguchi to improve the quality of manufactured goods, and more recently also applied to

engineering, biotechnology, marketing and advertising. Professional statisticians have

welcomed the goals and improvements brought about by Taguchi methods, particularly by

Taguchi's development of designs for studying variation, but have criticized the inefficiency of some

of Taguchi's proposals.

Taguchi's work includes three principal contributions to statistics:

A specific loss function

The philosophy of off-line quality control; and


Innovations in the design of experiments.

Loss functions

Loss functions in statistical theory

Traditionally, statistical methods have relied on mean-unbiased estimators of treatment effects:

Under the conditions of the Gauss-Markov theorem, least squares estimators have minimum

variance among all mean-unbiased estimators. The emphasis on comparisons of means also draws

(limiting) comfort from the law of large numbers, according to which the sample means converge to

the true mean. Fisher's textbook on the design of experiments emphasized comparisons of

treatment means.

However, loss functions were avoided by Ronald A. Fisher.

Taguchi's use of loss functions

Taguchi knew statistical theory mainly from the followers of Ronald A. Fisher, who also avoided loss

functions. Reacting to Fisher's methods in the design of experiments, Taguchi interpreted Fisher's

methods as being adapted for seeking to improve the mean outcome of a process. Indeed, Fisher's

work had been largely motivated by programmes to compare agricultural yields under different

treatments and blocks, and such experiments were done as part of a long-term programme to

improve harvests.

However, Taguchi realised that in much industrial production, there is a need to produce an

outcome on target, for example, to machine a hole to a specified diameter, or to manufacture

a cell to produce a given voltage. He also realised, as had Walter A. Shewhart and others before

him, that excessive variation lay at the root of poor manufactured quality and that reacting to

individual items inside and outside specification was counterproductive.

He therefore argued that quality engineering should start with an understanding of quality costs in

various situations. In much conventional industrial engineering, the quality costs are simply

represented by the number of items outside specification multiplied by the cost of rework or scrap.

However, Taguchi insisted that manufacturers broaden their horizons to consider cost to society.

Though the short-term costs may simply be those of non-conformance, any item manufactured away

from nominal would result in some loss to the customer or the wider community through early wear-

out; difficulties in interfacing with other parts, themselves probably wide of nominal; or the need to

build in safety margins. These losses are externalities and are usually ignored by manufacturers,

which are more interested in their private costs than social costs. Such externalities prevent markets

from operating efficiently, according to analyses of public economics. Taguchi argued that such

losses would inevitably find their way back to the originating corporation (in an effect similar to

the tragedy of the commons), and that by working to minimise them, manufacturers would enhance

brand reputation, win markets and generate profits.


Such losses are, of course, very small when an item is near to nominal. Donald J.

Wheeler characterised the region within specification limits as where we deny that losses exist. As

we diverge from nominal, losses grow until the point where losses are too great to deny and the

specification limit is drawn. All these losses are, as W. Edwards Deming would describe

them, unknown and unknowable, but Taguchi wanted to find a useful way of representing them

statistically. Taguchi specified three situations:

1. Larger the better (for example, agricultural yield);

2. Smaller the better (for example, carbon dioxide emissions); and

3. On-target, minimum-variation (for example, a mating part in an assembly).

The first two cases are represented by simple monotonic loss functions. In the third case, Taguchi

adopted a squared-error loss function for several reasons:

It is the first "symmetric" term in the Taylor series expansion of real analytic loss-functions.

Total loss is measured by the variance. For uncorrelated random variables, as variance is

additive, the total loss is an additive measurement of cost.

The squared-error loss function is widely used in statistics, following Gauss's use of the

squared-error loss function in justifying the method of least squares.
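The on-target case can be sketched numerically. The figures below are hypothetical (a hole machined to a 10.0 mm nominal, with a $4 rework cost at the ±0.5 mm specification limit), but the quadratic form L(y) = k * (y - nominal)**2 and the calibration k = A / d**2 follow Taguchi's loss function.

```python
def taguchi_loss(y, nominal, k):
    """Taguchi's quadratic loss: L(y) = k * (y - nominal)**2."""
    return k * (y - nominal) ** 2

# Calibrate k so that the loss at the specification limit equals the rework cost:
# k = A / d**2, where A is the cost at the limit and d its distance from nominal.
A, d = 4.0, 0.5          # hypothetical: $4 rework at +/- 0.5 mm
k = A / d ** 2           # 16.0

print(taguchi_loss(10.0, 10.0, k))   # 0.0 -- no loss exactly on target
print(taguchi_loss(10.25, 10.0, k))  # 1.0 -- loss accrues even inside the spec limits
print(taguchi_loss(10.5, 10.0, k))   # 4.0 -- equals the rework cost at the limit
```

Unlike the conventional step function (zero loss inside specification, full rework cost outside), the quadratic assigns a growing loss to every deviation from nominal, which is exactly the "loss to society" argument above.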

Reception of Taguchi's ideas by statisticians

Though many of Taguchi's concerns and conclusions are welcomed by statisticians and economists,

some ideas have been especially criticized. For example, Taguchi's recommendation that industrial

experiments maximise some signal-to-noise ratio (representing the magnitude of the mean of a

process compared to its variation) has been criticized widely.
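The ratio in question, in its "nominal-the-best" form, is SN = 10 * log10(mean**2 / variance): the magnitude of the process mean relative to its variation, in decibels. A sketch with hypothetical run data:

```python
import math

def sn_nominal_the_best(samples):
    """Nominal-the-best signal-to-noise ratio: 10 * log10(mean**2 / variance)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    return 10 * math.log10(mean ** 2 / var)

runs = [9.8, 10.1, 10.0, 9.9, 10.2]   # hypothetical measurements of one design point
print(round(sn_nominal_the_best(runs), 1))  # 36.0 -- higher means less relative variation
```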

Off-line quality control

Taguchi's rule for manufacturing

Taguchi realized that the best opportunity to eliminate variation is during the design of a product and

its manufacturing process. Consequently, he developed a strategy for quality engineering that can

be used in both contexts. The process has three stages:

System design

Parameter (measure) design

Tolerance design

System design

This is design at the conceptual level, involving creativity and innovation.


Parameter design

Once the concept is established, the nominal values of the various dimensions and design

parameters need to be set, the detail design phase of conventional engineering. Taguchi's radical

insight was that the exact choice of values required is under-specified by the performance

requirements of the system. In many circumstances, this allows the parameters to be chosen so as

to minimize the effects on performance arising from variation in manufacture, environment and

cumulative damage. This is sometimes called robustification.

Robust parameter designs consider controllable and uncontrollable noise variables; they seek to

exploit relationships and optimize settings that minimize the effects of the noise variables.

Tolerance design

With a successfully completed parameter design, and an understanding of the effect that the various

parameters have on performance, resources can be focused on reducing and controlling variation in

the critical few dimensions.

Design of experiments

Taguchi developed his experimental theories independently. Taguchi read works following R. A.

Fisher only in 1954. Taguchi's framework for design of experiments is idiosyncratic and often flawed,

but contains much that is of enormous value. He made a number of innovations.

Outer arrays

Taguchi's designs aimed to allow greater understanding of variation than did many of the traditional

designs from the analysis of variance (following Fisher). Taguchi contended that

conventional sampling is inadequate here as there is no way of obtaining a random sample of future

conditions. In Fisher's design of experiments and analysis of variance, experiments aim to reduce

the influence of nuisance factors to allow comparisons of the mean treatment-effects. Variation

becomes even more central in Taguchi's thinking.

Taguchi proposed extending each experiment with an "outer array" (possibly an orthogonal array);

the "outer array" should simulate the random environment in which the product would function. This

is an example of judgmental sampling. Many quality specialists have been using "outer arrays".

Later innovations in outer arrays resulted in "compounded noise." This involves combining a few

noise factors to create two levels in the outer array: First, noise factors that drive output lower, and

second, noise factors that drive output higher. "Compounded noise" simulates the extremes of noise

variation but uses fewer experimental runs than would previous Taguchi designs.
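The crossing of inner and outer arrays described above can be sketched as follows. The factors and levels are hypothetical, and a full factorial stands in for Taguchi's orthogonal arrays; the point is only the structure: every control setting is exposed to every (compounded) noise condition.

```python
from itertools import product

# Inner array: all settings of 2 two-level control factors (a full factorial
# stands in here for an orthogonal array).
inner = list(product([0, 1], repeat=2))       # 4 control-factor runs

# Outer array with "compounded noise": one extreme driving output low,
# one driving it high, instead of a full noise-factor array.
outer = [("noise_low",), ("noise_high",)]

# Cross them: every control run is repeated under every noise condition,
# so the robustness of each control setting to noise can be compared.
experiment = [(ctrl, noise) for ctrl in inner for noise in outer]
print(len(experiment))  # 8 runs: 4 inner x 2 outer
```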


Management of interactions

Interactions, as treated by Taguchi

Many of the orthogonal arrays that Taguchi has advocated are saturated arrays, allowing no scope

for estimation of interactions. This is a continuing topic of controversy. However, this is only true for

"control factors" or factors in the "inner array". By combining an inner array of control factors with an

outer array of "noise factors", Taguchi's approach provides "full information" on control-by-noise

interactions, it is claimed. Taguchi argues that such interactions have the greatest importance in

achieving a design that is robust to noise factor variation. The Taguchi approach provides more

complete interaction information than typical fractional factorial designs, its adherents claim.

Followers of Taguchi argue that the designs offer rapid results and that interactions can be

eliminated by proper choice of quality characteristics. That notwithstanding, a "confirmation

experiment" offers protection against any residual interactions. If the quality characteristic

represents the energy transformation of the system, then the "likelihood" of control factor-by-

control factor interactions is greatly reduced, since "energy" is "additive".

Inefficiencies of Taguchi's designs

Interactions are part of the real world. In Taguchi's arrays, interactions are confounded and

difficult to resolve.

Statisticians in response surface methodology (RSM) advocate the "sequential assembly"

of designs: In the RSM approach, a screening design is followed by a "follow-up design" that

resolves only the confounded interactions judged worth resolution. A second follow-up design may

be added (time and resources allowing) to explore possible high-order univariate effects of the

remaining variables, as high-order univariate effects are less likely in variables already eliminated for

having no linear effect. With the economy of screening designs and the flexibility of follow-up

designs, sequential designs have great statistical efficiency. The sequential designs of response

surface methodology require far fewer experimental runs than would a sequence of Taguchi's

designs.

Analysis of experiments

Taguchi introduced many methods for analysing experimental results including novel applications of

the analysis of variance and minute analysis.

Assessment

Genichi Taguchi has made valuable contributions to statistics and engineering. His emphasis on loss

to society, techniques for investigating variation in experiments, and his overall strategy of system,

parameter and tolerance design have been influential in improving manufactured quality worldwide.


Although some of the statistical aspects of the Taguchi methods are disputable, there is no dispute

that they are widely applied to various processes. A quick search in related journals, as well as the

World Wide Web, reveals that the method is being successfully implemented in diverse areas, such

as the design of VLSI; optimization of communication and information networks; development of

electronic circuits; laser engraving of photomasks; cash-flow optimization in banking; government

policymaking; runway utilization improvement in airports; and even robust eco-design.

What roles do operations managers play in addressing the major aspects of service quality?

Service quality is a comparison of expectations with performance.

A business with high service quality will meet customer needs whilst remaining economically competitive. Improved service quality may increase economic competitiveness. 

This aim may be achieved by understanding and improving operational processes; identifying problems quickly and systematically; establishing valid and reliable service performance measures and measuring customer satisfaction and other performance outcomes

What roles do operations managers play in addressing the major aspects of service quality?

Thanks for helping me with this! Oh, please include references, too.

Solution Preview

ROLES OF OPERATIONS MANAGERS IN SERVICE ORGANIZATIONS

The operations manager is in charge of the design, operation, and control of the processes that transform resources into finished goods or services. He is therefore involved in transforming inputs into outputs for the success of the company. The inputs include people, technology, capital, equipment, materials, information, and others. The outputs are products, services, or both.

According to Robbins (2005), decisions made in relation to operations management issues determine the likelihood of success of an organization. Operations management encompasses both manufacturing and service organizations, which indicates the importance of the operations manager's role. He sees to it that inputs are of good quality in the first place. Human resources are carefully selected to ensure that they possess the knowledge and skills

What are the 10 determinants of service quality?

Determinants of Service Quality

 


1.  Reliability: Consistency of performance and dependability. Many of the factors promoting reliability are common to overall success. We employ backup systems and personnel to ensure that an adequate supply of workers is available to complete the job.

           

2.        Responsiveness: Willingness and readiness to perform services. Our managers, supervisors and all personnel are encouraged to work under a "spirit of service". We understand that our customers would not need our service if they never had problems. We teach our personnel to understand and appreciate the term "job security".

 

3.        Competence: Possession of skills and knowledge to perform. Our management team benefits from some of the most knowledgeable resources in the business. Our cleaning experience is unsurpassed! We are confident that there is no cleaning situation that we cannot manage!

 

4.        Understanding: Knowing the customer's needs and requirements. We know how to listen.

 

5.        Access: Approachability and ease of access to management. As owner-managers, Bobby Blain and Scott Patterson interact daily with customers and operations. We return phone calls.

 

6.        Communication:   Providing the customer with effective information. We retrieve a huge amount of information from our operations’ personnel. We have a number of effective means for passing this information on to our customers.

 


7.        Courtesy:  Friendliness of personnel and ownership. We know how to handle complaints. We strive to be “peacemakers not troublemakers”. We find answers, not excuses.

 

8.        Credibility:  Trust and personal characteristics of personnel. We have experienced recruiters who ask pertinent questions, check references and conduct background checks on all new hires.

 

9.        Security:  Safety, financial security, and confidentiality.

 

10.    Tangibles: Physical evidence of service. Reports, inspections. We want our customers to know what we are doing for them, so we have a very elaborate communications system that includes a bar code inspection program, a verbal report translatable into e-mail (for an early morning breakdown of evening activity), and a computerized schedule process, so that our customers know when to expect specific work items.

 

According to surveys, customers expect quality to be delivered at three distinct levels.

Customer Performance Levels: "We can do what we say!"

 

1.       The implicit or "expected" level of cleaning must be determined at the point of sale. Indeed, within our system, "quality begins at the point of sale"! An accurate assessment and bid are an absolute necessity. We are among the most experienced janitorial bid constructors in the nation. We use a custom-designed, experience-based spreadsheet to prepare all bids. Mr. Blain has consulted on large bids throughout the country (most recently for the Austin Bergstrom International Airport).

 


2.      Explicit or negotiated value extras require the experience of a highly professional bidder. You can trust our company to recommend only those services that are beneficial to the building owner.

 

3.      Latent: Unexpected service and performance. Providing latent, unexpected service or performance is a main goal of our companies. We want long-term relationships with few complaints, and we are willing to spend the time and money necessary to ensure those characteristics.

 

Response to Complaints: "We want to be part of the solution, not part of the problem."

      

1.      When corrective actions do not meet customers' expectations, the problem is exacerbated. We make our customers feel comfortable when they are making us aware of a problem. We strive not to be defensive. We want to be problem solvers, not problems.

 

2.      Corrective actions must at least meet the customer's expectations, so that the problem is neutralized.

 

3.      Corrective actions should ideally exceed customer expectations, so that the problem is converted into a positive experience. Our main goal is to exceed customer expectations at every location.


1. How does Darden build quality into the supply chain?


Darden built a systematic program called "Farm to Fork" to create a comprehensive quality program. It addresses:

The quality of the food source, through evaluation, selection, and development of sources

Commitment along the processing and logistics chain

The standards of delivery, preparation, and serving at the restaurant

Hygiene, and the monitoring of both hot and cold temperature standards, along each step to delivery to the final customer's fork

2. Select two potential problems, one in the Darden supply chain and one in the restaurant, that can be analyzed with a fish-bone chart. Draw a complete chart to deal with each problem.

In the restaurant, one such problem is employees getting to work late.

3. Darden applies SPC to many product attributes. Identify where these are probably used.

Three of the SPC charts used are:

1. X-bar charts, measuring the average weight of food products

2. R charts, showing the range between the heaviest and the lightest of the individual products

3. A capability histogram, showing the distribution of the weights

These charts are also used to document precooked food weights; meat, seafood and poultry temperatures; blemishes on produce; and bacteria counts on shrimp.

4. The SPC chart on page 224 illustrates Darden's use of control charts to monitor the weight of salmon fillets. Given these data, what conclusion do you, as a Darden quality control inspector, draw? What report do you issue to your supervisor? How do you respond to the salmon vendor?

The sample means are within the UCL and LCL, as indicated on the X-bar chart. All the sample means are close to the mean of the sample means except sample 11, whose mean lies on the LCL.

The R chart indicates that sample 11 has a range/dispersion outside the control limits, meaning the process (as measured by the weight of the salmon fillets) is out of control/not stable.

The process is capable, since the sample weights lie between the specification limits, as indicated by the capability histogram. Although the process sigma is 5.31 now, the aim is always 6.
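As an illustration of the X-bar and R charts mentioned in the answer, the control limits can be computed as below. The fillet weights are hypothetical, not Darden's data; A2, D3 and D4 are the standard tabulated control-chart constants for subgroups of size n = 4.

```python
# Hypothetical subgroups of 4 salmon-fillet weights (ounces).
samples = [
    [12.1, 12.3, 11.9, 12.0],
    [12.2, 12.0, 12.1, 12.3],
    [11.8, 12.2, 12.0, 12.1],
]
A2, D3, D4 = 0.729, 0.0, 2.282   # standard constants for subgroup size n = 4

xbars = [sum(s) / len(s) for s in samples]    # subgroup means (X-bar chart points)
ranges = [max(s) - min(s) for s in samples]   # subgroup ranges (R chart points)
xbarbar = sum(xbars) / len(xbars)             # grand mean (X-bar centre line)
rbar = sum(ranges) / len(ranges)              # mean range (R centre line)

ucl_x, lcl_x = xbarbar + A2 * rbar, xbarbar - A2 * rbar   # X-bar chart limits
ucl_r, lcl_r = D4 * rbar, D3 * rbar                        # R chart limits

print(round(xbarbar, 3), round(rbar, 3))   # centre lines
print(round(lcl_x, 3), round(ucl_x, 3))    # a point outside these is out of control
```

A subgroup like Darden's sample 11, with its range above ucl_r, would flag the process as out of control on the R chart even though its mean stayed inside the X-bar limits.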
