
Jul 31, 2010

Project Management

Project management brings structure to any project. Although its day-to-day focus is on schedule, tasks and quality, it is also framed around three broad themes: time, finance and scope.

So a project must meet all quality controls (customer satisfaction), be delivered or completed on time, stay within the agreed financial package and remain within the agreed scope, so that the final result satisfies the customer or client.

A project is a piece of work with a time framework: it starts, runs and then comes to completion. Managing that process in a formal way is project management.

The Project Team

Although many projects have a project manager who deals with the day-to-day management, a variety of team players from various disciplines also need to work together to make the process work. This team also has to implement a risk strategy that effectively minimises any risks associated with the project.

So a project that involves the construction of a building will have a project manager who then organises the project team. This team will include ‘stakeholders’ who will own the building once it is in use, as well as the people designing the building and the people constructing it.

The project team is therefore disparate, but all members have an interest in ensuring that the project is delivered on time, with the project manager having a pivotal role in driving forward the project and ensuring that all team members work together, achieve agreed goals and communicate effectively.

Role of Steering Group

Some projects, although by no means all, also have a Steering Group to help drive the process forward. The project team reports to the Steering Group on a regular basis, and the Steering Group has overall management responsibility for ensuring that the project is managed effectively and delivers successfully.

Having a Steering Group helps give structure to the project management and ensures that all deadlines are met while the process is underway.

Projects Requiring Project Management

Although huge contracts, such as the building of stadiums and accommodation for the Olympics, obviously require very formal project management, the process is used throughout business and even within the voluntary and public sectors.

Because project management formalises the process of completing a project successfully, a formally managed project carries fewer risks, and that is why it is such a valuable process. A project that is subject to project management is more likely to succeed and less likely to go well over budget!

So any project, whether it is building the Olympic village, or opening a new school or a takeaway on a street corner actually requires project management to ensure that it opens on time and within budget and that it is the best that it can be. That is the whole philosophy of project management!




Jul 26, 2010

Theory of Constraints (TOC)

The Theory of Constraints is a system improvement tool. It states that every system has one goal, which is achieved through many linked processes, and that among those processes one acts as a bottleneck. The theory is used to remove that bottleneck and ultimately achieve higher productivity.

These bottlenecks are easily understood through the example of an assembly operation. Suppose a line has several stations, each with a different assembly time, for example 40, 50 and 60 minutes. The station with the longest operation time is considered the bottleneck station, because a queue builds up in front of it and it ultimately limits the productivity of the whole system.

The concept of the Theory of Constraints was introduced by Eliyahu Moshe Goldratt, a physicist who worked on production system design.
There are many benefits to applying TOC, for example:

• Enhanced ability to update systems and processes, ultimately winning better profit margins and more business
• Fewer system problems
• Reduced production lead times on the shop floor
• Reduced lead times for service organizations, such as hospitals
• Less inventory, especially work-in-process inventory
• On-time delivery of products and services
• Employees who are motivated and committed to problem solving
• Enhanced competitiveness
• Reduced production-related variable costs
• Better understanding of processes
• Improved communication between different directorates
Applying TOC in any organization requires the following steps:

• Identify the constraint
• Decide how to exploit the constraint
• Subordinate everything else to the constraint
• Elevate the performance of the constraint
• Go back to step one and tackle the next bottleneck
This five-step strategy is used as a generic solution for problems in assembly lines, design and development, manufacturing, decision making, supply chain management, health care and more.
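As a minimal sketch of the first step, the bottleneck of a simple serial line can be found from the station cycle times. The station names and times below are hypothetical, and the throughput calculation assumes a purely sequential flow:

```python
# Hypothetical serial assembly line: station name -> cycle time in minutes.
stations = {"fit frame": 40, "wire harness": 50, "final test": 60}

# Step 1 of TOC: the constraint is the station with the longest cycle time.
bottleneck = max(stations, key=stations.get)

# In a serial line the bottleneck sets the pace: throughput is limited to
# one unit per bottleneck cycle, however fast the other stations run.
units_per_hour = 60 / stations[bottleneck]

print(f"Bottleneck station: {bottleneck} ({stations[bottleneck]} min/unit)")
print(f"System throughput limit: {units_per_hour:.2f} units/hour")
```

With these made-up figures the "final test" station limits the line to one unit per hour, so exploiting and elevating that station is where the TOC steps would focus first.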

In assembly lines, parts arrive on a conveyor and spend some time at each station being assembled. The station with the longest working time is considered the bottleneck station, and using TOC the problem can be addressed with different solutions.

In the design and development phase, products are designed using different tools, and testing reveals different problems. TOC helps to identify the problematic areas and then provides solutions for them in sequence.

In manufacturing, products are made using different technologies, and sometimes one workstation or technology creates problems. TOC is very useful for solving such manufacturing problems.

In healthcare, patients are treated in several places. During treatment a patient has to pass through different rooms, such as the emergency room, the medical check-up room, the pathology test room, the operating theatre and even reception. One of these areas takes more time than the rest, so patients ultimately have to wait longer. If that area of patient care is streamlined using TOC, the whole system improves in efficiency and productivity.

The Theory of Constraints is a good tool and it should be applied to each discipline in a systematic manner. Once good results are obtained, the next bottleneck stations should be identified and addressed. Importantly, the tool must be applied as a continuous process.



Jul 23, 2010

Where did the name "Six Sigma" come from?

Two questions have dominated the field of Six Sigma. The first can be summarized by the global question: “Why 6σ and not some other level of capability?” The second inquiry is more molecular: “Where does the 1.5σ shift factor come from, and why 1.5 rather than some other magnitude?” For details on this subject, see Harry, M. J., Resolving the Mysteries of Six Sigma: Statistical Constructs and Engineering Rationale, First Edition, 2003. Until then, we will consider the following thumbnail sketch.

At the onset of six sigma in 1985, this writer was working as an engineer for the Government Electronics Group of Motorola. By chance connection, I linked up with another engineer by the name of Bill Smith (originator of the six sigma concept in 1984). At that time, he suggested Motorola should require 50 percent design margins for all of its key product performance specifications. Statistically speaking, such a "safety margin" is equivalent to a 6 sigma level of capability.

When considering the performance tolerance of a critical design feature, he believed a 25 percent “cushion” was not sufficient for absorbing a sudden shift in process centering. Bill believed the typical shift was on the order of 1.5σ (relative to the target value). In other words, a four sigma level of capability would normally be considered sufficient, if centered. However, if the process center was somehow knocked off its central location (on the order of 1.5σ), the initial capability of 4σ would be degraded to 4.0σ − 1.5σ = 2.5σ. Of course, this would have a consequential impact on defects. In turn, a sudden increase in defects would have an adverse effect on reliability. As should be apparent, such a domino effect would continue straight up the value chain.

Regardless of the shift magnitude, those of us working this issue fully recognized that the initial estimate of capability will often erode over time in a “very natural way” – thereby increasing the expected rate of product defects (when considering a protracted period of production). Extending beyond this, we concluded that the product defect rate was highly correlated to the long-term process capability, not the short-term capability. Of course, such conclusions were predicated on the statistical analysis of empirical data gathered on a wide array of electronic devices.

Thus, we came to understand three things. First, we recognized that the instantaneous reproducibility of a critical-to-quality characteristic is fully dependent on the “goodness of fit” between the operating bandwidth of the process and the corresponding bandwidth of the performance specification. Second, the quality of that interface can be substantively and consequentially disturbed by process centering error. Of course, both of these factors profoundly impact long-term capability. Third, we must seek to qualify our critical processes at a 6σ level of short-term capability if we are to enjoy a long-term capability of 4.5σ.
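As a rough numerical sketch of the shift idea, the standard normal model can translate a short-term sigma level into an approximate long-term defect rate once the conventional 1.5σ shift is applied. The helper below is an illustration under those assumptions (one-sided tail only, using SciPy), not part of the original account:

```python
from scipy.stats import norm

def dpmo(sigma_level, shift=1.5):
    """Approximate defects per million opportunities for a one-sided spec
    limit placed sigma_level standard deviations from the short-term mean,
    after the process centre drifts by `shift` standard deviations toward
    it. The opposite tail is ignored because it is negligible here."""
    return norm.sf(sigma_level - shift) * 1_000_000

for level in (3, 4, 5, 6):
    print(f"{level} sigma short-term -> ~{dpmo(level):,.1f} DPMO long-term")

# 6 sigma short-term with a 1.5 sigma shift gives the familiar ~3.4 DPMO.
```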

By further developing these insights through applied research, we were able to greatly extend our understanding of the many statistical connections between such things as design margin, process capability, defects, field reliability, customer satisfaction, and economic success.




Jul 20, 2010

Lean Six Sigma Methodologies

Lean Six Sigma methodologies focus heavily on waste. Waste causes a business to spend too much money on labor and duplicates effort, which slows down production; it also includes unnecessary purchases and processes. Someone who goes through Lean Six Sigma certification courses learns how to identify waste and reduce it as much as possible. This can help a business not only cut costs but also improve employee productivity.

Bottlenecks occur and slow a business process down significantly. When employees attend Lean Six Sigma certification courses, they are taught how to identify bottlenecks everywhere from the organization as a whole down to specific processes. Being able to identify bottlenecks gives your company the opportunity to eliminate them and maximize productivity. Lean Six Sigma courses also provide excellent training in thinking critically and coming up with creative solutions to problems.

Customer satisfaction is another focus of Lean Six Sigma methodologies. Businesses that have noticed a large loss of customers often send their staff to Lean Six Sigma certification courses to turn things around. Lean Six Sigma training helps employees understand and identify the needs of a customer and how to meet those needs better. This helps a business build better customer relationships and increase its customer base.

Lean Six Sigma methodologies are designed to improve business processes in many ways. Your business can benefit by eliminating waste, removing the bottlenecks that slow down productivity, and improving customer satisfaction. The courses can be taken online or in a classroom, and they benefit not only the employee but also your business. The result is better productivity and, in the end, higher revenues.




Jul 17, 2010

The History of Six Sigma

Six Sigma has evolved over time. The concepts behind Six Sigma (problem-solving techniques, the 7 basic tools and so on) can be traced through the centuries as the method took shape into what it is today.

The roots of Six Sigma as a measurement standard can be traced back to Carl Friedrich Gauss (1777-1855), who introduced the concept of the normal curve. Six Sigma as a measurement standard for product variation can be traced back to the 1920s, when Walter Shewhart showed that three sigma from the mean is the point where a process requires correction. Many measurement standards (Cpk, Zero Defects, ppm, etc.) later came on the scene, but credit for coining the term "Six Sigma" goes to a Motorola engineer named Bill Smith. (Incidentally, "Six Sigma" is a federally registered trademark of Motorola.)

In the early and mid-1980s, with Chairman Bob Galvin at the helm, Motorola engineers decided that the traditional quality levels -- measuring defects in thousands of opportunities -- didn't provide enough granularity. Instead, they wanted to measure defects per million opportunities. Motorola developed this new standard and created the methodology and the cultural change needed to go with it. Six Sigma helped Motorola realize powerful bottom-line results -- in fact, the company documented more than $16 billion in savings as a result of its Six Sigma efforts.

Since then, hundreds of companies around the world have adopted Six Sigma as a way of doing business. This is a direct result of many of America's business leaders openly praising its benefits, leaders such as Larry Bossidy of Allied Signal (now Honeywell) and Jack Welch of General Electric Company. Rumor has it that Larry and Jack were playing golf one day and Jack bet Larry that he could implement Six Sigma faster and with greater results at GE than Larry had at Allied Signal. The results speak for themselves.

Six Sigma has evolved over time. It's more than just a quality system like TQM or ISO; it's a way of doing business that reduces process variation. As Geoff Tennant describes in his book Six Sigma: SPC and TQM in Manufacturing and Services: "Six Sigma is many things, and it would perhaps be easier to list all the things that Six Sigma quality is not. Six Sigma can be seen as: a vision; a philosophy; a symbol; a technique; a metric; a goal; a methodology." We couldn't agree more.



Jul 12, 2010

Control Chart

A control chart is a major tool among the 7 basic quality control tools. It may also be called the Shewhart chart or the process-behaviour chart, but they are all the same thing: a control chart.

The control chart is a means of showing whether or not a process is stable; when it is stable, it is under control. However, when a process is repeated over and over again, there may be times when it does not repeat exactly, and the control chart is a form of visual management that displays the results.

Processes may have slight variations and the control chart can show whether or not the variances derive from sources that are in essence common to the process. If the chart shows that the process is not stable, then the chart can help to pinpoint the causes and sources of the variances, so that they can be eliminated.

The control chart has three horizontal lines, namely the Upper Control Limit (UCL), the Average line (often called the mean) and the Lower Control Limit (LCL).

The Average line, the mean, is the central line. The other two lines sit three standard deviations above and below the Average line. Obviously, the more stable a process is, the closer its points will sit to the Average line.
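As a minimal sketch of those three lines for individual measurements, assuming the process spread is estimated directly from the sample standard deviation (real charts usually estimate it from moving ranges or subgroup ranges), with made-up data:

```python
import statistics

# Hypothetical individual measurements from a repeated process.
measurements = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7, 10.0, 10.2]

mean = statistics.mean(measurements)    # the Average (centre) line
sigma = statistics.stdev(measurements)  # rough estimate of process spread

ucl = mean + 3 * sigma                  # Upper Control Limit
lcl = mean - 3 * sigma                  # Lower Control Limit

print(f"LCL = {lcl:.3f}, Average = {mean:.3f}, UCL = {ucl:.3f}")

# Any point falling outside LCL..UCL signals that the process may no
# longer be stable and the cause of the variation should be investigated.
```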

Calculating the Control Limits

Within the chart there has to be a method of calculating the control limits. Variables such as time, weight or even voltage can be used; if the question ‘How much?’ is asked, then a variable is being measured. Any non-variable measurements are referred to as attributes, and these are counts of an item, such as the number of defective goods.

However, a point shown on a control chart is usually the average, or mean, of a set of measurements. This is done for statistical reasons, to predict the distribution, but it also enables much tighter control limits to be achieved. Averaging smooths out any individual high or low measurements, which allows the control chart to reflect even very small changes in a given process. If the points were all individual measurements, there would be too many to present a realistic picture and it would be incredibly hard to pinpoint tiny changes accurately.

Tracking Variables

When tracking variables, it is necessary to use two control charts, simply because if only the averages of the subgroups were charted, a really significant variation within the subgroups could easily be missed. So, to track the variation accurately, two control charts are used: one for the subgroup averages and one for the subgroup ranges.
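A small sketch of the two-chart idea for subgroups of five: the subgroup averages feed an X-bar chart and the subgroup ranges feed an R chart. The constants A2, D3 and D4 are the standard tabulated Shewhart values for a subgroup size of 5; the measurements themselves are made up:

```python
# Hypothetical subgroups of 5 measurements each, taken at regular intervals.
subgroups = [
    [10.1, 9.8, 10.3, 10.0, 9.9],
    [10.2, 10.1, 9.7, 10.0, 10.2],
    [9.9, 10.0, 10.4, 10.1, 9.8],
]

xbars = [sum(s) / len(s) for s in subgroups]   # subgroup averages
ranges = [max(s) - min(s) for s in subgroups]  # subgroup ranges

xbar_bar = sum(xbars) / len(xbars)             # grand average
r_bar = sum(ranges) / len(ranges)              # average range

# Standard Shewhart constants for subgroup size n = 5.
A2, D3, D4 = 0.577, 0.0, 2.114

# X-bar chart tracks shifts in the process centre.
xbar_ucl = xbar_bar + A2 * r_bar
xbar_lcl = xbar_bar - A2 * r_bar

# R chart tracks changes in variation within subgroups.
r_ucl = D4 * r_bar
r_lcl = D3 * r_bar

print(f"X-bar chart: LCL={xbar_lcl:.3f}, CL={xbar_bar:.3f}, UCL={xbar_ucl:.3f}")
print(f"R chart:     LCL={r_lcl:.3f}, CL={r_bar:.3f}, UCL={r_ucl:.3f}")
```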

Use of the Control Chart

The control chart is an exceptionally useful tool, particularly in manufacturing, simply because it enables manufacturers to analyse objectively the behaviour of any process. Although we may think that creating a ‘widget’ 1,000 times an hour in exactly the same way will lead to the exact process being repeated, there will be slight variations. These may be slight, but it is important to chart them so that they can be kept within ‘control’, because if the process deviates beyond the control limits, the machines could be affected, the defect rate could rise and so on.

In essence, the control chart helps keep process variation exceptionally low and picks up on any changes very quickly. This ensures that potential problems can be solved before they become real problems, which is the goal of any quality tool and one that the control chart achieves exceptionally well!



Jul 8, 2010

The 5 Whys Technique

The 5 Whys technique is an important problem-solving technique that evolved from the Toyota Production System. It is used on all kinds of problems, not just manufacturing ones.

The beauty of the 5 Whys technique is that it is incredibly straightforward and easily applied to problems. It is also frighteningly simple.

The technique is to take a problem and look at it in reverse, i.e. start from the end product or result, work backwards, and at each stage ask the question “Why?” or, if that is not appropriate, “What caused this problem?”
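As a trivial sketch of that loop, the chain of “Why?” questions can be modelled as repeated lookups into a causal chain. The five_whys helper, the problem statement and the causes below are hypothetical; in practice each answer comes from investigation rather than a lookup table:

```python
def five_whys(problem, ask_why, max_depth=5):
    """Walk a chain of causes by repeatedly asking 'Why?', stopping after
    max_depth questions or when no deeper cause can be identified."""
    chain = [problem]
    for _ in range(max_depth):
        cause = ask_why(chain[-1])
        if cause is None:  # no deeper cause found: stop here
            break
        chain.append(cause)
    return chain

# Hypothetical causal chain used in place of a real investigation.
known_causes = {
    "customer received faulty goods": "goods were not inspected",
    "goods were not inspected": "supervisors skipped quality control",
    "supervisors skipped quality control": "machines broke down under heavy load",
    "machines broke down under heavy load": "routine maintenance was postponed",
}

chain = five_whys("customer received faulty goods", known_causes.get)
for step, cause in enumerate(chain):
    label = "Problem" if step == 0 else f"Why {step}"
    print(f"{label}: {cause}")
```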

Example Of the 5 Whys Technique

The best way to show how the technique can and should be applied is with an example.

In a scenario where a customer has rung up to complain that some of the items they ordered were faulty, the 5 Whys can be easily applied.

The end result is that the customer was supplied with faulty goods and is unhappy with the standard of service that we provided.

1. Why did the customer get faulty goods?

Obviously some of the goods that were produced were not inspected for defects.

2. Why were the goods not all inspected to ensure that they were fit for purpose and could pass quality control?

Supervisors have not been applying quality control methods.

3. Why have supervisors not been applying these methods?

Because of the pressure of meeting large orders, and with some of the machinery ‘down’ after breaking down, supervisors have been under enormous pressure and have not been able to perform satisfactorily.

4. Why did some of the machines break down?

The routine maintenance that was scheduled for two months ago did not happen and as a result, some of the machines were not in perfect working order.

5. Why was the routine maintenance cancelled?

Due to suddenly receiving several large orders, the decision was taken that routine maintenance could be put back by three months.

This information flow shows up three important issues:

1. There are peaks in production which means that at times the supervisors are too busy and that routine maintenance is not being carried out.
2. During peak times it is obvious that the supervisors are working flat out and whilst this may enable deadlines and timescales to be met, it means that quality control is suffering.
3. The routine maintenance is obviously required to ensure that the machines are in good working order and will not suddenly break down.

The obvious answer to these problems is to try to level out demand in production, so that there are fewer really busy periods and supervisors are under less pressure.

More supervisors should be appointed to ensure that quality control standards can be met.

Routine or preventative maintenance should be carried out to keep the machines operational and reduce the amount of ‘downtime’ due to machines being broken.

Thus the technique of simply solving a problem by looking at it and constantly asking “Why?” is in fact a very effective one.

Usually it is limited to 5 Whys because, if you have asked “Why?” five times and are nowhere near finding an answer, the problem is likely too complex to be solved by this technique.

However, the technique is speedy, and even if the problem is too complex and requires a more in-depth analysis, hardly any time is wasted in at least trying the 5 Whys approach. It really is remarkably good, and there are in fact a lot of problems that can be solved simply by repeatedly asking the pertinent question “Why?”




Jul 2, 2010

SIPOC diagram

When starting an improvement project, many make the mistake of not understanding the end-to-end process, which can result in a failure to understand all the issues and to develop robust improvement plans. A SIPOC diagram is a map that pulls together input and output information about the business process.

When to use a SIPOC diagram

A SIPOC diagram should be used during the first stages of an improvement program as a means of capturing sufficient detail to be able to convey the process.

What does SIPOC stand for?

Supplier – significant suppliers to the process
Input – inputs to the process, including resources such as materials and personnel
Process – the process flow map
Output – key outputs; these could be widgets, data, reports, metrics, etc.
Customer – the customers of the process; include anyone who receives one of the outputs listed above.

How to Build a SIPOC diagram

SIPOC diagrams are often born out of brainstorming sessions: the improvement team will typically start with the process map and work outwards, brainstorming key inputs and outputs, customers and suppliers. Elements of each part of the SIPOC diagram can then be prioritized.
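As a minimal sketch, the output of such a brainstorming session can be captured as a simple structured record. The SIPOC class and the order-fulfilment entries below are hypothetical placeholders, not a prescribed format:

```python
from dataclasses import dataclass, field

@dataclass
class SIPOC:
    """One list per heading of the diagram: Suppliers, Inputs, Process,
    Outputs and Customers, each captured as free-form brainstormed items."""
    suppliers: list = field(default_factory=list)
    inputs: list = field(default_factory=list)
    process: list = field(default_factory=list)  # high-level process steps
    outputs: list = field(default_factory=list)
    customers: list = field(default_factory=list)

# Hypothetical example from a brainstorming session.
order_fulfilment = SIPOC(
    suppliers=["component vendor", "warehouse team"],
    inputs=["customer order", "raw components", "packing materials"],
    process=["receive order", "pick items", "pack", "dispatch"],
    outputs=["shipped parcel", "dispatch confirmation", "delivery metrics"],
    customers=["end customer", "customer services team"],
)

for heading in ("suppliers", "inputs", "process", "outputs", "customers"):
    print(f"{heading.title():<10}: {', '.join(getattr(order_fulfilment, heading))}")
```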

SIPOC diagrams are beneficial because they:

• are quick and easy to produce – they require no specialist tools
• require a team approach – bringing together stakeholders from the process
• present the process and key issues/elements in an easy-to-understand way
• provide an easy method of reviewing the process



Jul 1, 2010

The Quality Loss Function

The quality loss function (QLF) is a quality management tool. In traditional quality management systems a standard is set, and anything that does not meet that standard is seen as a defect and rejected.

But in QLF thinking, not all defects are equal; there are different types of defect, depending on their impact. Some defects can even be seen as unimportant; what matters are defects in the characteristics that affect the performance of a product or the customer’s level of satisfaction.

QLF in Practice

QLF assigns a financial or monetary value to the dissatisfaction that customers experience when a product fails to perform.

It also assigns a financial or monetary value to the escalating costs that are incurred as product performance deviates from the target performance.

The secret of QLF’s appeal is that it embraces the concept of using design, robust design in fact, to build in quality; quality is not achieved by inspection, but built into the product.

Principles Behind QLF

The Japanese electrical engineer Genichi Taguchi, who developed the QLF, asserted that there are four golden rules when it comes to quality:

1. Costs cannot be reduced without quality being affected (usually detrimentally).
2. Quality can be improved without costs increasing, if it is done correctly.
3. Overall costs can be reduced by improving the quality of a product.
4. Costs can be reduced by decreasing any variations from the norm. When variations are eliminated both quality and performance will be automatically improved.

Hence it makes sense to ensure that design issues are addressed, so that quality becomes inherent in a product.

The purpose of the QLF is to work with designers to ascertain the factors or characteristics that affect the performance of a product. Similarly, any unimportant characteristics can be identified. The aim is therefore to reduce variation in the important characteristics, while variation can be accepted in characteristics that are not important.

Assessing QLF

There are three general approaches to QLF for assessing quality.

Nominal Is Better: under this assessment the aim is to have a product that is as near as possible to the target value. In a sense it does not matter whether any variation is above or below the target value; what matters is stopping variation.

Larger Is Better or Smaller Is Better:

These two approaches attach a value to some characteristic. In the larger-is-better approach the company seeks a higher value of the characteristic (and the reverse in smaller-is-better). Keeping variation low is desirable in either approach, because deviation from the desired value of the characteristic leads to increased losses.
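As a sketch of the arithmetic behind these three approaches, the loss is conventionally modelled as a quadratic function of the deviation: L(y) = k(y − target)² for nominal-is-best, k·y² for smaller-is-better and k/y² for larger-is-better, where k is a cost constant calibrated from the loss at a known deviation. The calibration figures below are hypothetical:

```python
def nominal_is_best_loss(y, target, k):
    """Monetary loss grows with the square of the deviation from target."""
    return k * (y - target) ** 2

def smaller_is_better_loss(y, k):
    """Any positive value of the characteristic incurs loss; zero is ideal."""
    return k * y ** 2

def larger_is_better_loss(y, k):
    """Loss shrinks as the characteristic grows, modelled as k / y squared."""
    return k / y ** 2

# Hypothetical calibration: a deviation of 0.5 mm from target is known to
# cost 20 in rework, so k = 20 / 0.5**2 = 80 per mm squared.
k = 20 / 0.5 ** 2
for measured in (10.0, 10.2, 10.5, 11.0):
    loss = nominal_is_best_loss(measured, target=10.0, k=k)
    print(f"y = {measured:4.1f} mm -> loss = {loss:6.2f}")
```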

QLF and Costs

QLF actually reduces costs, because it enables designers and managers to make financial decisions during the design process and ensures that the product is designed robustly, so quality is inherent.
QLF also moves the average of the distribution nearer to the target value, so the target value becomes more achievable.

Through reducing any variances, costs are kept minimal, so it is a very effective tool to ensure that quality issues are addressed.

Use of QLF

QLF is increasingly being used by manufacturing companies that are keen to ensure they can achieve target values and maintain quality throughout the production process.
It can give companies a good competitive edge and, in light of the recent economic downturn, it is seen as important for keeping defects to a minimum, thereby reducing waste while keeping customer satisfaction high.


