This lean Six Sigma book of knowledge table of contents summarizes the topics of a 1100+ page book, which follows and provides the details for executing the enhanced Integrated Enterprise Excellence (IEE) business management system’s lean Six Sigma Define-Measure-Analyze-Improve-Control (DMAIC) roadmap for improvement projects (projects that **benefit the big picture**). The book title is *Integrated Enterprise Excellence Volume III – Improvement Project Execution: A Management and Black Belt Guide for Going Beyond Lean Six Sigma and the Balanced Scorecard*.

This lean Six Sigma book of knowledge is useful both for teaching lean Six Sigma classes and as a before-and-after-training reference for students. **An instructor guide for teaching from this book is available.**

The book is divided into parts for each of the Lean Six Sigma DMAIC (Define-Measure-Analyze-Improve-Control) phases. The first part of the book describes a system for project selection using an Enterprise DMAIC (E-DMAIC) roadmap approach, which is described in more detail in IEE Volume II of this series. Part II initiates the step-by-step details of the Project DMAIC (P-DMAIC) roadmap.

## Lean Six Sigma Book of Knowledge Table of Contents

**Part I Integrated Enterprise Excellence (IEE) Management System and E-DMAIC**

**1. Background**

- Messages in Volume 2 and Part I of this volume
- Messages in Part II – Part VI of this volume
- Volume layout
- The IEE System
- Six Sigma and Lean Six Sigma
- Traditional performance metrics can stimulate the wrong behavior
- Characteristics of a good metric
- Traditional scorecards, dashboards, and performance metrics reporting
- Strategic planning
- The balanced scorecard
- Red-yellow-green scorecards
- Example 1.1: Tabular red-yellow-green scorecard reporting alternative

**2. Creating an Integrated Enterprise Excellence (IEE) System**

- Overview of IEE
- IEE as a business strategy
- Applying IEE

**3. Enterprise Define-Measure-Analyze-Improve-Control (E-DMAIC)**

- E-DMAIC – Roadmap
- E-DMAIC – Define and measure phase: Enterprise process value chain
- E-DMAIC – Technical aspects of satellite-level and 30,000-foot-level charting
- E-DMAIC – Example 3.1: Satellite-level metrics
- E-DMAIC – Example 3.2: 30,000-foot-level metric with specifications
- E-DMAIC – Analyze phase: Enterprise process goal setting
- E-DMAIC – Analyze phase: Strategic analysis and development
- E-DMAIC – Analyze phase: Theory of constraints (TOC)
- E-DMAIC – Example 3.3: Theory of constraints
- E-DMAIC – Analyze phase: Lean tools and assessments
- E-DMAIC – Analyze phase: Identification of project opportunities
- E-DMAIC – Improve phase
- E-DMAIC – Control phase
- E-DMAIC – Summary

**Part II Improvement Project Roadmap: Define Phase**

**4. P-DMAIC – Define Phase**

- P-DMAIC roadmap component
- Process and metrics
- Supplier-input-process-output-customer (SIPOC)
- Project valuation, cost of poor quality, and cost of doing nothing differently
- Define phase objective
- Primary project metric
- Problem statement
- Secondary project metrics
- Project charter
- Applying IEE
- Exercises

**5. P-DMAIC – Team Effectiveness**

- P-DMAIC roadmap component
- Orming model
- Interaction styles
- Making a successful team
- Team member feedback
- Reacting to common team problems
- Applying IEE
- Exercises

**Part III Improvement Project Roadmap: Measure Phase**

**6. P-DMAIC – Measure Phase (Plan Project and Metrics): Voice of the Customer and In-Process Six Sigma Metrics**

- P-DMAIC roadmap component
- Project customer definition and information sources
- Example 6.1: Project customer identification
- Project Voice of the Customer (VOC)
- In-process metrics: Overview
- In-process metrics: Defects per million opportunities (DPMO)
- In-process metrics: Rolled throughput yield (RTY)
- In-process metrics: Applications
- Exercises
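
To make the Chapter 6 metrics concrete, here is a minimal sketch (not from the book; all numbers are hypothetical) of the two in-process metric calculations listed above, DPMO and rolled throughput yield:

```python
# Hypothetical example data -- not from the book.

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities (DPMO)."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def rolled_throughput_yield(step_yields):
    """Rolled throughput yield (RTY): product of each step's first-pass yield."""
    rty = 1.0
    for y in step_yields:
        rty *= y
    return rty

# 15 defects found in 500 units, each with 10 defect opportunities:
print(dpmo(defects=15, units=500, opportunities_per_unit=10))  # 3000.0
# Three process steps with first-pass yields of 98%, 95%, and 99%:
print(round(rolled_throughput_yield([0.98, 0.95, 0.99]), 4))   # 0.9217
```

RTY below any single step's yield is the usual signal that hidden-factory rework accumulates across steps.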

**7. P-DMAIC – Measure Phase (Plan Project and Metrics): Project Plan**

- P-DMAIC roadmap component
- Project management
- Project management: Planning
- Project management: Measures
- Example 7.1: CPM/PERT
- Applying IEE
- Exercises

**8. Response Statistics, Graphical Representations, and Data Analysis**

- Continuous versus attribute response
- Time-series plot
- Example 8.1: Time-series plot of gross revenue
- Example 8.2: Culture firefighting or fire prevention?
- Measurement scales
- Variability and process improvements
- Sampling
- Simple graphic presentations
- Example 8.3: Histogram and dot plot
- Sample statistics (mean, range, standard deviation, and median)
- Descriptive statistics
- Pareto charts
- Example 8.4: Improving a process that has defects
- Population distribution: Continuous response
- Normal distribution
- Example 8.5: Normal distribution
- Probability plotting
- Interpretation of probability plots
- Example 8.6: PDF, CDF, and then a probability plot
- Probability plotting censored data
- Weibull and exponential distribution
- Lognormal distribution
- Example 8.7: Comparing distributions
- Distribution application and approximations
- Applying IEE
- Exercises

**9. Attribute Response Statistics**

- Attribute versus continuous data response
- Visual inspections
- Binomial distribution
- Example 9.1: Binomial distribution – Number of combinations and rolls of a die
- Example 9.2: Binomial – probability of failure
- Hypergeometric distribution
- Poisson distribution
- Example 9.3: Poisson distribution
- Population distributions: Applications, approximations, and normalizing transformations
- Applying IEE
- Exercises
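
As a quick illustration of the Chapter 9 distributions, the following sketch (hypothetical inputs, not from the book) computes binomial and Poisson probabilities directly from their probability mass functions:

```python
# Hypothetical example data -- not from the book.
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    """P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """P(X = k) = e^(-lambda) * lambda^k / k!."""
    return exp(-lam) * lam**k / factorial(k)

# Probability of exactly 2 failures in 10 trials with a 10% failure rate:
print(round(binomial_pmf(2, 10, 0.1), 4))  # 0.1937
# Probability of exactly 3 defects when the mean defect count is 2:
print(round(poisson_pmf(3, 2.0), 4))       # 0.1804
```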

**10. Traditional Control Charting and IEE Implementation**

- Monitoring processes
- Statistical process control charts
- Interpretation of control chart patterns
- x-bar and R and x-bar and s charts: Mean and variability measurements
- Example 10.1: x-bar and R chart
- XmR and individuals control chart: Individual measurements
- Example 10.2: XmR charts
- p chart: Proportion nonconforming measurements
- Example 10.3: p chart
- np chart: number of nonconforming items
- c chart: Number of nonconformities
- u chart: Nonconformities per unit
- Notes on the Shewhart control chart
- Rational subgroup sampling and IEE
- Applying IEE
- Exercises

**11. Traditional Process Capability and Process Performance Metrics**

- Process capability indices for continuous data
- Process capability indices: Cp and Cpk
- Process capability/performance indices: Pp and Ppk
- Process capability/performance misunderstandings
- Confusion: Short-term versus long-term variability
- Calculating standard deviation
- Example 11.1: Process capability/performance indices
- Process capability/performance for attribute data
- Exercises
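
The Chapter 11 indices follow directly from their textbook definitions; here is a minimal sketch with hypothetical specification limits and sample statistics (not from the book):

```python
# Hypothetical example data -- not from the book.

def cp(usl, lsl, sigma):
    """Cp = (USL - LSL) / (6 * sigma): potential capability, ignoring centering."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl, lsl, mean, sigma):
    """Cpk = min(USL - mean, mean - LSL) / (3 * sigma): penalizes off-center processes."""
    return min(usl - mean, mean - lsl) / (3 * sigma)

# A process centered off-target has Cpk lower than Cp:
print(cp(usl=16.0, lsl=4.0, sigma=2.0))              # 1.0
print(round(cpk(usl=16.0, lsl=4.0, mean=12.0, sigma=2.0), 3))  # 0.667
```

Whether sigma is estimated short-term (Cp/Cpk) or long-term (Pp/Ppk) is exactly the confusion the chapter addresses.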

**12. P-DMAIC – Measure Phase (Baseline Project): IEE Process Predictability and Process Capability/Performance Metric Assessment (Continuous Response)**

- P-DMAIC roadmap component
- Satellite-level view of the organization
- 30,000-foot-level, 20,000-foot-level, and 50-foot-level operational and project metrics
- IEE application examples: Process predictability and process capability/performance metric
- Traditional versus 30,000-foot-level control charts and process capability/performance metric assessments
- Traditional control charting problems
- Discussion of process control charting at the satellite-level and 30,000-foot-level
- IEE process predictability and process capability/performance metric: Individual samples with specifications
- Example 12.1: IEE process predictability and process capability/performance metric: Individual samples with specifications
- IEE process predictability and process capability/performance metric: Multiple samples in subgroups where there are specification requirements
- Example 12.2: IEE process predictability and process capability/performance metric – Multiple samples in subgroups where there are specification requirements
- Example 12.3: IEE individuals control chart of subgroup means and standard deviation as an alternative to traditional x-bar and R chart
- Example 12.4: The implication of subgrouping period selection on process stability statements
- Describing a predictable process output when no specification exists
- Example 12.5: Describing a predictable process’s output when no specification exists
- Non-normal distribution prediction plot and process capability/performance metric reporting
- Example 12.6: IEE process predictability and process capability/performance metric – non-normal distribution using Box-Cox transformation
- Example 12.7: IEE process predictability and process capability/performance metric – non-normal distribution with zero and/or negative values
- Non-predictability charts and seasonality
- Value chain satellite-level and 30,000-foot-level example metrics
- Example 12.8: Value chain metric computations – Satellite-level metric reporting
- Example 12.9: Value chain metric computations – 30,000-foot-level metric with specifications
- Example 12.10: Value chain metric computations – 30,000-foot-level continuous response metric with no specifications
- IEE difference
- Additional control charting and process capability alternatives
- Applying IEE
- Exercises

**13. P-DMAIC – Measure Phase (Baseline Project): IEE Process Predictability and Process Capability/Performance Metric Assessment (Attribute Response)**

- P-DMAIC roadmap component
- IEE process predictability and process capability/performance metric: Attribute pass/fail output
- Example 13.1: IEE process predictability and process capability/performance metric – Attribute pass/fail output
- Example 13.2: IEE individuals control chart as an alternative to traditional P chart
- IEE process predictability and process capability/performance metric: Infrequent failures
- Example 13.3: IEE process predictability and process capability/performance metric – Infrequent failure output
- Example 13.4: IEE process predictability and process capability/performance metric – Rare spills
- Direction for improving an attribute response
- Example 13.5: Value chain metric computation – 30,000-foot-level attribute assessment with Pareto chart
- Applying IEE
- Exercises

**14. P-DMAIC – Measure Phase (Lean Assessment)**

- P-DMAIC roadmap component
- Waste identification and prevention
- Principles of Lean
- Example 14.1: Takt time
- Little’s law
- Example 14.2: Little’s law
- Identification of process improvement focus areas for projects
- Lean assessment
- Workflow analysis: Observation worksheet
- Workflow analysis: Standardized work chart
- Workflow analysis: Combination work table
- Workflow analysis: Logic flow diagram
- Workflow analysis: Spaghetti diagram or physical process flow
- Why-why or 5 whys diagram
- Time-value diagram
- Example 14.3: Development of a bowling ball
- Value stream mapping
- Value stream considerations
- Additional enterprise process lean tools, concepts, and examples
- Applying IEE
- Exercises
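
Two of the lean calculations listed above, takt time and Little’s law, reduce to simple ratios; the sketch below uses hypothetical numbers (not from the book):

```python
# Hypothetical example data -- not from the book.

def takt_time(available_time, customer_demand):
    """Takt time = available work time / units demanded in that period."""
    return available_time / customer_demand

def littles_law_lead_time(wip, throughput):
    """Little's law: lead time = work in process (WIP) / throughput rate."""
    return wip / throughput

# 450 minutes of available work time to meet a demand of 90 units:
print(takt_time(450, 90))             # 5.0 minutes per unit
# 60 jobs in process with a completion rate of 12 jobs per day:
print(littles_law_lead_time(60, 12))  # 5.0 days
```

Little’s law is often used in reverse during a lean assessment: measured lead time and throughput imply how much WIP the process is carrying.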

**15. P-DMAIC – Measure Phase: Measurement Systems Analysis**

- IEE project execution roadmap
- Data integrity and background
- IEE application examples: MSA
- Initial MSA considerations
- Simple MSA assessment
- Variability sources in a 30,000-foot-level metric
- Three uses of measurement
- Terminology
- Gage R&R considerations
- Gage R&R relationships
- Preparation for a measurement system study
- Measurement systems improvement needs and possible improvement sources
- Example 15.1: Gage R&R
- Linearity
- Example 15.2: Linearity
- Attribute agreement analysis
- Example 15.3: Attribute agreement analysis
- Gage study of destructive testing
- Example 15.4: Gage study of destructive testing
- 5-step measurement improvement process
- Uncertainty due to data rounding
- Example 15.5: 5-step measurement improvement process
- Applying IEE
- Exercises

**16. P-DMAIC – Measure Phase (Wisdom of the Organization)**

- P-DMAIC roadmap component
- Flowcharting
- Process modeling and simulation
- Benchmarking
- Brainstorming
- Cause-and-effect diagram
- Cause-and-effect matrix and analytical hierarchy process (AHP)
- Affinity diagram
- Nominal group technique (NGT)
- Force field analysis
- FMEA
- IEE application examples: FMEA
- FMEA implementation
- Development of a process FMEA
- Process FMEA tabular entries
- Generating a FMEA
- Exercises

**Part IV Improvement Project Roadmap: Analyze Phase**

**17. P-DMAIC – Analyze Phase: Data Collection Plan (DCP) and Experimentation Traps**

- P-DMAIC roadmap component
- Solutions determination process
- Data collection plan (DCP) needs, sources, and types
- Data collection tools
- Sampling error sources
- Experimentation traps
- Example 17.1: Experimentation trap – Measurement error and other sources of variability
- Example 17.2: Experimentation trap – Lack of randomization
- Example 17.3: Experimentation trap – Confounded effects
- Example 17.4: Experimentation trap – Independently designing and conducting an experiment
- Sampling considerations
- Example 17.5: Continuous response data collection
- Example 17.6: Attribute response data collection
- Exercises

**18. P-DMAIC – Analyze Phase: Visualization of Data**

- P-DMAIC roadmap component
- IEE application example: Visualization of data
- Box plot
- Example 18.1: Plots of injection-molding data – Box plot, marginal plot, main effects plot, and interaction plot
- Multi-vari charts
- Example 18.2: Multi-vari chart of injection-molding data
- Applying IEE
- Exercises

**19. Confidence Intervals and Hypothesis Tests**

- Sampling distributions
- Confidence interval statements
- Central limit theorem
- Hypothesis testing
- Example 19.1: Hypothesis testing
- Example 19.2: Probability plot hypothesis test
- Choosing alpha
- Nonparametric estimates: Runs test for randomization
- Example 19.3: Nonparametric runs test for randomization
- Applying IEE
- Exercises

**20. Inferences: Continuous Response**

- Summarizing sampled data
- Sample size: Hypothesis test of a mean criterion for continuous data response
- Example 20.1: Sample size determination for a mean criterion test
- Confidence intervals on the mean and hypothesis test criteria alternatives
- Example 20.2: Confidence intervals on the mean
- Example 20.3: Sample size – an alternative approach
- Standard deviation confidence interval
- Example 20.4: Standard deviation confidence statement
- Percentage of the population assessment
- Example 20.5: Percentage of the population statements
- Example 20.6: Base-lining a 30,000-foot-level continuous-response metric and determining process confidence interval statements
- Applying IEE
- Exercises

**21. Inferences: Attribute (Pass/fail) Response**

- Attribute response situations
- Sample size: Hypothesis test of an attribute criterion
- Example 21.1: Sample size – A hypothesis test of an attribute criterion
- Confidence intervals for attribute evaluations and alternative sample size considerations
- Reduced sample size testing for attribute situations
- Example 21.2: Reduced sample size testing – Attribute response situations
- Example 21.3: Sampling does not fix common-cause problems
- Example 21.4: Base-lining a 30,000-foot-level attribute-response metric and determining process confidence interval statement
- Attribute sample plan alternatives
- AQL (Acceptable Quality Level) sampling can be deceptive
- Example 21.5: Acceptable quality level
- Applying IEE
- Exercises

**22. P-DMAIC – Analyze Phase: Continuous Response Comparison Tests**

- P-DMAIC roadmap component
- IEE application examples: Comparison tests
- Comparing continuous data responses
- Sample size: Comparing means
- Comparing two means
- Example 22.1: Comparing the means of two samples
- Comparing variances of two samples
- Example 22.2: Comparing the variance of two samples
- Comparing populations using a probability plot
- Example 22.3: Comparing responses using a probability plot
- Example 22.4: IEE demonstration of process improvement for a continuous response
- Paired comparison testing
- Example 22.5: Paired comparison testing for a new design
- Example 22.6: Paired comparison testing for improved gas mileage
- Comparing more than two samples
- Example 22.7: Comparing means to determine whether a process improved
- Applying IEE
- Exercises

**23. P-DMAIC – Analyze Phase: Comparison Tests for Attribute Pass/Fail Response**

- P-DMAIC roadmap component
- IEE application examples: Attribute comparison tests
- Comparing attribute data
- Sample size comparing proportions
- Comparing proportions
- Example 23.1: Comparing proportions
- Comparing nonconformance proportions and count frequencies
- Example 23.2: Comparing nonconformance proportions
- Example 23.3: Comparing counts
- Example 23.4: Difference in two proportions
- Example 23.5: IEE demonstration of process improvement for an attribute response
- Applying IEE
- Exercises

**24. P-DMAIC – Analyze Phase: Variance Components**

- P-DMAIC roadmap component
- IEE application examples: Variance components
- Description
- Example 24.1: Variance components of pigment paste
- Example 24.2: Variance components of a manufactured door including measurement system components
- Example 24.3: Determining process capability/performance metric using variance components
- Example 24.4: Variance components analysis of injection-molding data
- Example 24.5: Project analysis for variance components of an hourly response that had an unsatisfactory process capability/performance metric
- Applying IEE
- Exercises

**25. P-DMAIC – Analyze phase: Correlation and Simple Linear Regression**

- P-DMAIC roadmap component
- IEE application examples: Regression
- Scatter plot (dispersion graph)
- Correlation
- Example 25.1: Correlation
- Simple linear regression
- Analysis of residuals
- Analysis of residuals: Normality assessment
- Analysis of residuals: Time sequence
- Analysis of residuals: Fitted values
- Example 25.2: Simple linear regression
- Applying IEE
- Exercises

**26. P-DMAIC – Analyze Phase: Single-Factor (One-way) Analysis of Variance (ANOVA) and Analysis of Means (ANOM)**

- P-DMAIC roadmap component
- IEE application examples: ANOVA and ANOM
- Application steps
- Single-factor analysis of variance hypothesis test
- Single-factor analysis of variance table calculations
- Estimation of model parameters
- Unbalanced data
- Model adequacy
- Analysis of residuals: Fitted value plots and data normalizing transformations
- Comparing pairs of treatment means
- Example 26.1: Single-factor analysis of variance
- Analysis of means (ANOM)
- Example 26.2: Analysis of means
- Example 26.3: Analysis of means of injection-molding data
- General linear modeling (GLM)
- Nonparametric estimate: Kruskal-Wallis test
- Example 26.4: Nonparametric Kruskal-Wallis test
- Nonparametric estimate: Mood’s median test
- Example 26.5: Nonparametric Mood’s median test
- Other considerations
- Applying IEE
- Exercises

**27. P-DMAIC – Analyze Phase: Two-Factor (Two-Way) Analysis of Variance**

- P-DMAIC roadmap component
- Two-factor factorial design
- Example 27.1: Two-factor factorial design
- Nonparametric estimate: Friedman test
- Example 27.2: Nonparametric Friedman test
- Applying IEE
- Exercises

**28. P-DMAIC – Analyze Phase: Multiple Regression, Logistic Regression, and Indicator Variables**

- P-DMAIC roadmap component
- IEE application examples: Multiple regression
- Description
- Example 28.1: Multiple regression
- Other considerations
- Example 28.2: Multiple regression best subset analysis
- Indicator variables (dummy variables) to analyze categorical data
- Example 28.3: Indicator variables
- Example 28.4: Indicator variables with covariate
- Binary logistic regression
- Example 28.5: Binary logistic regression for ingot preparation
- Example 28.6: Binary logistic regression for coating test
- Other logistic regression methods
- Exercises

**Part V Improvement Project Roadmap: Improve Phase**

**29. Benefiting from Design of Experiments (DOE)**

- Terminology and benefits
- Example 29.1: Traditional experimentation
- Need for DOE
- Common excuses for not using DOE
- DOE application examples
- Exercises

**30. Understanding the Creation of Full and Fractional Factorial 2k DOEs**

- IEE application examples: DOE
- Conceptual explanation: Two-level full factorial experiments and two-factor interactions
- Conceptual explanation: Saturated two-level DOE
- Example 30.1: Applying DOE techniques to a non-manufacturing process
- Exercises

**31. P-DMAIC – Improve Phase: Planning 2k DOEs**

- P-DMAIC roadmap component
- Initial thoughts when setting up a DOE
- Experiment design considerations
- Sample size considerations for a continuous response output DOE
- Experiment design considerations: Choosing factors and levels
- Experiment design considerations: Factor statistical significance
- Experiment design considerations: Experiment resolutions
- Blocking and randomization
- Curvature check
- Applying IEE
- Exercises

**32. P-DMAIC – Improve Phase: Design and Analysis of 2k DOEs**

- P-DMAIC roadmap component
- Two-level DOE design alternatives
- Designing a two-level fractional experiment using Tables M and N
- Determining statistically significant effects and the probability plotting procedure
- Modeling equation format for a two-level DOE
- Example 32.1: A resolution V DOE
- DOE alternatives
- Example 32.2: A DOE development test
- Fold-over designs
- Applying IEE
- Exercises

**33. P-DMAIC – Improve Phase: Robust DOE**

- P-DMAIC roadmap component
- IEE application examples: Robust DOE
- Test strategies
- Loss function
- Example 33.1: Loss function
- Analyzing 2k residuals for sources of variability reduction
- Example 33.2: Analyzing 2k residuals for sources of variability reduction
- Robust DOE strategy
- Example 33.3: Robust inner/outer array DOE to reduce scrap and downtime
- Applying IEE
- Exercises

**34. P-DMAIC – Improve Phase: Response Surface Methodology (RSM), Evolutionary Operation (EVOP), and the Path of Steepest Ascent**

- P-DMAIC roadmap component
- Modeling equations
- Central composite design
- Example 34.1: Response surface design
- Box-Behnken designs
- Additional response surface design considerations
- Evolutionary operations (EVOP)
- Example 34.2: EVOP
- Applying IEE
- Exercises

**35. P-DMAIC – Improve Phase: Innovation and Creativity**

- P-DMAIC roadmap component
- Alignment of creativity with IEE
- Creative problem solving
- Inventive thinking as a process
- TRIZ
- Six thinking hats
- Creative problem solving process (CPS)
- Exercises

**36. P-DMAIC – Improve Phase: Lean Tools and the PDCA Cycle**

- P-DMAIC roadmap component
- Learning by doing
- Plan-do-check-act (PDCA)
- Standard work and standard operating procedures
- One-piece flow
- Poka-yoke (Mistake proofing)
- Visual management
- 5S method
- Kaizen event
- Kanban
- Demand management
- Heijunka
- Continuous flow and cell design
- Changeover reduction
- Total productive maintenance (TPM)
- Applying IEE
- Exercises

**37. P-DMAIC – Improve Phase: Selecting, Implementing, and Demonstrating Project Improvements**

- P-DMAIC roadmap component
- Process modeling and simulation in the improve phase
- Solution selection and Pugh matrices
- Walking the new process and value chain documentation
- Pilot testing
- Process change implementation training and project validation
- Example 37.1: Sales quote process
- Example 37.2: Sales quote project
- Example 37.3: Sales personnel scorecard/dashboard and data analyses
- Exercises

**Part VI Improvement Project Roadmap: Control Phase**

**38. P-DMAIC – Control Phase: Active Process Control**

- P-DMAIC roadmap component
- Process improvements and adjustments
- IEE application examples: Engineering process control
- Control of process input variables
- Realistic tolerances
- Exponentially weighted moving average (EWMA) and engineering process control (EPC)
- Pre-control charts
- Pre-control setup (Qualification procedure)
- Classical pre-control charts
- Two-stage pre-control chart
- Example 38.1: Engineering process control during store checkout
- Exercises

**39. P-DMAIC – Control Phase: Control Plan and Project Completion**

- P-DMAIC roadmap component
- Control plan: Is and is nots
- Controlling and error-proofing processes
- Control plan creation
- AIAG control plan: Entries
- Project completion
- Applying IEE
- P-DMAIC summary
- Exercises

**Part VII Appendix**

**Appendix A: Infrastructure**

- Roles and responsibilities
- Reward and recognition

**Appendix B: Six Sigma Metric and Article**

- Sigma quality level
- Article: Individuals control chart and data normality

**Appendix C: Creating Effective Presentations**

- Be in earnest
- Employ vocal variety
- Make it persuasive
- Inspire your audience

**Appendix D: P-DMAIC Execution Roadmap and Selected Drill Downs**

- P-DMAIC execution roadmap
- P-DMAIC execution roadmap drill down: In-process metrics decision tree
- P-DMAIC execution roadmap drill down: Baseline project
- P-DMAIC execution roadmap drill down: Visualization of data and hypothesis decision tree

**Appendix E: P-DMAIC Execution Tollgate Check Sheets**

**Appendix F: “Implementing Six Sigma” Supplemental Material**

**Appendix G: Reference Tables**

**List of Acronyms and Symbols**

**Glossary**

More information about the book, *Integrated Enterprise Excellence Volume III – Improvement Project Execution: A Management and Black Belt Guide for Going Beyond Lean Six Sigma and the Balanced Scorecard* (Forrest W. Breyfogle III, Bridgeway Books/Citius Publishing, Austin, TX, 2008), is available through the following resources:

- Description
- Book Brochure
- Video Description
- Purchase book at a discount from list prices

For additional information about Integrated Enterprise Excellence (IEE) see: **Business Management Implementation: IEE Articles, Videos, Books**

Contact Us to set up a time to discuss with Forrest Breyfogle how your organization might benefit from this lean Six Sigma body of knowledge book as a training guide and practitioner’s reference.