Certified Quality Engineers

$946.00

Overview

ASQ CQE: Certified Quality Engineers are professionals trained in quality engineering and quality control. The Certified Quality Engineer understands the principles of product and service quality evaluation and control, and is trained to identify and prevent the unnecessary costs that poor quality creates, such as lost production and lost market share.

Advance your career and attain the credential of ASQ Certified Quality Engineer. This three-day (24-hour) course combines classroom instruction and group discussion with ASQ Primer training materials to prepare you for the ASQ exam. It is designed for quality personnel preparing for the ASQ Certified Quality Engineer (CQE) exam or pursuing further professional development, and is also valuable as a refresher for Quality Engineers, Production Engineers, Maintenance Engineers, Quality Auditors, and QA/QC Engineers.

Batch

  • Friday batch for 3 days
  • Saturday batch for 3 days (call us for more details)

Examination

Each certification candidate is required to pass a written examination that consists of multiple-choice questions that measure comprehension of the Body of Knowledge. The Quality Engineer examination is a one-part, 160-question, five-hour exam. It is offered in English.

Prerequisite

You must have eight years of on-the-job experience in one or more of the areas of the Certified Quality Engineer Body of Knowledge.

If you have completed a degree* from a college, university, or technical school with accreditation accepted by ASQ, part of the eight-year experience requirement will be waived, as follows (only one of these waivers may be claimed):

• Diploma from a technical or trade school—one year will be waived.

• Associate degree—two years waived.

• Bachelor’s degree—four years waived.

• Master’s or doctorate—five years waived.

Program Outline

I Management and Leadership (15 Questions)

A. Quality Philosophies and Foundations

Explain how modern quality has evolved from quality control through statistical process control (SPC) to total quality management and leadership principles (including Deming’s 14 Points), and how quality has helped form various continuous improvement tools including lean, six sigma, theory of constraints, etc. (Remember)

B. The Quality Management System (QMS)

1. Strategic planning

Identify and define top management’s responsibility for the QMS, including establishing policies and objectives, setting organization-wide goals, supporting quality initiatives, etc. (Apply)

2. Deployment techniques

Define, describe, and use various deployment tools in support of the QMS: benchmarking, stakeholder identification and analysis, performance measurement tools, and project management tools such as PERT charts, Gantt charts, critical path method (CPM), resource allocation, etc. (Apply)

3. Quality information system (QIS)

Identify and define the basic elements of a QIS, including who will contribute data, the kind of data to be managed, who will have access to the data, the level of flexibility for future information needs, data analysis, etc. (Remember)

C. ASQ Code of Ethics for Professional Conduct

Determine appropriate behavior in situations requiring ethical decisions. (Evaluate)

D. Leadership Principles and Techniques

Describe and apply various principles and techniques for developing and organizing teams and leading quality initiatives. (Analyze)

E. Facilitation Principles and Techniques

Define and describe the facilitator’s role and responsibilities on a team. Define and apply various tools used with teams, including brainstorming, nominal group technique, conflict resolution, force-field analysis, etc. (Analyze)

F. Communication Skills

Describe and distinguish between various communication methods for delivering information and messages in a variety of situations across all levels of the organization. (Analyze)

G. Customer Relations

Define, apply, and analyze the results of customer relation measures such as quality function deployment (QFD), customer satisfaction surveys, etc. (Analyze)

H. Supplier Management

Define, select, and apply various techniques including supplier qualification, certification, evaluation, ratings, performance improvement, etc. (Analyze)

I. Barriers to Quality Improvement

Identify barriers to quality improvement, their causes and impact, and describe methods for overcoming them. (Analyze)

II The Quality System (15 Questions)

A. Elements of the Quality System

Define, describe, and interpret the basic elements of a quality system, including planning, control, and improvement, from product and process design through quality cost systems, audit programs, etc. (Evaluate)

B. Documentation of the Quality System

Identify and apply quality system documentation components, including quality policies, procedures to support the system, configuration management and document control to manage work instructions, quality records, etc. (Apply)

C. Quality Standards and Other Guidelines

Define and distinguish between national and international standards and other requirements and guidelines, including the Malcolm Baldrige National Quality Award (MBNQA), and describe key points of the ISO 9000 series of standards and how they are used. [Note: Industry-specific standards will not be tested.] (Apply)

D. Quality Audits

1. Types of audits

Describe and distinguish between various types of quality audits such as product, process, management (system), registration (certification), compliance (regulatory), first, second, and third party, etc. (Apply)

2. Roles and responsibilities in audits

Identify and define roles and responsibilities for audit participants such as audit team (leader and members), client, auditee, etc. (Understand)

3. Audit planning and implementation

Describe and apply the steps of a quality audit, from the audit planning stage through conducting the audit, from the perspective of an audit team member. (Apply)

4. Audit reporting and follow-up

Identify, describe, and apply the steps of audit reporting and follow-up, including the need to verify corrective action. (Apply)

E. Cost of Quality (COQ)

Identify and apply COQ concepts, including cost categories, data collection methods and classification, and reporting and interpreting results. (Analyze)
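
As a simple illustration of how COQ data can be rolled up by cost category, here is a minimal Python sketch; the category names and dollar figures are hypothetical, not course data.

```python
# Minimal COQ roll-up sketch with hypothetical figures.
coq = {
    "prevention":       {"training": 12_000, "process planning": 8_000},
    "appraisal":        {"inspection": 15_000, "calibration": 4_000},
    "internal failure": {"scrap": 22_000, "rework": 9_000},
    "external failure": {"warranty claims": 18_000, "returns": 6_000},
}

total = sum(sum(items.values()) for items in coq.values())
for category, items in coq.items():
    subtotal = sum(items.values())
    print(f"{category:>16}: {subtotal:>8,} ({subtotal / total:.0%} of total COQ)")
print(f"{'total':>16}: {total:>8,}")
```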

F. Quality Training

Identify and define key elements of a training program, including conducting a needs analysis, developing curricula and materials, and determining the program’s effectiveness. (Apply)

III Product and Process Design (25 Questions)

A. Classification of Quality Characteristics

Define, interpret, and classify quality characteristics for new products and processes. [Note: The classification of product defects is covered in IV.B.3.] (Evaluate)

B. Design Inputs and Review

Identify sources of design inputs such as customer needs, regulatory requirements, etc., and how they translate into design concepts such as robust design, QFD, and Design for X (DFX, where X can mean six sigma (DFSS), manufacturability (DFM), cost (DFC), etc.). Identify and apply common elements of the design review process, including roles and responsibilities of participants. (Analyze)

C. Technical Drawings and Specifications

Interpret technical drawings including characteristics such as views, title blocks, dimensioning, tolerancing, GD&T symbols, etc. Interpret specification requirements in relation to product and process characteristics. (Evaluate)

D. Design Verification

Identify and apply various evaluations and tests to qualify and validate the design of new products and processes to ensure their fitness for use. (Evaluate)

E. Reliability and Maintainability

1. Predictive and preventive maintenance tools

Describe and apply these tools and techniques to maintain and improve process and product reliability. (Analyze)

2. Reliability and maintainability indices

Review and analyze indices such as MTTF, MTBF, MTTR, availability, failure rate, etc. (Analyze)
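
These indices follow from simple arithmetic on operating data. A minimal Python sketch, using hypothetical figures:

```python
# Illustrative calculation of basic reliability indices; the figures
# below are assumptions, not course data.
operating_hours = 10_000          # total observed operating time
failures = 8                      # number of failures in that period
total_repair_hours = 24           # downtime spent on repairs

mtbf = operating_hours / failures        # mean time between failures
mttr = total_repair_hours / failures     # mean time to repair
failure_rate = 1 / mtbf                  # failures per operating hour
availability = mtbf / (mtbf + mttr)      # inherent (steady-state) availability

print(f"MTBF = {mtbf:.0f} h, MTTR = {mttr:.0f} h")
print(f"failure rate = {failure_rate:.4f}/h, availability = {availability:.3%}")
```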

3. Bathtub curve

Identify, define, and distinguish between the basic elements of the bathtub curve. (Analyze)

4. Reliability/safety/hazard assessment tools

Define, construct, and interpret the results of failure mode and effects analysis (FMEA), failure mode, effects, and criticality analysis (FMECA), and fault tree analysis (FTA). (Analyze)
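
A common FMEA output is the risk priority number (RPN = severity x occurrence x detection, each scored 1-10). The sketch below ranks hypothetical failure modes by RPN; the modes and scores are assumptions for illustration only.

```python
# Sketch of FMEA risk priority number (RPN) ranking with hypothetical data.
failure_modes = [
    # (description, severity, occurrence, detection)
    ("seal leak",        8, 3, 4),
    ("mislabeled part",  5, 2, 7),
    ("cracked housing",  9, 2, 3),
]

ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for name, sev, occ, det in ranked:
    print(f"{name:<16} RPN = {sev * occ * det}")
```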

IV Product and Process Control (32 Questions)

A. Tools

Define, identify, and apply product and process control methods such as developing control plans, identifying critical control points, developing and validating work instructions, etc. (Analyze)

B. Material Control

1. Material identification, status, and traceability

Define and distinguish these concepts, and describe methods for applying them in various situations. [Note: Product recall procedures will not be tested.] (Analyze)

2. Material segregation

Describe material segregation and its importance, and evaluate appropriate methods for applying it in various situations. (Evaluate)

3. Classification of defects

Define, describe, and classify the seriousness of product and process defects. (Evaluate)

4. Material review board (MRB)

Identify the purpose and function of an MRB, and make appropriate disposition decisions in various situations. (Analyze)

C. Acceptance Sampling

1. Sampling concepts

Define, describe, and apply the concepts of producer and consumer risk and related terms, including operating characteristic (OC) curves, acceptable quality limit (AQL), lot tolerance percent defective (LTPD), average outgoing quality (AOQ), average outgoing quality limit (AOQL), etc. (Analyze)
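
To make producer's and consumer's risk concrete, the sketch below computes points on an OC curve for a hypothetical single sampling plan; the plan parameters (n, c) and the quality levels used for alpha and beta are assumed values, not taken from any standard.

```python
# OC curve sketch for a single sampling plan: sample n items, accept the
# lot if c or fewer are defective. All parameters are hypothetical.
from scipy.stats import binom

n, c = 80, 2
for p in (0.005, 0.01, 0.02, 0.05, 0.10):   # possible lot fraction defective
    pa = binom.cdf(c, n, p)                 # probability of accepting the lot
    print(f"p = {p:.3f}  Pa = {pa:.3f}")

alpha = 1 - binom.cdf(c, n, 0.01)   # producer's risk at an assumed AQL of 1%
beta = binom.cdf(c, n, 0.05)        # consumer's risk at an assumed LTPD of 5%
print(f"alpha ~ {alpha:.3f}, beta ~ {beta:.3f}")
```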

2. Sampling standards and plans

Interpret and apply ANSI/ASQ Z1.4 and Z1.9 standards for attributes and variables sampling. Identify and distinguish between single, double, multiple, sequential, and continuous sampling methods. Identify the characteristics of Dodge-Romig sampling tables and when they should be used. (Analyze)

3. Sample integrity

Identify the techniques for establishing and maintaining sample integrity. (Analyze)

D. Measurement and Test

1. Measurement tools

Select and describe appropriate uses of inspection tools such as gage blocks, calipers, micrometers, optical comparators, etc. (Analyze)

2. Destructive and nondestructive tests

Distinguish between destructive and nondestructive measurement test methods and apply them appropriately. (Analyze)

E. Metrology

Identify, describe, and apply metrology techniques such as calibration systems, traceability to calibration standards, measurement error and its sources, and control and maintenance of measurement standards and devices. (Analyze)

F. Measurement System Analysis (MSA)

Calculate, analyze, and interpret repeatability and reproducibility (Gage R&R) studies, measurement correlation, capability, bias, linearity, etc., including both conventional and control chart methods. (Evaluate)

V Continuous Improvement (30 Questions)

A. Quality Control Tools

Select, construct, apply, and interpret tools such as 1) flowcharts, 2) Pareto charts, 3) cause and effect diagrams, 4) control charts, 5) check sheets, 6) scatter diagrams, and 7) histograms. (Analyze)
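
As an illustration of the arithmetic behind one of these tools, the Pareto chart, here is a minimal Python sketch; the defect categories and counts are hypothetical.

```python
# Pareto analysis sketch: sort defect categories by count and accumulate
# their share of the total. Counts are hypothetical.
defects = {"scratches": 48, "misalignment": 27, "voids": 12, "burrs": 8, "other": 5}

total = sum(defects.values())
cumulative = 0
for category, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    print(f"{category:<14} {count:>3}  cumulative {cumulative / total:.0%}")
```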

B. Quality Management and Planning Tools

Select, construct, apply, and interpret tools such as 1) affinity diagrams, 2) tree diagrams, 3) process decision program charts (PDPC), 4) matrix diagrams, 5) interrelationship digraphs, 6) prioritization matrices, and 7) activity network diagrams. (Analyze)

C. Continuous Improvement Techniques

Define, describe, and distinguish between various continuous improvement models: total quality management (TQM), kaizen, Plan-Do-Check-Act (PDCA), six sigma, theory of constraints (TOC), lean, etc. (Analyze)

D. Corrective Action

Identify, describe, and apply elements of the corrective action process including problem identification, failure analysis, root cause analysis, problem correction, recurrence control, verification of effectiveness, etc. (Evaluate)

E. Preventive Action

Identify, describe, and apply various preventive action tools such as error-proofing/poka-yoke, robust design, etc., and analyze their effectiveness. (Evaluate)

VI Quantitative Methods and Tools (43 Questions)

A. Collecting and Summarizing Data

1. Types of data

Define, classify, and compare discrete (attributes) and continuous (variables) data. (Apply)

2. Measurement scales

Define, describe, and use nominal, ordinal, interval, and ratio scales. (Apply)

3. Data collection methods

Describe various methods for collecting data, including tally or check sheets, data coding, automatic gaging, etc., and identify their strengths and weaknesses. (Apply)

4. Data accuracy

Describe the characteristics or properties of data (e.g., source/resource issues, flexibility, versatility, etc.) and various types of data errors or poor quality such as low accuracy, inconsistency, interpretation of data values, and redundancy. Identify factors that can influence data accuracy, and apply techniques for error detection and correction. (Apply)

5. Descriptive statistics

Describe, calculate, and interpret measures of central tendency and dispersion (central limit theorem), and construct and interpret frequency distributions including simple, categorical, grouped, ungrouped, and cumulative. (Evaluate)
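
A minimal Python sketch of basic descriptive statistics and a simple grouped frequency distribution, using a hypothetical sample of measurements:

```python
# Descriptive statistics sketch for a hypothetical sample.
import numpy as np

data = np.array([9.8, 10.1, 10.0, 9.9, 10.3, 10.2, 9.7, 10.0, 10.1, 9.9])

print(f"mean = {data.mean():.3f}, median = {np.median(data):.3f}")
print(f"range = {np.ptp(data):.3f}, sample std dev = {data.std(ddof=1):.3f}")

counts, edges = np.histogram(data, bins=4)        # grouped frequencies
for lo, hi, count in zip(edges[:-1], edges[1:], counts):
    print(f"[{lo:.2f}, {hi:.2f})  n = {count}")
```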

6. Graphical methods for depicting relationships

Construct, apply, and interpret diagrams and charts such as stem-and-leaf plots, box-and-whisker plots, etc. [Note: Run charts and scatter diagrams are covered in V.A.] (Analyze)

7. Graphical methods for depicting distributions

Construct, apply, and interpret diagrams such as normal probability plots, Weibull plots, etc. [Note: Histograms are covered in V.A.] (Analyze)

B. Quantitative Concepts

1. Terminology

Define and apply quantitative terms, including population, parameter, sample, statistic, random sampling, expected value, etc. (Analyze)

2. Drawing statistical conclusions

Distinguish between enumerative and analytical studies. Assess the validity of statistical conclusions by analyzing the assumptions used and the robustness of the technique used. (Evaluate)

3. Probability terms and concepts

Describe and apply concepts such as independence, mutually exclusive, multiplication rules, complementary probability, joint occurrence of events, etc. (Apply)

C. Probability Distributions

1. Continuous distributions

Define and distinguish between these distributions: normal, uniform, bivariate normal, exponential, lognormal, Weibull, chi square, Student’s t, F, etc. (Analyze)

2. Discrete distributions

Define and distinguish between these distributions: binomial, Poisson, hypergeometric, multinomial, etc. (Analyze)

D. Statistical Decision-making

1. Point estimates and confidence intervals

Define, describe, and assess the efficiency and bias of estimators. Calculate and interpret standard error, tolerance intervals, and confidence intervals. (Evaluate)
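
For illustration, a minimal sketch of a t-based confidence interval for a mean; the sample values and the 95% level are assumptions.

```python
# Two-sided confidence interval for a mean using the t distribution.
import numpy as np
from scipy import stats

sample = np.array([10.2, 9.9, 10.1, 10.4, 9.8, 10.0, 10.3, 10.1])
n = len(sample)
mean = sample.mean()
std_err = sample.std(ddof=1) / np.sqrt(n)     # standard error of the mean

t_crit = stats.t.ppf(0.975, df=n - 1)         # 95% two-sided critical value
print(f"95% CI: {mean - t_crit * std_err:.3f} to {mean + t_crit * std_err:.3f}")
```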

2. Hypothesis testing

Define, interpret, and apply hypothesis tests for means, variances, and proportions. Apply and interpret the concepts of significance level, power, type I and type II errors. Define and distinguish between statistical and practical significance. (Evaluate)

3. Paired-comparison tests

Define and use paired comparison (parametric) hypothesis tests, and interpret the results. (Apply)

4. Goodness-of-fit tests

Define and use chi square and other goodness-of-fit tests, and interpret the results. (Apply)

5. Analysis of variance (ANOVA)

Define and use ANOVAs and interpret the results. (Analyze)
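
For illustration, a one-way ANOVA comparing three hypothetical treatment groups with scipy; a small p-value suggests at least one group mean differs from the others.

```python
# One-way ANOVA sketch on hypothetical data.
from scipy import stats

group_a = [10.1, 9.8, 10.2, 10.0]
group_b = [10.6, 10.4, 10.7, 10.5]
group_c = [9.9, 10.0, 10.1, 9.8]

f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```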

6. Contingency tables

Define, construct, and use contingency tables to evaluate statistical significance. (Analyze)

E. Relationships Between Variables

1. Linear regression

Calculate the regression equation for simple regressions and least squares estimates. Construct and interpret hypothesis tests for regression statistics. Use regression models for estimation and prediction, and analyze the uncertainty in the estimate. [Note: Nonlinear models and parameters will not be tested.] (Analyze)
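
A minimal sketch of fitting a simple least-squares line to hypothetical data and using it for prediction:

```python
# Simple linear regression sketch: fit y = b0 + b1*x by least squares.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 5.9, 8.2, 9.8])

b1, b0 = np.polyfit(x, y, deg=1)        # slope and intercept
residuals = y - (b0 + b1 * x)
print(f"y_hat = {b0:.3f} + {b1:.3f} x")
print(f"prediction at x = 6: {b0 + b1 * 6:.2f}")
print(f"residual std dev: {residuals.std(ddof=2):.3f}")
```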

2. Simple linear correlation

Calculate the correlation coefficient and its confidence interval, and construct and interpret a hypothesis test for correlation statistics. [Note: Serial correlation will not be tested.] (Analyze)

3. Time-series analysis

Define, describe, and use time-series analysis including moving average, and interpret time-series graphs to identify trends and seasonal or cyclical variation. (Analyze)

F. Statistical Process Control (SPC)

1. Objectives and benefits

Identify and explain objectives and benefits of SPC such as assessing process performance. (Understand)

2. Common and special causes

Describe, identify, and distinguish between these types of causes. (Analyze)

3. Selection of variable

Identify and select characteristics for monitoring by control chart. (Analyze)

4. Rational subgrouping

Define and apply the principles of rational subgrouping. (Apply)

5. Control charts

Identify, select, construct, and use various control charts, including X̄-R, X̄-s, individuals and moving range (ImR or XmR), moving average and moving range (MamR), p, np, c, u, and CUSUM charts. (Analyze)
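
For illustration, the sketch below computes X̄ and R chart limits for hypothetical subgroups of size five, using the standard constants A2 = 0.577, D3 = 0, and D4 = 2.114 for n = 5.

```python
# X-bar and R control limit sketch for subgroups of size n = 5.
import numpy as np

subgroups = np.array([
    [10.1, 9.9, 10.0, 10.2, 9.8],
    [10.0, 10.1, 9.9, 10.0, 10.1],
    [9.8, 10.2, 10.0, 9.9, 10.1],
    [10.2, 10.0, 10.1, 9.9, 10.0],
])
A2, D3, D4 = 0.577, 0.0, 2.114            # constants for subgroup size 5

xbar_bar = subgroups.mean(axis=1).mean()  # grand average
r_bar = np.ptp(subgroups, axis=1).mean()  # average range

print(f"X-bar chart: CL = {xbar_bar:.3f}, "
      f"UCL = {xbar_bar + A2 * r_bar:.3f}, LCL = {xbar_bar - A2 * r_bar:.3f}")
print(f"R chart:     CL = {r_bar:.3f}, UCL = {D4 * r_bar:.3f}, LCL = {D3 * r_bar:.3f}")
```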

6. Control chart analysis

Read and interpret control charts, and use rules for determining statistical control. (Evaluate)

7. PRE-control charts

Define and describe how these charts differ from other control charts and how they should be used. (Apply)

8. Short-run SPC

Identify, define, and use short-run SPC rules. (Apply)

G. Process and Performance Capability

1. Process capability studies

Define, describe, calculate, and use process capability studies, including identifying characteristics, specifications, and tolerances, developing sampling plans for such studies, establishing statistical control, etc. (Analyze)

2. Process performance vs. specifications

Distinguish between natural process limits and specification limits, and calculate percent defective. (Analyze)

3. Process capability indices

Define, select, and calculate Cp, Cpk, Cpm, and Cr, and evaluate process capability. (Evaluate)
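
A minimal sketch of Cp and Cpk from assumed specification limits and an estimated process mean and standard deviation; it also estimates the fraction out of specification for a stable, normally distributed process (which relates to item 2 above). All numbers are hypothetical.

```python
# Process capability sketch with assumed specs and process estimates.
from scipy import stats

lsl, usl = 9.7, 10.3          # lower/upper specification limits (assumed)
mu, sigma = 10.05, 0.08       # estimated process mean and std dev (assumed)

cp = (usl - lsl) / (6 * sigma)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)
frac_out = stats.norm.cdf(lsl, mu, sigma) + stats.norm.sf(usl, mu, sigma)

print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, expected fraction out of spec = {frac_out:.4%}")
```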

4. Process performance indices

Define, select, and calculate Pp and Ppk and evaluate process performance. (Evaluate)

H. Design and Analysis of Experiments

1. Terminology

Define terms such as dependent and independent variables, factors, levels, response, treatment, error, and replication. (Understand)

2. Planning and organizing experiments

Define, describe, and apply the basic elements of designed experiments, including determining the experiment objective, selecting factors, responses, and measurement methods, choosing the appropriate design, etc. (Analyze)

3. Design principles

Define and apply the principles of power and sample size, balance, replication, order, efficiency, randomization, blocking, interaction, and confounding. (Apply)

4. One-factor experiments

Construct one-factor experiments such as completely randomized, randomized block, and Latin square designs, and use computational and graphical methods to analyze the significance of results. (Analyze)

5. Full-factorial experiments

Construct full-factorial designs and use computational and graphical methods to analyze the significance of results. (Analyze)
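
For illustration, a 2^3 full-factorial design matrix in coded (-1/+1) units can be generated directly; the three factor names below are hypothetical.

```python
# Construct a 2^3 full-factorial design matrix in coded units.
from itertools import product

factors = ["temperature", "pressure", "time"]
design = list(product([-1, +1], repeat=len(factors)))

print(" run  " + "  ".join(f"{f:>11}" for f in factors))
for run, levels in enumerate(design, start=1):
    print(f"{run:>4}  " + "  ".join(f"{lvl:>11}" for lvl in levels))
```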

6. Two-level fractional factorial experiments

Construct two-level fractional factorial designs (including Taguchi designs) and apply computational and graphical methods to analyze the significance of results. (Analyze)