
Advanced Statistical Analysis of Laboratory Data

Course Duration

5 Days

Course Description

The analytical laboratory has always served an important function by providing data in support of other branches of science and engineering and by helping to control product quality or process variables. In recent years, however, the laboratory has in many cases come into its own as a semi-independent entity, geared to solving problems by means of the techniques available to it rather than serving only to provide data for others to interpret. Whether these problems are solved independently or by co-operative effort is not important. What is important is that the broad capabilities of the laboratory be recognized. Where this is the case, the laboratory can exercise a unique function in developing information essential to the organization of which it is a part, and its staff will enjoy the prestige among their peers that they merit. Generally, the function of the industrial analytical laboratory is designated as process control, product quality control, technical service, or research and development. Often, however, two or more of these functions are exercised in the same laboratory, sometimes by designating personnel for a specific type of assignment. It should be emphasized that although the majority of analytical laboratories may be those serving industry and can be placed in one of these four categories, many operations and problems are similar, regardless of the laboratory’s affiliation, and can be viewed from the same perspective.

The purpose of any analytical measurement is to obtain consistent, reliable and accurate data. There is no doubt that incorrect measurement results can lead to tremendous costs. In addition, reporting incorrect analytical results at any particular time leads to a loss of confidence in the validity of the laboratory’s future results. Therefore, every laboratory should do its utmost to ensure that it measures and reports reliable and accurate data within a known level of confidence. Validation and qualification of processes and equipment help meet this goal.

The appraisal of quality has a considerable impact on analytical laboratories. Laboratories have to manage the quality of their services and convince clients that the advocated level of quality is attained and maintained. Increasingly, accreditation is demanded or used as evidence of reliability. Quality control is not meaningful unless the methodology used has been validated properly. Validation of a methodology means demonstrating that the methodology is suitable to provide useful analytical data. A method is validated when its performance characteristics are adequate and when it has been established that the measurement is under statistical control and produces accurate results.

Even when a laboratory has met all qualification and accreditation requirements, its reported data are still subject to verification and challenge. The quality of a chemical analysis is usually evaluated by comparing its uncertainty with the requirements of the users of the analysis. If the analytical results are consistent and have a small uncertainty relative to those requirements, the data are considered to be of adequate quality. When the results are excessively variable or the uncertainty is larger than the needs, the analytical results are of low or inadequate quality. Thus, the evaluation of the quality of analytical results is a relative determination: what is high quality for one sample could be unacceptable for another. A quantitative measurement is always an estimate of the true value of the measurand and involves some level of uncertainty. The limits of that uncertainty must be known within a stated probability, otherwise no use can be made of the measurement. Measurements must therefore be made in a way that provides this statistical predictability.

Statistics is an integral part of the quality assessment of analytical results. The concept of a frequency distribution, which embodies the behaviour of chance fluctuations, is a felicitous one for the description of many pertinent aspects of measurement. If this concept is combined with the principle of least squares, by which the inconsistencies of measurements are compensated, and with the modern ideas underlying “inverse probability,” which allow us to make quantitative statements about the causes of observed chance events, we obtain an impressive body of useful knowledge. Nevertheless, it is by no means certain that a systematic science of data analysis, if and when it is finally developed, will be based exclusively on probabilistic concepts. Undoubtedly probability will always play an important role in data analysis, but it is rather likely that principles of a different nature will also be invoked in the final formulation of such a science. In the meantime we must make use of whatever methods are available to us for a meaningful approach to the analysis of experimental data.

This seminar is designed to provide participants with the knowledge and skills required to perform advanced statistical calculations in modern analytical laboratories. The seminar starts by reviewing participants’ existing knowledge of the fundamental concepts of statistics. Method development and validation are then discussed, including the quality requirements of the ISO 17025 standard. Participants are then introduced to the process of measurement uncertainty estimation: identifying uncertainty sources, quantifying them, and reporting the combined uncertainty. The seminar then covers the various calibration functions and the types of statistical quality control (SQC) charts, and wraps up with the procedures and methods used to interpret inter- and intra-laboratory data. Participants will have the opportunity to apply the principles learned to actual problems through illustrative case studies under the guidance of the instructor. Through a combination of lectures and problem-solving sessions, participants will learn advanced statistical techniques that they can put to immediate use in their laboratory.
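As a small taste of the uncertainty-budget exercises in this seminar, the Python sketch below combines a few hypothetical, independent standard uncertainty components in quadrature and reports an expanded uncertainty with a coverage factor of k = 2 (roughly 95 % coverage), in the spirit of the GUM approach. The component values, their sources and the measured result are illustrative assumptions only, not data from the course.

    import math

    # Hypothetical standard uncertainty components for a single result, all
    # already expressed as standard uncertainties in the same unit (mg/L).
    u_repeatability = 0.12   # from replicate measurements (Type A evaluation)
    u_calibration   = 0.08   # from the calibration certificate (Type B evaluation)
    u_volume        = 0.05   # from glassware tolerance (Type B evaluation)

    # Combined standard uncertainty: root sum of squares of independent components.
    u_combined = math.sqrt(u_repeatability**2 + u_calibration**2 + u_volume**2)

    # Expanded uncertainty with coverage factor k = 2, giving roughly 95 % coverage
    # when the combined uncertainty is approximately normally distributed.
    k = 2
    U = k * u_combined

    result = 5.43  # hypothetical measured concentration, mg/L
    print(f"Result: {result:.2f} ± {U:.2f} mg/L (k = {k})")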

Course Objective

Upon the successful completion of the seminar, participants will be able to:

  • Review statistical formulas used in QC/QA and illustrate method development & validation
  • Identify the proper procedure for analytical measurement and uncertainty estimation, including uncertainty sources, error and uncertainty, method validation and traceability
  • Explain the uncertainty evaluation procedure of the Guide to the Expression of Uncertainty in Measurement (GUM), and use prior collaborative method development and validation study data
  • Calculate the combined uncertainty and analyze the results based on standard and expanded uncertainty reports
  • Explain the calibration functions, which include the establishment of an analytical range, determination of the calibration function, and verification of linearity, precision and recovery (a minimal calibration sketch follows this list)
  • Enumerate the types of statistical quality control (SQC) charts and interpret inter- and intra-laboratory data
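To make the calibration-function objective above concrete, the sketch below fits a straight-line calibration to a set of hypothetical standard solutions by ordinary least squares, uses the residuals and the coefficient of determination as an informal linearity check, and predicts the concentration of an unknown sample from its response. The concentrations, responses and the unknown’s signal are invented for illustration and do not come from the course material.

    # Minimal linear calibration sketch with hypothetical standards
    # (concentration in mg/L vs. instrument response), fitted by
    # ordinary least squares without external libraries.
    conc     = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
    response = [0.02, 0.41, 0.79, 1.22, 1.58, 2.01]

    n = len(conc)
    mean_x = sum(conc) / n
    mean_y = sum(response) / n

    # Slope and intercept of the least-squares line y = a + b*x.
    sxx = sum((x - mean_x) ** 2 for x in conc)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, response))
    b = sxy / sxx
    a = mean_y - b * mean_x

    # Residuals and coefficient of determination as a quick linearity check.
    residuals = [y - (a + b * x) for x, y in zip(conc, response)]
    ss_res = sum(r ** 2 for r in residuals)
    ss_tot = sum((y - mean_y) ** 2 for y in response)
    r_squared = 1 - ss_res / ss_tot

    # Predict the concentration of an unknown sample from its response.
    unknown_response = 1.10
    predicted_conc = (unknown_response - a) / b

    print(f"Calibration: y = {a:.4f} + {b:.4f} x, R^2 = {r_squared:.4f}")
    print(f"Unknown sample: {predicted_conc:.2f} mg/L")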

Course Certificate

A UMETTS certificate will be issued to all attendees who complete the total tuition hours of the course.

Who Should Attend?

This seminar is aimed at degree-holding staff of analytical laboratories. R&D and government statutory employees are also encouraged to attend. The seminar is particularly relevant for QA/QC employees and for third-party inspection and certification companies.

As a prerequisite for this advanced seminar, participants should have sufficient knowledge and skills in basic statistics as applied in the analytical laboratory.

Course Outline

Key Topics You Will Learn About

  • How to understand the strengths and weaknesses of data
  • How to recognize and reduce different types of errors
  • Ways to carry out significance tests
  • How to correctly use outlier tests and when not to use them
  • Ways of defining the limits of detection, determination, and quantification
  • How to know which statistical test to use, and when
  • How to understand the influence of sample size on statistical significance and power
  • Why pooling variances gives stability to analytical results
  • How to set in-house specifications
  • How to apply statistical process control charts to measurement processes (see the sketch below)
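As a preview of the control-charting topic in the last point above, the sketch below derives Shewhart-style control limits (centre line ± 3 standard deviations) from a series of hypothetical control-sample results and flags any result falling outside those limits. The data and the simple 3-sigma rule are illustrative assumptions; the seminar itself covers the full range of SQC chart types and interpretation rules.

    import statistics

    # Hypothetical results for a control sample analysed on successive days (mg/L).
    control_results = [5.02, 4.98, 5.05, 4.96, 5.01, 5.10, 4.93, 5.04, 5.22, 4.99]

    # Centre line and Shewhart-style control limits from the mean and the
    # sample standard deviation of the series.
    centre = statistics.mean(control_results)
    s = statistics.stdev(control_results)
    ucl = centre + 3 * s   # upper control limit
    lcl = centre - 3 * s   # lower control limit

    print(f"Centre line: {centre:.3f}  UCL: {ucl:.3f}  LCL: {lcl:.3f}")

    # Flag any result outside the control limits as a possible out-of-control signal.
    for day, value in enumerate(control_results, start=1):
        if value > ucl or value < lcl:
            print(f"Day {day}: {value:.2f} is outside the control limits")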