Statistical Treatment of Analytical Data
By
Zee B. Alfassi
Publisher
Wiley Blackwell
Number Of Pages: 272
Publication Date: 2004-11-24
ISBN-10 / ASIN: 0632053674
ISBN-13 / EAN: 9780632053674
Introduction
1.1 Statistics and quality assurance, control and assessment
The appraisal of quality has a considerable impact on analytical laboratories.
Laboratories have to manage the quality of their services and to convince clients
that the advocated level of quality is attained and maintained. Increasingly,
accreditation is demanded or used as evidence of reliability. At present there are American
and European standards (ISO 25 and EN45001) that describe how a laboratory
ought to be organized in order to manage the quality of its results. These standards
form the basis for accreditation of analytical labs. Terms used frequently are quality
assurance and quality control. Quality assurance is a wider term which includes both
quality control and quality assessment.
Quality control of analytical data (QCAD) was defined by the ISO Committee
as: ‘The set of procedures undertaken by the laboratory for continuous monitoring
of operations and results in order to decide whether the results are reliable enough
to be released’. QCAD primarily monitors the batch-wise accuracy of results on
quality control materials, and precision on independent replicate analysis of ‘test
materials’. Quality assessment was defined (Taylor 1987) as ‘those procedures and
activities utilized to verify that the quality control system is operating within acceptable
limits and to evaluate the data’.
The standards of quality assurance (American ISO 25; European EN 45001) were
written for laboratories that do analyses of a routine nature and give criteria for the
implementation of a quality system which ensures an output with performance
characteristics stated by the laboratory. An important aspect of the quality assurance
system is the full documentation of the whole analysis process. It is essential to have
well designed and clear worksheets. On the worksheets both the raw data and the
calculated results of the analyses should be written. Proper worksheets reduce the
chances of computing error and enable reconstruction of the test if it appears that a
problem has occurred. The quality assurance system (or Standard) also treats the
problems of personnel, equipment, materials and chemicals. The most important
item is the methodology of the analysis. Quality control is not meaningful unless
the methodology used has been validated properly. Validation of a methodology
means the proof of suitability of this methodology to provide useful analytical data.
A method is validated when the performance characteristics of the method are
adequate and when it has been established that the measurement is under statistical
control and produces accurate results.
‘Statistical control’ is defined as ‘A phenomenon will be said to be ‘‘statistically
controlled’’ when, through the use of past experience, we can predict, at least
within limits, how the phenomenon may be expected to vary in the future.
Here it is understood that prediction means that we can state at least
approximately, the probability that the observed phenomenon will fall within the
given limits.’
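The quoted definition can be illustrated with a minimal control-chart sketch. The data, the check-sample idea, and the mean ± 3s limits below are an illustrative assumption, not taken from the book:

```python
import math

# hypothetical past quality-control results for a check sample
history = [5.02, 4.98, 5.05, 4.97, 5.01, 5.03, 4.99, 5.00]
n = len(history)
mean = sum(history) / n
# sample standard deviation (n - 1 denominator)
s = math.sqrt(sum((x - mean) ** 2 for x in history) / (n - 1))

# Shewhart-style limits: future results are predicted, with high
# probability, to fall within mean +/- 3s
lower, upper = mean - 3 * s, mean + 3 * s

def in_control(result):
    """True if a new result falls inside the predicted limits."""
    return lower <= result <= upper
```

Past experience (the history) fixes the limits within which future results are expected to vary, which is exactly the predictability the definition demands.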
The quality assurance systems required for accreditation of analytical laboratories
are very important and are dealt with in several recent books (Kateman &
Buydens 1987; Günzler 1994; Funk et al. 1995; Pritchard 1997). However, these
systems are well beyond the scope of this book, which will be devoted mainly to
quality assessment of analytical data.
The quality of chemical analysis is usually evaluated on the basis of its uncertainty
compared to the requirements of the users of the analysis. If the analytical
results are consistent and have small uncertainty compared to the requirements, e.g.
minimum or maximum concentration of special elements in the sample and its
tolerances, the analytical data are considered to be of adequate quality. When the
results are excessively variable, or the uncertainty is larger than the users' needs,
the analytical results are of low or inadequate quality. Thus, the evaluation of the
quality of analysis results is a relative determination. What is high quality for one
sample could be unacceptable for another. A quantitative measurement is always an
estimate of the real value of the measured quantity and involves some level of
uncertainty. The limits of the uncertainty must be known within a stated probability,
otherwise no use can be made of the measurement. Measurements must be made in a
way that provides this statistical predictability.
Statistics is an integral part of quality assessment of analytical results, e.g. to
calculate the precision of the measurements and to find whether two sets of
measurements are equivalent (in other words, whether two different methods give
the same result for one sample).
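As a sketch of such an equivalence check, the pooled two-sample t statistic below compares two hypothetical sets of replicate results. The data and the quoted 95% critical value are illustrative assumptions, not from the book:

```python
import math

def two_sample_t(a, b):
    """Pooled-variance two-sample t statistic for comparing two
    sets of replicate measurements."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # sample variances (n - 1 denominator)
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    # pooled variance over na + nb - 2 degrees of freedom
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2

# hypothetical replicate results (mg/L) from two methods on one sample
method_a = [10.2, 10.4, 10.3, 10.5]
method_b = [10.6, 10.7, 10.5, 10.8]
t, df = two_sample_t(method_a, method_b)
# if |t| exceeds the tabulated critical value (2.447 for df = 6 at
# the 95% level), the two methods do not give the same result
```

Here |t| ≈ 3.29 with 6 degrees of freedom, so for these illustrative data the two methods would be judged non-equivalent at the 95% level.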
Precise and accurate, which are synonyms in everyday language, have distinctly
different meanings in analytical chemistry. A method can be precise, meaning
that repeated experiments give very close results, yet inaccurate, because the
measured value differs from the true value due to a systematic error in the
system. For example, the deuterium content of a H2O/D2O mixture used to be
determined by the addition of LiAlH4, which reduces the water to hydrogen gas.
The gas is transferred and measured by a mass spectrometer. However, it was
found that although the method is precise, it is inaccurate since there is an isotope
effect in the formation of the hydrogen.
Figure 1.1 explains simply the difference between precision and accuracy. Statistics
deals mainly with precision, while accuracy can be studied by comparison with
known standards; in this case, statistics plays a role in deciding whether the results
are the same or not.
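A small numerical sketch of the distinction, using hypothetical replicate data: the standard deviation of the replicates measures precision, while the bias of their mean from a certified value measures accuracy:

```python
import math

# hypothetical replicate determinations of a standard whose
# certified ("true") value is 50.0
replicates = [48.1, 48.3, 48.2, 48.2, 48.4]
true_value = 50.0

n = len(replicates)
mean = sum(replicates) / n
# precision: spread of repeated results (sample standard deviation)
s = math.sqrt(sum((x - mean) ** 2 for x in replicates) / (n - 1))
# accuracy: closeness of the mean to the true value (bias)
bias = mean - true_value
# a small s together with a large bias is exactly the "precise but
# inaccurate" situation described in the text
```

For these illustrative data the spread s is about 0.11 while the bias is about -1.76, i.e. tightly clustered results that are nevertheless systematically wrong.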
Older books dealt only with statistical methods; however, the trend in the last
decade has been to include other mathematical methods used in analytical chemistry.
Many analytical chemists use computer programs to compute the
areas of the various peaks in a spectrum or a chromatogram (in a spectrum the
intensity of the signal is plotted vs. the wavelength, or the mass in mass spectra,
while in a chromatogram it is plotted as a function of the time of the separation
process). Another example is the use of the Fourier transform, either in ‘Fourier
Transform Spectroscopy’ (mainly FTIR and FT-NMR, but recently other
spectroscopies as well) or in the smoothing of experimental curves. The combination of statistics …
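As a sketch of Fourier smoothing (a hypothetical illustration, not the book's implementation): transform a noisy peak, zero the high-frequency coefficients where the noise lives, and transform back. A naive DFT is used so the example stays self-contained; real software would use an FFT routine:

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (fine for short signals)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Inverse DFT, returning the real part of each sample."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)).real / N
            for n in range(N)]

def fourier_smooth(y, keep):
    """Zero all but the lowest `keep` frequency pairs, then invert:
    fast noise is removed, the slow peak shape remains."""
    N = len(y)
    Y = dft(y)
    for k in range(N):
        # retain the DC term plus the `keep` lowest +/- frequencies
        if min(k, N - k) > keep:
            Y[k] = 0
    return idft(Y)

# hypothetical noisy "peak": a slow Gaussian plus fast sinusoidal noise
N = 64
signal = [math.exp(-((n - 32) / 8) ** 2) + 0.1 * math.sin(2.5 * n) for n in range(N)]
smooth = fourier_smooth(signal, keep=6)
```

The choice `keep=6` is an assumption tuned to this synthetic peak; in practice the cutoff is chosen so the retained band covers the peak's spectral width but not the noise.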
Link
http://rapidshare.com/files/232893511/0632053674_Statistical_Treatment.rar
or
http://www.filefactory.com/file/agh36f1/n/0632053674_Statistical_Treatment_rar