Journal ArticleDOI

An Introduction to Error Analysis: The Study of Uncertainties in Physical Measurements

E. Richard Cohen
- 01 Jun 1998 - Vol. 9, Iss. 6, pp 022
TL;DR
A review of Taylor's textbook on error analysis. The book is not a statistics text, nor was it intended to be, but an introduction to the mathematics required for the analysis of measurements at the level of a first-year laboratory course.
Abstract
Students in a science or engineering curriculum ought to be introduced early to the requirement that a meaningful measurement result should always be accompanied by a statement of its uncertainty. This book has been written specifically with this objective in mind. That the first edition has been successful in doing this is attested to by its popularity with both faculty and students, and by its translation into six languages. This book is not a statistics text - nor was it intended to be - but an introduction to the mathematics required for the analysis of measurements at the level of a first-year laboratory course.

Part 1 begins with uncertainty as a qualitative concept and builds slowly, using many numerical examples and exercises for the student, to develop methods for quantifying uncertainty and ultimately relating it to the standard deviation of a statistical distribution. Along the way, Taylor develops the rules for expressing and combining (`propagating') uncertainties, and introduces the student to the gaussian (normal) distribution and some of its properties. Part 2 covers, with somewhat more mathematical rigor, specific topics such as data rejection criteria, the binomial and Poisson distributions, covariance and correlation, least-squares fitting, and the chi-squared test.

I was not familiar with the first edition, and from a quick scan of the Preface I looked forward to reading this book and learning something about the state of statistical analysis in first-year university texts today. I was disappointed (in part with what the level of the book implies about the sad state of preparation of today's students). Although there are now two ISO publications (International Vocabulary of Basic and General Terms in Metrology (VIM) and Guide to the Expression of Uncertainty in Measurement (GUM), Geneva, 1993), Taylor makes no mention of either, and never gives a formal definition of `uncertainty' (although he ultimately associates `random uncertainty' with the standard deviation of a gaussian distribution). The book also does not clearly define `error', or the distinction between error and uncertainty. The important point that the `propagation of uncertainty' is additive in terms of variances, and is valid for any distributions with finite variance, is not emphasized; instead Taylor restricts the discussion solely to the normal distribution and those that can be approximated by it.

I also find it unfortunate that the book does not clearly distinguish between the variance of a sample, the variance of a distribution, and the sample estimate of the variance of the distribution. Instead, he accepts the fact that formulas for the variance with either N or N - 1 dividing the sum of the squares of the deviations from the mean exist in the literature and concludes simply: `Nevertheless, you need to be aware of both definitions. In the physics laboratory, using the more conservative... definition... is almost always best.'

In spite of these shortcomings, the book is a significant contribution to a student laboratory reading list, and it is written at a level that facilitates a self-study program. It has an important message to deliver and it appears to be delivering it well.
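To make the review's two technical points concrete, here is a brief sketch in standard textbook notation (the symbols f, x_i, N, and s are generic conventions, not quoted from Taylor or the reviewer). The propagation of uncertainty is additive in the variances for independent input quantities with finite variance:

\sigma_f^2 = \sum_i \left( \frac{\partial f}{\partial x_i} \right)^2 \sigma_{x_i}^2

The two variance formulas the reviewer contrasts, with N or N - 1 dividing the sum of squared deviations from the mean, are

s_N^2 = \frac{1}{N} \sum_{i=1}^{N} (x_i - \bar{x})^2 , \qquad s_{N-1}^2 = \frac{1}{N-1} \sum_{i=1}^{N} (x_i - \bar{x})^2 ,

where the second is the unbiased sample estimate of the distribution variance - the `more conservative' definition the reviewer quotes Taylor as recommending.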


Citations
Journal Article

Processing of gene expression data generated by quantitative real-time RT-PCR.

TL;DR: The Q-Gene software application is a tool for handling complex quantitative real-time PCR experiments at high-throughput scale; it considerably expedites and rationalizes experimental setup, data analysis, and data management while ensuring high reproducibility.
Journal ArticleDOI

Accounting for uncertainty in DEMs from repeat topographic surveys: improved sediment budgets

TL;DR: In this paper, the authors present methods for accounting for uncertainty in DEMs derived from repeat topographic surveys, yielding improved sediment budgets.
Journal ArticleDOI

Gas-Phase Databases for Quantitative Infrared Spectroscopy

TL;DR: The National Institute of Standards and Technology and the Pacific Northwest National Laboratory are each creating quantitative databases containing the vapor-phase infrared spectra of pure chemicals; the two databases include different classes of compounds and were compared using 12 samples.
Journal ArticleDOI

The Correlation Coefficient: An Overview

TL;DR: This paper discusses the uses of the correlation coefficient r, either as a way to infer correlation or to test linearity, and recommends the Fisher z transformation instead of raw r values because r is not normally distributed but z is (at least approximately).
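For reference, the Fisher transformation mentioned above has the standard form (n denotes the sample size; this is textbook notation, not quoted from the paper itself):

z = \operatorname{arctanh}(r) = \frac{1}{2} \ln \frac{1 + r}{1 - r}

and z is approximately normally distributed with standard error 1 / \sqrt{n - 3}, which is what makes it preferable to r for inference.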