Contributed Session: Statistical Issues in Sampling
Estimating the Precision of On-Line Coal Analyzers
G. J. Lyman
Laboratory analyses which reveal the elemental composition of coal obtained by sampling from a continuous stream have long been used by suppliers and purchasers alike to monitor product quality and, hence, to determine a price for the coal. An attractive alternative, which is free from the influence of sampling bias and variability, is to use an on-line coal analyzer which is capable of interrogating an entire coal stream. In contemplating the purchase of such an instrument it is important to be able to assess its precision and to detect systematic differences (biases) between analyzer and laboratory results. We use likelihood methods to obtain a confidence interval for analyzer precision, to assess bias and to determine an appropriate sample size. The efficacy of the methods is studied via theory-based computation and via Monte Carlo methods. We also discuss the use of, and pitfalls inherent in, certain "operational" methods which have been proposed to evaluate the efficacy of on-line analyzers. The methodology is illustrated by application to a set of data obtained in an actual evaluation program. Our work builds upon earlier work of Grubbs (J. Amer. Statist. Assoc. 43 (1948):243--264), Hahn and Nelson (Technometrics 12 (1970):95--102) and Jaech (Technometrics 18 (1976):127--133).
[Fred Lombard, Dept. of Statistics, P.O. Box 524, 2006 Auckland Park, SOUTH AFRICA; email@example.com]
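The Grubbs-type variance decomposition that the abstract builds on can be illustrated with a short simulation. This is a sketch only: the error standard deviations, the bias, and the sample size below are invented for illustration, and the moment estimators shown are Grubbs's (1948) originals rather than the likelihood-based intervals the paper develops.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated paired determinations of ash content (%) on n coal lots:
# 'lab' from laboratory analysis of physical samples, 'ola' from the
# on-line analyzer; all standard deviations and the bias are invented.
n = 500
true = rng.normal(12.0, 1.0, n)          # true ash content per lot
lab = true + rng.normal(0.0, 0.5, n)     # laboratory error, sd 0.5
ola = true + rng.normal(0.1, 0.4, n)     # analyzer error, sd 0.4, bias 0.1

# Grubbs (1948) moment estimators from the 2x2 sample covariance matrix:
# the covariance estimates the variance of the true lot values, and each
# marginal variance minus the covariance estimates one error variance.
S = np.cov(lab, ola)
var_true = S[0, 1]
var_lab = S[0, 0] - S[0, 1]
var_ola = S[1, 1] - S[0, 1]

# A systematic difference (bias) is assessed from the paired differences.
d = ola - lab
bias = d.mean()
bias_se = d.std(ddof=1) / np.sqrt(n)

print(f"analyzer error sd ~ {np.sqrt(var_ola):.3f}")   # near 0.4
print(f"lab error sd      ~ {np.sqrt(var_lab):.3f}")   # near 0.5
print(f"bias ~ {bias:.3f} +/- {2 * bias_se:.3f}")
```

With small samples these moment estimates of error variance can come out negative, which is one motivation for the likelihood treatment and the sample-size guidance the paper provides.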
Defect Rate Estimation & Cost Minimization Using Acceptance Sampling with Rectification
Most of the work in acceptance sampling with rectification has been done under the restrictive assumption that the testing procedure is perfect. I apply acceptance sampling to the problem of quality assurance when the testing procedure is imperfect. The objective is to develop effective rectification sampling plans and estimators based on such plans. I develop estimators under two different sampling plans for the number of undetected defects remaining after a set of lots has been passed. I then compare the sampling plans on the basis of mean squared error (MSE) and cost functions. One of the estimators proposed has an MSE at least one order of magnitude lower than that of existing estimators. Contrary to intuition, I also find that testing the sample three times, instead of once, usually reduces the overall cost but increases the MSE.
[Neerja Wadhwa, 60 Lawn Ave. #19, Stamford, CT 06902 USA; firstname.lastname@example.org]
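The effect of an imperfect test on estimates of undetected defects can be sketched in a small Monte Carlo experiment. The setting below is illustrative and is not one of the paper's plans: single lots, one test per sampled item, a test that misses a true defect with probability 1 - q, and a textbook-style corrected estimator that simply divides detections by q.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative setting (not the paper's actual plans): lots of size N with
# defect rate p; a sample of n items is tested once per item, and the test
# misses a true defect with probability 1 - q (no false positives).
N, n, p, q = 1000, 100, 0.05, 0.8
reps = 5000

err_naive, err_corrected = [], []
for _ in range(reps):
    defects = rng.binomial(N, p)                  # true defects in the lot
    in_sample = rng.hypergeometric(defects, N - defects, n)
    detected = rng.binomial(in_sample, q)         # imperfect test
    # undetected defects remaining after rectifying the detected ones
    truth = (defects - in_sample) + (in_sample - detected)
    # the naive estimator pretends the test is perfect; the corrected one
    # divides by the detection probability q and adds the expected number
    # of defects the test missed inside the sample
    est_naive = (N - n) * detected / n
    est_corrected = (N - n) * detected / (n * q) + detected * (1 - q) / q
    err_naive.append(est_naive - truth)
    err_corrected.append(est_corrected - truth)

bias_naive = np.mean(err_naive)
bias_corrected = np.mean(err_corrected)
print("bias (naive, corrected):", bias_naive, bias_corrected)
print("MSE  (naive, corrected):", np.mean(np.square(err_naive)),
      np.mean(np.square(err_corrected)))
```

The naive estimator is biased downward by roughly N p (1 - q) undetected defects per lot; correcting for q removes the bias at the price of extra variance, which is the bias-variance trade-off underlying the MSE and cost comparisons in the paper.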
Statistical Issues in Nondestructive Materials Evaluation & Related Problems of Sampling Inspection & Standardization
This paper describes the data-processing methodology, as well as the graphical tools for representing results, used in the National Standard of the USSR on nondestructive evaluation of refractory materials. The standard exploits the stochastic relationship between, on the one hand, the resonance frequency of acoustic oscillations excited in a part together with the part's mass and, on the other, characteristics of interest (COI) such as porosity, apparent density, and compressive strength. The statistical part of the standard includes evaluation of the functional form of the relationship (testing for linearity), estimation of the parameters of the regression model, estimation of the accuracy of the calculated COI values, and periodic validation of the regression equation. Since indirect measurements based on stochastic relationships are inevitably subject to statistical errors, 100%-reliable classification of finished parts into good and defective is impossible, and the sampling inspection methodology of MIL-STD-105 fails. A discussion of an alternative approach is given.
[George Zeliger, 1725 Commonwealth Ave., #2, Brighton, MA 02135 USA; email@example.com]
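The core statistical step the abstract describes, calibrating a regression of a COI on an acoustic predictor and quantifying the accuracy of the indirectly measured values, can be sketched as follows. All data are invented; the predictor z standing in for the frequency-mass combination, the COI (apparent density), and the use of 2 as an approximate 97.5% t quantile are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data (invented): predictor z combines the resonance
# frequency and mass of a part; the COI here is apparent density rho.
n = 40
z = rng.uniform(1.0, 3.0, n)
rho = 2.0 + 0.5 * z + rng.normal(0, 0.05, n)

# Least-squares fit of the linear calibration rho = a + b * z.
X = np.column_stack([np.ones(n), z])
beta, *_ = np.linalg.lstsq(X, rho, rcond=None)
resid = rho - X @ beta
s2 = resid @ resid / (n - 2)             # residual variance estimate

# Accuracy of an indirectly measured COI for a new part: an approximate
# 95% prediction interval at z0 (2 used in place of the exact t quantile).
z0 = 2.2
zbar = z.mean()
pred = beta[0] + beta[1] * z0
se_pred = np.sqrt(s2 * (1 + 1 / n + (z0 - zbar) ** 2 / ((z - zbar) ** 2).sum()))
lo, hi = pred - 2 * se_pred, pred + 2 * se_pred
print(f"predicted rho at z0: {pred:.3f}, 95% PI: ({lo:.3f}, {hi:.3f})")
```

In the standard's terms, the linearity test, the parameter estimates, and the prediction-interval width respectively address the functional form, the regression model, and the accuracy of the calculated COI values; the interval width also shows why 100%-reliable good/defective classification is unattainable from an indirect measurement.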
Optimal Allocation for Estimating the Product of k (>= 3) Means
Andrew F. Seila
This paper considers the sample allocation problem for estimating the product of k (>= 3) means from k independent populations with unknown variances, under the constraint of a fixed total sampling budget. A two-stage sampling procedure is presented and shown to be asymptotically efficient in the sense that the mean squared error of the two-stage estimator of the product approaches the theoretical lower bound as the total sampling budget becomes large. Appropriate initial sample sizes in the k populations are obtained to achieve the optimal convergence rate. The results apply to general populations satisfying certain mild conditions. Some limit theorems for the two-stage estimator are also obtained. Simulation results for Bernoulli and normal populations are presented to examine the performance of the procedure for finite budgets and to illustrate the theoretical results. The results have many industrial applications.
[Shen Zheng, Dept. of Statistics, Univ. of Georgia, Athens, GA 30602 USA; firstname.lastname@example.org]
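The abstract does not spell out the allocation rule, so the sketch below uses the standard delta-method allocation for this problem with equal unit costs: a pilot sample from each population, then stage-two sample sizes proportional to sigma_i / |mu_i|, which minimizes the approximate variance of the product of sample means. The population parameters, pilot size, and budget are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-stage allocation sketch (illustrative, equal unit costs): stage 1
# takes a pilot sample of size n0 from each of k = 3 populations; stage 2
# spends the remaining budget proportionally to sigma_i / |mu_i|, the
# delta-method-optimal allocation for the product of means.
mu = np.array([2.0, 5.0, 1.0])
sigma = np.array([0.5, 2.0, 0.3])
budget, n0 = 600, 30                     # total observations, pilot size

def two_stage_product(rng):
    pilot = [rng.normal(mu[i], sigma[i], n0) for i in range(3)]
    m = np.array([p.mean() for p in pilot])
    s = np.array([p.std(ddof=1) for p in pilot])
    w = s / np.abs(m)                    # estimated weights sigma_i/|mu_i|
    n = np.maximum(np.round((budget - 3 * n0) * w / w.sum()), 1).astype(int)
    means = [np.concatenate([pilot[i],
                             rng.normal(mu[i], sigma[i], n[i])]).mean()
             for i in range(3)]
    return np.prod(means)

est = np.array([two_stage_product(rng) for _ in range(2000)])
print("true product:", np.prod(mu), " mean estimate:", est.mean(),
      " rmse:", np.sqrt(np.mean((est - np.prod(mu)) ** 2)))
```

Because the populations are independent, the product of sample means is unbiased for the product of the means; what the allocation controls is the variance, and the paper's asymptotic efficiency result says the two-stage scheme attains the lower bound achievable when the variances are known.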
Date created: 6/5/2001