Current Research

Discrete Modeling Problems in Statistical Process Control

This research encompasses ongoing methods development for several common problem settings, including rare event data, small sample sizes, and aggregated or mixed distributions. Recent examples include the development and investigation of geometric-based statistical process control methods, the use of alternative discrete probability models, economic chart optimization, the impact of data aggregation on the performance of attribute SPC methods, risk-adjusted methods for non-homogeneous events, and the development of research code for start-up control charts. Related research has developed theoretical probability models using compound, cluster, and mixed distributions for physical phenomena encountered in a variety of applications, such as semiconductor integrated circuits and non-homogeneous surgery patients. Several pieces of research-level code have been developed to facilitate the construction and investigation of these methods, and web-based Java programs have been developed to investigate optimal g chart designs.
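
As a simple illustration of the compound-distribution idea (a generic sketch, not the specific models developed in this work), a Poisson count whose rate is itself gamma-distributed, a classic model for clustered defects, has a negative binomial marginal distribution; all parameter values below are hypothetical:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Gamma-mixed Poisson: each unit's defect rate lambda is gamma-distributed,
    # so the marginal defect count is negative binomial (a classic cluster model).
    shape, scale = 2.0, 1.5          # hypothetical gamma parameters
    lam = rng.gamma(shape, scale, size=100_000)
    counts = rng.poisson(lam)

    # The equivalent negative binomial has n = shape, p = 1 / (1 + scale).
    nb = stats.nbinom(shape, 1.0 / (1.0 + scale))
    print("simulated mean/var:      ", counts.mean(), counts.var())
    print("negative binomial mean/var:", nb.mean(), nb.var())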

Statistical Quality Control Methods for Health Systems Problems (NSF)

This research is developing and evaluating new statistical quality control methods for healthcare adverse events, such as medication and laboratory errors, hospital-acquired infections, and needle stick injuries. New methods are being developed to monitor rare events based on inverse binomial sampling and on mixtures of non-identical distributions, to incorporate logistic regression and other approaches into a general risk-adjustment framework, and to account for aggregation of non-homogeneous patients. Numeric programs and an optimization search algorithm are being developed to investigate the statistical performance of these methods and to determine their optimal economic and statistical-economic designs. Conditions are being identified under which these methods perform better than traditional surveillance methods, and design and selection guidelines are being developed. This research will lead to improved surveillance methods for controlling and reducing preventable adverse healthcare events, which are estimated to result in 770,000 to 2 million patient injuries, 45,000 to 98,000 deaths, and $8.8 billion in costs annually nationwide. The developed methods are being validated empirically using large databases at several academic hospitals in the U.S. and abroad. The results also will benefit similar quality control problems in high yield manufacturing and other industries.
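
As a rough sketch of the general logistic-regression risk-adjustment idea (not the specific framework under development), one can fit a patient-level risk model and then accumulate observed-minus-expected events; the covariates and coefficients below are entirely hypothetical:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)

    # Hypothetical patient covariates (e.g., age, severity score) and outcomes.
    n = 2000
    X = np.column_stack([rng.normal(65, 10, n), rng.normal(0, 1, n)])
    true_logit = -3.0 + 0.03 * (X[:, 0] - 65) + 0.8 * X[:, 1]
    y = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

    # Risk model: each patient's predicted probability of an adverse event.
    model = LogisticRegression().fit(X, y)
    p_hat = model.predict_proba(X)[:, 1]

    # Risk-adjusted comparison: observed minus expected events, cumulated
    # over patients in treatment order (the basis of O-E style monitoring).
    o_minus_e = np.cumsum(y - p_hat)
    print("final O-E:", o_minus_e[-1])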

Geometric-Based Statistical Quality Control Methods

This research is examining quality control approaches for low defect processes, such as high yield manufacturing systems and adverse healthcare events, with particular emphasis on events-between g and h control charts based on inverse sampling from geometric and negative binomial distributions. Several properties and design modifications of these new charts have been shown to significantly improve statistical operating characteristics over conventional methods, particularly for infrequent events and low defect rates. Research code has been developed to analyze chart performance, determine the optimal chart design by several criteria, and compare results with other approaches. Important healthcare applications include hospital-acquired infections, heart surgery complications, catheter-related infections, surgical-site infections, contaminated needle sticks, and other iatrogenic outcomes.
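
For illustration, a minimal sketch of probability-based limits for a g chart, assuming the number of cases between consecutive adverse events follows a geometric distribution with a known event rate (the rate and false alarm values below are illustrative only):

    from scipy import stats

    # g chart: plot the number of cases between consecutive adverse events.
    # Under a geometric model with event probability p, probability-based
    # limits are quantiles of the geometric distribution.
    p = 0.002            # hypothetical adverse event rate
    alpha = 0.0027       # false alarm rate matching the 3-sigma convention
    g = stats.geom(p)    # support 1, 2, ... (cases until the event)

    lcl = g.ppf(alpha / 2)
    ucl = g.ppf(1 - alpha / 2)
    print("center (mean):", g.mean())
    print("probability limits:", lcl, ucl)

    # Because the geometric distribution is highly skewed, these limits
    # differ substantially from the usual mean +/- 3 sd limits.
    print("3-sigma limits:", g.mean() - 3 * g.std(), g.mean() + 3 * g.std())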

Design of Experiments (DOE)

Recent work in this area includes investigation of the statistical performance of alternative methods for detecting variance effects in replicated factorial industrial experiments, as well as mathematical programming approaches to the simultaneous optimization of response means and variances in designed experiments. Results have been applied in projects involving printed circuit board manufacture, powder metal production, and semiconductor fabrication.
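
One common screening approach for variance (dispersion) effects, sketched generically here rather than as the specific methods investigated, regresses the log of the sample variance at each design point on the coded factor levels:

    import numpy as np

    rng = np.random.default_rng(3)

    # Replicated 2^2 factorial (coded -1/+1 levels), 5 replicates per run.
    design = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]])
    reps = 5
    # Hypothetical data in which factor B inflates the variance.
    sigma = np.exp(0.5 * design[:, 1])
    y = rng.normal(10.0, sigma[:, None], size=(4, reps))

    # Dispersion screening: log sample variance vs. factor levels.
    log_s2 = np.log(y.var(axis=1, ddof=1))
    X = np.column_stack([np.ones(4), design])
    coef, *_ = np.linalg.lstsq(X, log_s2, rcond=None)
    print("dispersion effect estimates (intercept, A, B):", coef)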

Multi-Response Optimization

This work develops methods to optimize processes with multiple responses, particularly in the context of designed experiments, including non-linear mathematical programming, desirability function approaches, and multivariate loss functions. It has led to several new loss probability functions that are used within optimization routines, as well as derivations of the variances and higher moments of standard quadratic loss functions.
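
As one standard point of reference, the desirability function approach converts each fitted response to a 0-1 scale and maximizes the geometric mean; the sketch below uses hypothetical fitted models and target ranges, not results from this work:

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical fitted response models over two coded factors x1, x2.
    def y1(x): return 80 + 5 * x[0] - 3 * x[1]          # to be maximized
    def y2(x): return 2.0 + 0.5 * x[0] + 0.8 * x[1]     # to be minimized

    def desirability(y, lo, hi, larger_is_better):
        # Linear desirability transform, clipped to [0, 1].
        d = (y - lo) / (hi - lo) if larger_is_better else (hi - y) / (hi - lo)
        return float(np.clip(d, 0.0, 1.0))

    def overall(x):
        d1 = desirability(y1(x), 75, 90, True)
        d2 = desirability(y2(x), 1.0, 3.5, False)
        return -np.sqrt(d1 * d2)    # negative geometric mean (for minimize)

    res = minimize(overall, x0=[0.0, 0.0], bounds=[(-1, 1), (-1, 1)])
    print("optimal factors:", res.x, "overall desirability:", -res.fun)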

Mathematical Models of Powdered Metal Production Processes

Powder metallurgy (P/M) part production is becoming a common manufacturing process for fabricating near net shape parts that require precise control over their microstructures. Rather than liquefying and casting the raw material, P/M processes sinter powdered metal into the desired parts, offering an alternative to casting for titanium and other metal alloys that are abrasive in their melted state. This research involves developing a data acquisition system for a metal powder production process and statistically analyzing the effects of various production parameters on yield and quality. The results can be used to control production, statistically describe P/M production and sieving processes, and optimize yield. Initial design of experiments studies have produced first- and second-order polynomial prediction models of particle size mean and standard deviation under different operating conditions (primarily rotational speed and amperage). While simple linear models have been fairly accurate, tests for curvature indicate that modest improvements are possible with higher-order models.
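
A generic sketch of such a second-order fit, using hypothetical coded speed and amperage data rather than actual process measurements:

    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical data: coded rotational speed (x1) and amperage (x2).
    x1 = rng.uniform(-1, 1, 30)
    x2 = rng.uniform(-1, 1, 30)
    size_mean = (50 + 4 * x1 - 2 * x2 + 1.5 * x1 * x2 + 2 * x1**2
                 + rng.normal(0, 0.5, 30))

    # Second-order model: intercept, linear, interaction, and quadratic terms.
    X = np.column_stack([np.ones(30), x1, x2, x1 * x2, x1**2, x2**2])
    beta, *_ = np.linalg.lstsq(X, size_mean, rcond=None)
    print("second-order coefficients:", np.round(beta, 2))

    # A substantial quadratic coefficient is what a curvature test would flag,
    # signaling that a first-order model is inadequate.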

Employee and Patient Safety in High Hazard Areas

This research applies quality engineering and frontier estimation methods to the study of employee and patient safety. SPC methods are being developed for safety applications in which rare events do not arise from a single homogeneous process. Data envelopment and stochastic frontier estimation methods are being used to study the production of employee and patient safety in a cross-section of high-tech and healthcare industries.

Inspection Error

This research examines visual inspection error in a variety of settings, including manufacturing, administrative, financial, and healthcare processes. The work has involved the development and application of probability models, economic models, and optimization methods to maximize the quality of outgoing product at minimal overall cost. Results include statistical and economic models of acceptance sampling plans, the use of multiple 100% (imperfect) inspections, and the derivation of corresponding minimal cost models. Extensions to similar problems in healthcare include cervical and breast cancer screening errors, laboratory diagnostic errors, medication errors, and others.
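
The core probability structure of multiple 100% imperfect inspections can be sketched as follows, assuming independent inspections with fixed per-pass error rates (all parameter values are hypothetical, and the inspection cost term is a simple upper-bound approximation):

    # Outgoing quality and cost after n sequential 100% imperfect inspections.
    p = 0.02        # incoming fraction defective
    sens = 0.90     # probability a defective item is caught per inspection
    fp = 0.01       # probability a good item is falsely rejected per inspection
    c_insp = 0.05   # cost per item per inspection pass
    c_escape = 25.0 # cost of shipping a defective item
    c_scrap = 1.0   # cost of scrapping a good item

    for n in range(1, 6):
        escaped = p * (1 - sens) ** n                 # defectives passing all n screens
        good_lost = (1 - p) * (1 - (1 - fp) ** n)     # good items falsely rejected
        # Inspection cost approximated as n passes over every item.
        cost = n * c_insp + escaped * c_escape + good_lost * c_scrap
        print(n, round(escaped, 6), round(cost, 4))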

Cancer and Clinical Laboratory Screening Models

This research concerns the overall sensitivity, specificity, and cost of laboratory processes for screening Pap smears for early indications of cervical cancer or its precursors, with particular focus on the screening policy required by the Clinical Laboratory Improvement Amendments (CLIA). Mathematical and economic models have been developed that prove that this policy is never optimal by any criterion and always increases total costs, that overall sensitivity under CLIA can never be improved beyond certain mathematical bounds, that no amount of partial rescreening is ever optimal, and that multiple evaluations of each smear are optimal in some cases. The proposed use of automated rescreening technology, as recently approved by the FDA, also can dramatically increase overall costs without significantly improving sensitivity, despite widespread marketing by manufacturers to the contrary.
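
The diminishing returns of repeated screening can be seen in a small calculation: if reads are independent with per-read sensitivity s, overall sensitivity 1 - (1 - s)^n plateaus quickly while overall specificity erodes (the per-read rates below are hypothetical, not those of the published models):

    # Overall sensitivity and specificity of n independent reads of a smear,
    # where any positive read triggers follow-up (hypothetical per-read rates).
    s, spec = 0.80, 0.95
    for n in range(1, 5):
        overall_sens = 1 - (1 - s) ** n
        overall_spec = spec ** n
        print(n, round(overall_sens, 4), round(overall_spec, 4))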

Frontier Estimation Analysis

Existing and new frontier estimation methods, including data envelopment analysis and stochastic frontier estimation, are being used in two separate studies to examine the production efficiency of healthcare quality and the production of safety. This work includes the analysis of several large-scale U.S. healthcare data sets, the introduction of mathematical statistics into data envelopment analysis methods, and a comparison of existing and new methods for stratifying decision making units into peer performance sets.
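
For reference, a minimal sketch of an input-oriented, constant-returns DEA efficiency model solved as a linear program, with a handful of hypothetical decision making units:

    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical data: 5 DMUs (rows), 2 inputs, 1 output.
    X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0], [2.0, 4.0]])
    Y = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]])
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]

    def ccr_efficiency(k):
        # Variables: [theta, lambda_1 .. lambda_n]; minimize theta.
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  sum_j lambda_j * x_ij - theta * x_ik <= 0
        A_in = np.hstack([-X[k].reshape(m, 1), X.T])
        # Outputs: -sum_j lambda_j * y_rj <= -y_rk
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        A = np.vstack([A_in, A_out])
        b = np.r_[np.zeros(m), -Y[k]]
        res = linprog(c, A_ub=A, b_ub=b,
                      bounds=[(0, None)] * (n + 1))
        return res.fun

    for k in range(n):
        print("DMU", k, "efficiency:", round(ccr_efficiency(k), 3))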