Figure 3.1. Structure of model for microbial risk; the model's output is the risk estimate with attendant uncertainty (Marks et al., 1998; with permission from Risk Analysis).

four-element framework of the NRC (1983). This framework is sufficiently general to be useful for chemical and microbial hazards (Fig. 3.1; Marks et al., 1998). As discussed above, the output of a risk assessment is the estimation of the probability and severity of adverse outcomes for given scenarios, according to our modification of Kaplan's definition of risk.

Hazard Identification (HI) is the first element of the 1983 NRC framework; it describes the nature of the problem and the agents that cause adverse effects in a given scenario. Many types of adverse outcomes or "end points" can be considered. Toxicological or epidemiological studies are used to demonstrate an association of the hazard in food or water with human health risk. However, the identification of a hazard may be controversial, especially for chemical risk assessments that depend on extrapolation from animal studies and may consider only a single chronic "end point."

Exposure Assessment (EA) is the second element; it focuses on modeling the occurrence and level of hazards, and the potential ingestion of the hazards in the food, which cause or contribute to adverse outcomes. An EA would typically include an assessment of a hazard in a particular food for given scenarios that describe the production, processing, distribution, and preparation of the food. In addition, the EA must assess the eating habits of the target populations. This assessment is often accomplished by examining consumption surveys or large databases of surveys such as the USDA Continuing Survey of Food Intake by Individuals. Often, however, there is difficulty in categorizing the foods that are surveyed so that they correspond to the types of foods that contain the hazards.

Chemical risk assessment must take into account the fact that a chemical in food can undergo changes during processing and preparation. Thus, to realistically model exposure, an understanding of food chemistry of the hazardous chemical is necessary. In microbiological risk assessment, the concern is possible continuous growth and decline of pathogens in the food. Methodologies to realistically model chemical changes and microbial growth and decline are still under development by risk assessors.

Stakeholders should know that many technical assumptions for EA are based on very limited data. Because a great deal of information is not known with certainty, simplifying assumptions are often made that can lead to an overstatement of confidence in the results. For example, the major distinguishing feature for microbial pathogens is modeling to account for the dynamics of microbial growth and decline, termed predictive microbiology. However, at the time of this writing, EA models for microbial pathogens in foods have not explicitly distinguished strain variability, which can be large for some bacterial pathogens. Data are usually available for only a few strains, or for a cocktail (mixture) of several strains, which may differ taxonomically and biologically from the hazard of interest. From the behavior of a few strains, inferences are made for all strains, without regard to population variability. Another example is that predictive microbiology models are designed to be "conservative" rather than unbiased. Reasons for bias include features of the experimental design, such as the use of high levels of pure cultures of a cocktail of pathogen strains grown under optimal conditions in complete nutrient broth, in the total absence of the competing microflora of foods. In reality, pathogen growth is influenced by many factors not explicitly accounted for in the models. Another emerging facet of EA, in which simplifying assumptions are made, is modeling the potential for person-to-person transmission in addition to dietary exposure for certain foodborne disease agents (Eisenberg et al., 1996).
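Predictive microbiology models of the kind discussed above are usually built from primary growth models. The sketch below uses the widely cited modified Gompertz form; the parameter values are purely illustrative assumptions, not fitted to any pathogen or food, and are chosen only to show how lag, growth rate, and maximum density shape the predicted curve.

```python
import math

def gompertz_log_count(t, n0, nmax, mu_max, lag):
    """Modified Gompertz primary growth model: log10 CFU/g at time t (h).

    n0, nmax: initial and maximum log10 counts; mu_max: maximum specific
    growth rate (log10 units/h); lag: lag time (h). All parameter values
    used below are illustrative assumptions, not data.
    """
    a = nmax - n0  # total log10 increase possible
    return n0 + a * math.exp(-math.exp(mu_max * math.e / a * (lag - t) + 1.0))

# Trace an assumed growth curve over 24 h.
for t in (0, 6, 12, 24):
    y = gompertz_log_count(t, n0=2.0, nmax=9.0, mu_max=0.5, lag=4.0)
    print(f"t = {t:2d} h: {y:.2f} log10 CFU/g")
```

The curve stays near the initial count during the lag phase, rises at roughly `mu_max` per hour afterward, and flattens toward `nmax`; a decline (inactivation) model would need a separate primary model in the same spirit.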

Dose-Response Assessment (DRA), the third element of the NRC framework, involves modeling the relationship between the ingested dose of the hazard and the likelihood and severity of the adverse effect. Much of the work of dose-response modelers in chemical risk assessment involves analysis of data from animal studies. For microbial risk assessment, animal studies are not often used; instead, the DRA depends primarily on data from a small set of controlled clinical studies in which human volunteers were administered the hazard, usually at high doses. In chemical risk assessment, mechanistic or genetic considerations can be applied that may contradict the results of animal studies (see the U.S. EPA's FQPA materials at www.epa.gov).

When extrapolating beyond the observed range of the data from clinical or animal studies to the low-dose region, the model form can have dramatic effects on the outcome (Coleman, 1998). Another issue with which dose-response modelers must wrestle, particularly in the microbial area, is the development of surrogate dose-response models in the absence of data for the hazard of interest. Chemical risk assessors make inferences about chemicals for which no dose-response information is known (U.S. EPA and LogiChem, 1997) from chemicals of similar chemical structure for which some information is known. In the microbial area, however, comparable knowledge for making such inferences is apparently not available. Questions about the structural aspects of host-pathogen interactions must be considered to determine plausible surrogates. For microbial risk assessors, selection of surrogate dose-response models will continue to be of interest as long as new pathogenic strains evolve and are recognized. Outputs of the dose-response model are the frequency and severity of human foodborne illness at a given exposure or ingested dose.
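The two model forms most commonly discussed for microbial dose-response are the exponential and the (approximate) beta-Poisson. The sketch below implements both; no parameter values are taken from the text, so any values passed in should be treated as assumptions for illustration.

```python
import math

def exponential_dr(dose, r):
    """Exponential dose-response: each ingested organism independently
    initiates illness with probability r."""
    return 1.0 - math.exp(-r * dose)

def beta_poisson_dr(dose, alpha, n50):
    """Approximate beta-Poisson dose-response, parameterized by the
    shape alpha and the median infectious dose n50."""
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

# Illustrative curves: low-dose extrapolation diverges between forms.
for dose in (1.0, 100.0, 10000.0):
    print(f"dose {dose:>8.0f}: exp {exponential_dr(dose, 1e-3):.4f}  "
          f"beta-Poisson {beta_poisson_dr(dose, 0.25, 10000.0):.4f}")
```

By construction the beta-Poisson form returns exactly 0.5 at `dose = n50`, which is a convenient sanity check; the divergence of the two forms at low doses is the extrapolation problem noted above.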

The complexity of predicting frequency or probability and severity of illness must be emphasized by risk assessors. Illness is a complex function of variability in all aspects of the epidemiological disease triangle of host, pathogen, and environment (matrix) effects and their interactions. A clear association between age of human hosts and frequency of illness for microbial hazards has emerged from epidemiological surveillance and outbreak investigations (CDC, 1998; Coleman, 1998; Terajima et al., 1999). However, these data are not an ideal proxy for age dependence in dose-response relationships because ingested doses are unknown. Pathogen strains are likely to vary in many aspects of growth, physiology, and both the presence and expression of virulence genes. An example of an environmental effect is that fat in foods appears to provide a protective environment for pathogens, enabling them to survive in inhospitable surroundings. A tremendous amount of controversy is associated with DRA.

Risk Characterization (RC), the fourth element of the NRC framework, begins with linking the output of the EA models with the DRA models to predict the frequency and severity of human illness (the consequence) for given scenarios. RC commonly relies on techniques such as Monte Carlo simulation. Principles of good practice for Monte Carlo simulation have been published to guide risk assessors in developing sound risk assessment models (Burmaster and Anderson, 1994). The major output of RC is a series of distributions of the frequency and severity of illness for the subpopulations of interest.
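A minimal sketch of the RC linkage described above: Monte Carlo draws from an exposure distribution are pushed through a dose-response model to yield a distribution of per-serving illness risk. The lognormal dose parameters and the exponential dose-response parameter `r` are invented for illustration, not taken from any assessment.

```python
import math
import random

random.seed(1)  # reproducible illustration

def exposure_dose():
    """One ingested dose (CFU) drawn from an assumed lognormal
    exposure distribution (mu and sigma are illustrative)."""
    return math.exp(random.gauss(2.0, 1.5))

def p_illness(dose, r=1e-4):
    """Exponential dose-response with an assumed per-organism risk r."""
    return 1.0 - math.exp(-r * dose)

# Monte Carlo: simulate 10,000 servings and summarize the risk distribution.
risks = sorted(p_illness(exposure_dose()) for _ in range(10000))
mean_risk = sum(risks) / len(risks)
p95 = risks[int(0.95 * len(risks))]
print(f"mean per-serving risk ~ {mean_risk:.2e}; 95th percentile ~ {p95:.2e}")
```

A full RC would repeat this for each subpopulation and scenario, and would separate variability from uncertainty (e.g., with two-dimensional simulation) rather than mixing them in one loop as this sketch does.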

Often, risk assessors may estimate illness for certain subpopulations under a baseline (as is) scenario and with interventions or possible system failures. Such a process bridges risk assessment and risk management activities and might include developing the concept of comparative risk, the comparison of simulation results for the baseline (as is) and various potential mitigation scenarios most relevant to policy makers. This type of analysis provides information about the relative contribution of different interventions to risk reduction that is necessary to support policy making.
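Comparative risk of the kind described above can be sketched by running the same simulation under a baseline scenario and under a hypothetical mitigation; here the intervention is an assumed 2-log10 reduction step, and all distributional parameters are illustrative inventions.

```python
import math
import random

random.seed(2)

def mean_risk(log_reduction, n=20000, r=1e-4):
    """Mean per-serving illness risk under an assumed lognormal exposure,
    after an intervention removing `log_reduction` log10 units of the
    pathogen. All parameters are illustrative, not data."""
    total = 0.0
    for _ in range(n):
        dose = math.exp(random.gauss(2.0, 1.5)) / 10.0 ** log_reduction
        total += 1.0 - math.exp(-r * dose)
    return total / n

baseline = mean_risk(0.0)   # "as is" scenario
mitigated = mean_risk(2.0)  # hypothetical 2-log reduction intervention
print(f"relative risk after intervention: {mitigated / baseline:.3f}")
```

Because risk is nearly linear in dose at low doses under the exponential model, a 2-log reduction drives the relative risk to roughly 1% of baseline; comparing such ratios across candidate interventions is the "comparative risk" output most relevant to policy makers.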

A key analytical aspect of RC is the performance of sensitivity and uncertainty analyses to determine what variables most strongly affect the uncertainty of the risk estimate. Another aspect of RC involves validating the model or testing the predictive abilities of the model (Bowker, 1993). Standard statistical procedures such as goodness of fit testing and construction of confidence intervals for predictions can be used. However, because risk assessment models are often so complex, other procedures that are nonparametric are used for validation (Bowker, 1993). The most effective way of validating the model is comparing the estimates derived from the model with an independent source of data. The problem is that often there are questions concerning the validity of the independent source of data. For example, active surveillance data from the FoodNet study (USDA, 1998) provide some insight into the possible magnitude of the rates of illness from foodborne pathogens within and between the sentinel sites. However, a variety of difficulties exist in interpretation of these data.

Principles of appropriate analysis used in the RC, and more generally in the risk assessment, are given on pages 100-101 of an informative book, Understanding Risk: Informing Decisions in a Democratic Society (NRC, 1996). The principles are for the most part readily understood and include the following concepts: analysis should be consistent with the state of the art; analysis should be checked for accuracy; assumptions should be clearly pointed out; and superfluous assumptions should be discarded. However, one principle does create some difficulty: the principle that "calculations are presented in such a form that they can be checked by others interested in verifying the results." Two real problems are associated with this principle. The first is that the mathematical and statistical procedures used are often so complex that, unless the computer programming is independently recreated, the results cannot be verified; in reality, verification of the analysis is simply not possible for most interested parties. The second is that managers and risk assessors, in an attempt to adhere to this principle, may simplify procedures and adopt less than state-of-the-art methodologies, contradicting the first principle. Our preferred statement of the principle is that methodologies, including mathematical derivations and justification of statistical procedures, should be presented in a clear and complete fashion and in accordance with standard practices of the mathematical and statistical professions. Computer programs, in addition to reported results, should be made available to any interested party (USDA, 1998).
