Background And Historical Significance

Although some form of risk assessment has assisted regulatory agencies in making decisions about chemicals in food since the early 1900s, the field of chemical risk assessment is still in its infancy and is evolving rapidly. The first comprehensive guidelines for performing chemical risk assessments in the U.S. were published in 1983 (NRC, 1983). These guidelines separated the process of risk assessment into four components: 1) hazard identification, 2) dose-response evaluation, 3) exposure assessment, and 4) risk characterization. Several improvements in the risk assessment process have been made since this report was published, but it still forms the basis for contemporary risk assessment approaches for food chemicals such as pesticide residues, food additives, naturally occurring toxins, hormones, antibiotics, environmental contaminants, and even novel products derived from food biotechnology applications.

Advances in food chemical safety risk assessment have frequently involved pesticide residues in foods. In 1993, a report of the National Research Council (NRC, 1993) suggested many improvements to the risk assessment policies used by the U.S. Environmental Protection Agency (EPA) to determine the acceptability of residues of pesticides in the food supply. This report recommended, among other things, that the EPA consider the potential susceptibility of infants and children to pesticide residues and also the exposure of the population to water and residential sources of pesticides in addition to dietary sources. The report also recommended that risk assessments be made for families of toxicologically related pesticides that cause their effects through a common mechanism of action rather than on a chemical-by-chemical basis.

Many of these recommendations were incorporated into law when President Clinton signed the Food Quality Protection Act (FQPA) of 1996. This law prescribed risk assessment approaches to be used by the EPA. Major provisions of the law included the so-called "10x factor" requiring the EPA to consider whether to apply up to a 10-fold additional uncertainty factor to provide greater protection for infants and children, the "aggregate exposure" provision requiring exposure to be calculated from food, water, and residential sources, and the "cumulative exposure" provision to determine risks for families of chemicals whose members share a common mode of toxicological action. Ironically, the FQPA did not arise from documented cases of excessive exposure to pesticide residues but rather as a legislative "fix" of the anachronistic 1958 Delaney Clause that, based on recent legal decisions, called for elimination of many uses of pesticides on statutory grounds instead of health risks (Winter, 1993).

The FQPA provisions present significant new challenges to the scientific community and will help shape the processes by which the risks from all types of chemicals in food, including pesticide residues, will be determined.

Exposure Assessment
The estimation of exposure to food chemicals requires an understanding of both the amount of chemical present in food and the amount of food consumed. The basic algorithm for food chemical exposure can be represented as follows:

Exposure = Food Consumption x Residue Level

In the case of a chemical that may be present on more than one food commodity, the estimated exposure would represent the summation of all the individual commodity exposures.
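The algorithm above can be sketched in a few lines of Python. The commodity names, residue levels, and consumption figures below are hypothetical values chosen purely for illustration; they do not come from the text:

```python
# Hypothetical residue levels (mg/kg) and per capita daily consumption
# (kg/day) for three commodities; all values are illustrative only.
residues = {"apple": 0.05, "lettuce": 0.12, "potato": 0.02}
consumption = {"apple": 0.15, "lettuce": 0.04, "potato": 0.10}

# Total exposure (mg/day) is the summation of the individual
# commodity exposures: consumption x residue for each commodity.
exposure = sum(consumption[c] * residues[c] for c in residues)
```

Each term in the sum is one commodity's contribution; a chemical present on only one commodity reduces to the single-term case of the basic algorithm.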

Deterministic Exposure Modeling

Historically, exposures have frequently been calculated with a "deterministic" approach that assigns finite values to both the food consumption and residue levels to calculate a "point" estimate of exposure. As an example, a deterministic exposure estimate for pesticide A on commodity X would require knowledge of the residue level of pesticide A and the food consumption of commodity X. Frequently, to be prudent and avoid underestimating exposure, the level of pesticide A might be chosen to represent a maximum legal or maximum detected level rather than a more typical value. Food consumption of commodity X could be chosen to represent the per capita mean consumption or might be chosen to represent a higher level such as the upper 95th percentile of consumption. The choices of residue and food consumption levels are often, although not always, exaggerations of typical values and frequently lead to calculations of worst-case or unrealistic exposures (Archibald and Winter, 1989). Such deterministic approaches are valuable in cases in which the worst-case exposure estimates are still considered to be well within acceptable levels because refinements to improve the accuracy of the exposure assessments would not be necessary. Deterministic approaches also allow for the use of refinements such as substituting "anticipated" residues for maximum legal residues; such an approach may often drive exposure estimates below the levels of concern. Unfortunately, worst-case exposure scenarios are often communicated without reference to the potential degree of exaggeration and as such may lead to an exaggerated perception of the degree of risk (Winter, 1994).

In practice, deterministic approaches to predict long-term (chronic) exposure to pesticides in food tend to use more realistic estimates (i.e., average residue, median per capita daily consumption) than those approaches predicting short-term (acute) exposure (i.e., maximum legal or detected residue, upper 95th or upper 99th percentile consumption).
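The contrast between chronic and acute deterministic estimates can be made concrete with a short sketch. All of the numbers below (residue levels, consumption percentiles) are hypothetical placeholders, not values from the text:

```python
# Hypothetical point estimates for a single pesticide on a single
# commodity; every number here is illustrative only.
mean_residue = 0.02        # mg/kg, average detected residue
max_residue = 0.50         # mg/kg, maximum legal (tolerance) level
mean_consumption = 0.10    # kg/day, median per capita daily consumption
p95_consumption = 0.40     # kg/day, upper 95th percentile consumption

# Chronic (long-term) estimate: realistic residue x typical consumption.
chronic_exposure = mean_residue * mean_consumption

# Acute (short-term) estimate: maximum residue x high-end consumption,
# a worst-case point estimate unlikely to underestimate exposure.
acute_exposure = max_residue * p95_consumption
```

With these placeholder values, the acute worst-case point estimate is two orders of magnitude above the chronic estimate, which illustrates how strongly the choice of inputs drives a deterministic result.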

The preferred method for calculating chronic exposure is still to use a deterministic approach. For the estimation of acute exposures, however, deterministic approaches are frequently being replaced with "probabilistic" approaches that take advantage of improvements in our computational capabilities and are far more data intensive than deterministic methods.

Probabilistic Exposure Modeling

In the real world, neither residue level nor food consumption data exist as single values but are more appropriately depicted as distributions (Petersen, 2000). Monitoring of pesticide X on commodity A, for example, would likely demonstrate that the majority of samples contain little or no detectable residue of pesticide X while a lower percentage would show moderate levels and an even lower percentage would indicate high residue levels (Fig. 6.1). A similarly shaped distribution curve might be envisioned for the daily consumption level of commodity A; on most days, the commodity might not even be consumed, and moderate consumption of the commodity is more likely than a high level of consumption (Fig. 6.2).

Probabilistic approaches utilize our current computational capabilities to combine all of the data in the residue distribution with the food consumption data to develop a distribution of daily exposure (Fig. 6.3). This type of approach is frequently called a Monte Carlo simulation model, although probabilistic approaches may be conducted using a variety of methods utilizing varying types of data, algorithms, and assumptions (Petersen, 2000).
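A minimal Monte Carlo sketch of this idea follows. The two discrete distributions are invented to mirror the shapes described above (mostly non-detects or zero consumption, with moderate values more common than high ones); the sample values, proportions, and iteration count are all assumptions for illustration:

```python
import random

random.seed(0)  # fixed seed so the simulation is reproducible

# Hypothetical residue distribution (mg/kg): most samples show no
# detectable residue, fewer show moderate levels, fewest show high
# levels -- the shape described for Fig. 6.1.
residue_levels = [0.0] * 70 + [0.05] * 20 + [0.5] * 10

# Hypothetical daily consumption distribution (kg/day): on many days
# the commodity is not consumed at all, and moderate consumption is
# more likely than high consumption -- the shape of Fig. 6.2.
consumption_levels = [0.0] * 50 + [0.1] * 40 + [0.5] * 10

def simulate(n_iterations=10_000):
    """Randomly pair a residue with a consumption level on each
    iteration to build a distribution of daily exposures (mg/day)."""
    return [random.choice(residue_levels) * random.choice(consumption_levels)
            for _ in range(n_iterations)]

exposures = sorted(simulate())
p99 = exposures[int(0.99 * len(exposures))]  # upper 99th percentile exposure
```

Rather than a single point estimate, the output is a full exposure distribution, from which high-end statistics such as the upper 99th percentile can be read off directly for acute risk characterization.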

In the simplest case for estimating acute exposure from a single pesticide on a single commodity, a Monte Carlo analysis would randomly select a residue
