Methods for determining the in vitro antimicrobial activity of natural compounds

Applying a naturally occurring antimicrobial to a food requires determining the efficacy of the compound both in vitro (i.e., in microbiological media) and in a food product. In an in vitro system, a number of variables or factors concerning the antimicrobial can be evaluated. It is very important to evaluate the activity of a potential antimicrobial against multiple strains of a pathogen, since strains may vary in their susceptibility. Another important variable is the initial number of microorganisms in the system. Because most antimicrobials are bacteriostatic rather than bactericidal, the higher the initial number, the shorter the shelf life of the product.

The agar diffusion method has long been the most widely used method for determining antimicrobial activity. In this test, the antimicrobial compound is added to an agar plate on a paper disk or in a well. The compound diffuses through the agar, producing a concentration gradient that decreases with distance from the disk or well. The degree of inhibition, indicated by a zone of no growth around the disk or well, depends on the rate of diffusion of the compound and on cell growth (Barry, 1986). Therefore, the antimicrobial evaluated should not be highly hydrophobic; such a compound will not diffuse, and little or no inhibition will be detected. Results of this test are generally qualitative.
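As a rough illustration of how such qualitative results might be recorded, the sketch below translates measured zone-of-inhibition diameters into simple categories. The disk size, cutoff values, and strain data are hypothetical assumptions for illustration only, not standard interpretive breakpoints.

```python
# Hypothetical sketch: recording agar diffusion results qualitatively.
# The disk diameter and zone-size cutoffs are illustrative assumptions.

DISK_DIAMETER_MM = 6.0  # assumed diameter of the paper disk


def classify_zone(zone_mm: float) -> str:
    """Translate a zone-of-inhibition diameter (mm) into a qualitative call."""
    if zone_mm <= DISK_DIAMETER_MM:   # no clear zone beyond the disk itself
        return "no inhibition"
    if zone_mm < 12.0:                # arbitrary illustrative cutoff
        return "weak inhibition"
    return "inhibition"


# Hypothetical zone measurements for one compound against three strains
zones_mm = {"strain A": 15.2, "strain B": 9.8, "strain C": 6.0}
for strain, zone in zones_mm.items():
    print(f"{strain}: {zone:.1f} mm -> {classify_zone(zone)}")
```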

Agar and broth dilution assays are generally used when quantitative data are desired. Both methods generate a single statistic, the minimum inhibitory concentration (MIC). In the dilution assays, a series of containers is prepared, each containing a single concentration of the antimicrobial in a microbiological medium. The test microorganism is exposed to the antimicrobial and incubated for a specified period, usually at least 24 hr. The MIC is generally defined as the lowest concentration of an antimicrobial that prevents growth of a microorganism after the specified incubation period.
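For concreteness, a minimal sketch of reading the MIC from scored dilution results is shown below. The two-fold concentration series and the growth calls are hypothetical assumptions for illustration, not part of any standardized protocol.

```python
# Minimal sketch: reading the MIC from a scored dilution series.
# Concentrations (ug/ml) and growth observations are hypothetical.
from typing import Dict, Optional


def mic(results: Dict[float, bool]) -> Optional[float]:
    """results maps antimicrobial concentration -> True if growth was observed.
    Returns the lowest concentration that prevented growth, or None if growth
    occurred at every concentration tested."""
    inhibitory = [conc for conc, grew in results.items() if not grew]
    return min(inhibitory) if inhibitory else None


# Hypothetical two-fold broth dilution series scored after 24 h incubation
series = {1000.0: False, 500.0: False, 250.0: True, 125.0: True, 62.5: True}
print(f"MIC = {mic(series)} ug/ml")  # -> MIC = 500.0 ug/ml
```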

These methods provide little information concerning the effect of an antimicrobial on the growth or death kinetics of a microorganism. Concentrations of an antimicrobial below the MIC may still cause an increased lag phase, a reduced growth rate, or even initial lethality followed by growth. In food products, total inhibition of a pathogen or spoilage microorganism is not always required; an increased lag phase, especially under conditions of severe abuse, is often sufficient to protect the consumer. Therefore, to determine the effect of a compound on the growth (or death) kinetics of a microorganism, a method is required that produces an inhibition curve using a colony count procedure. In clinical microbiology, these inhibition curves are known as 'time-kill curves' (Schoenknecht et al., 1985). The method is versatile but has some disadvantages: no single statistic, such as the MIC, is produced for comparing treatments, and it is labor intensive and expensive. Progress in modeling the growth kinetics of microorganisms (Whiting and Buchanan, 2001) has allowed improved statistical analysis of growth/inhibition curves in the presence of food antimicrobials.
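As an illustration of the kind of curve analysis such modeling enables, the sketch below fits a modified Gompertz model (in the Zwietering reparameterization) to colony-count data to estimate lag time and maximum growth rate. The plate-count values, starting parameter guesses, and use of SciPy are assumptions for illustration, not a prescription from the work cited above.

```python
# Sketch: fitting a modified Gompertz model to a growth/inhibition curve
# obtained from colony counts, to estimate lag time and maximum growth rate.
# Data and initial guesses are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

E = np.e


def gompertz(t, A, mu_max, lag):
    """Increase in log10 count at time t (h): A = asymptotic increase,
    mu_max = maximum rate (log10 CFU/ml per h), lag = lag time (h)."""
    return A * np.exp(-np.exp(mu_max * E / A * (lag - t) + 1.0))


# Hypothetical plate counts (hours, log10 CFU/ml) with a sub-MIC antimicrobial
t = np.array([0, 4, 8, 12, 16, 20, 24, 30, 36], dtype=float)
log_counts = np.array([3.0, 3.0, 3.1, 3.4, 4.2, 5.3, 6.4, 7.6, 8.0])
y = log_counts - log_counts[0]          # increase relative to the inoculum

(A, mu_max, lag), _ = curve_fit(gompertz, t, y, p0=[5.0, 0.3, 8.0])
print(f"lag phase ~ {lag:.1f} h, max growth rate ~ {mu_max:.2f} log10 CFU/ml/h")
```

Comparing the fitted lag time and maximum growth rate for treated and untreated cultures gives the kind of single, comparable statistics that raw time-kill curves lack.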

A second method for determining antimicrobial effectiveness over time is to measure turbidity increases with a spectrophotometer. A major disadvantage of this type of analysis is the limited sensitivity of the instrument. Spectrophotometers generally require approximately log 6.0-7.0 CFU/ml for detection (Piddock, 1990). This may create a situation in which no growth (i.e., no absorbance increase) is observed when, in fact, undetectable growth is occurring at levels below log 5.0 CFU/ml. An erroneous interpretation of 'lethality' could result (Parish and Davidson, 1993).
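To make this pitfall concrete, the short sketch below pairs a turbidity reading with a plate count for the same sample. The detection threshold of roughly log 6.0 CFU/ml reflects the figure cited above, while the function and sample values are purely illustrative assumptions.

```python
# Sketch of the turbidity detection-limit pitfall. The threshold reflects the
# approximate sensitivity cited in the text; sample values are hypothetical.

DETECTION_LIMIT_LOG = 6.0   # approximate lower limit for turbidimetric detection


def interpret(od_increase: bool, log_cfu_per_ml: float) -> str:
    """Compare a turbidity reading with a plate count for the same sample."""
    if od_increase:
        return "growth detected by turbidity"
    if log_cfu_per_ml < DETECTION_LIMIT_LOG:
        # Cells may be multiplying below the instrument's sensitivity
        return "no absorbance increase, but growth cannot be ruled out"
    return "no absorbance increase despite counts above the detection limit"


# Hypothetical sample: OD unchanged, plate count of log 4.8 CFU/ml
print(interpret(False, 4.8))
```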
