PATHOGEN VIRULENCE: THE EVOLUTION OF SICKNESS
Pathogens are disease-causing microorganisms whose negative effect on their host’s fitness can vary not only between host species, genotypes, and individuals but also between ecological circumstances and subpopulations (Dybdahl and Storfer, 2003). This reduction of host fitness, a highly sensitive trait that depends on numerous variables, is known as pathogen virulence, and it remains a topic of intense interest in health science and evolutionary biology today. What factors dictate the extent of damage a pathogen will inflict on its host? Because the long-term persistence of a pathogenic species is inextricably entwined with host survival, the answer to this question is complicated: it requires balancing conflicting selective forces at the individual and population levels, as well as trade-offs between pathogen and host evolutionary strategies.
The evolutionary dynamic between pathogens and their hosts is difficult to simplify because pathogen populations are nested within host populations (Gilchrist and Coombs, 2006). While a given physiological circumstance within the host, such as a weakened immune system, may favor the rapid proliferation of a pathogen population, this proliferation often has an exacerbating effect on the host, worsening its condition and perhaps even killing it. Intuitively, this reduces the fitness of the host, but it also deals a serious blow to the fitness of the pathogen. Ultimately, the death of the host means the death of the bugs living in or on it, which eliminates any opportunistic advantage the bug originally had. How this inevitable trade-off between low host fitness and high pathogen fitness is resolved dictates what theoretical biologists call optimal virulence.
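To make the idea of nesting concrete, consider a deliberately simple sketch (an illustration of the general framework rather than the specific model of Gilchrist and Coombs): the pathogen population inside a single host might grow logistically,

dP/dt = r · P · (1 − P/K),

where P is the pathogen load, r its within-host replication rate, and K the carrying capacity set by host resources. The quantities that matter between hosts, such as the transmission rate and the extra host mortality, can then be treated as increasing functions of P. Selection acts on both layers at once: a fast-replicating strain wins the race inside a host, but the load it builds up can sabotage its own spread between hosts.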
The study of optimal virulence is the study of the long-term evolutionary strategies a pathogen adopts to maximize its persistence within a host population. Because virulence (read: low host fitness) is considered by most biologists to be an unavoidable consequence of host resource exploitation (Dybdahl and Storfer, 2003), one might assume that a pathogen should evolve to minimize virulence, all other factors being equal. Presumably, a pathogen that has a minimal negative effect on its host will benefit from low host mortality and high host reproduction, factors that in turn maximize the pathogen’s available resource space (i.e., the number of susceptible hosts). However, a pathogen also needs a high transmission rate to effectively exploit this resource space, and good transmission usually requires a high pathogenic load within the host (Dybdahl and Storfer, 2003). Pathogenic load is in turn usually correlated with virulence (Porco et al., 2005). Simply put, a bug has to increase its numbers to increase its chances of jumping from host to host, but the denser the bug population in the host, the more intense (and detrimental) its effect on the host will be. Thus a delicate balance must be struck between a bug’s short-term interest in jumping ship and its long-term interest in, well, not sinking it.
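This balancing act has a standard shorthand in trade-off theory (a generic, textbook-style sketch rather than a model drawn from the papers cited here): a pathogen’s long-term success can be summarized by its basic reproduction number, the expected number of new infections produced by a single infected host,

R₀(α) = β(α) · S / (μ + α + γ),

where α is virulence (the extra host mortality the pathogen causes), β(α) is the transmission rate, which tends to increase with virulence because it increases with load, S is the density of susceptible hosts, μ is the background host death rate, and γ is the rate at which hosts recover and clear the infection. If β(α) rises steeply at first but levels off at high virulence, R₀ is maximized at an intermediate α: harming the host a little buys transmission, while harming it a lot merely shortens the infectious period.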
In theory, a pathogen should achieve the highest fitness with a high transmission rate and a low host recovery rate, while its host should have a high level of activity and a low mortality rate (Porco et al., 2005). High transmission and a low recovery rate ensure the persistence of the pathogen between hosts and within hosts, while high host activity and a low mortality rate maximize contact between hosts and the number of hosts available. But this balance is not easy to achieve: as previously mentioned, virulence can have opposing effects on many of these factors. It may increase the transmission rate and lower the recovery rate, but it also raises host mortality and lowers host activity. Moreover, the self-interest of the host in not dying creates another conflict between pathogen transmission rate and host recovery rate: while theory may say transmission should be high and recovery should be low, host populations will inevitably evolve to fight back. And the faster a host individual can clear an infection, the smaller the window of time a pathogen has to transmit to a new host. As this window shrinks, natural selection will favor the evolution of increased virulence, since virulence usually increases transmission for the reasons stated above (Porco et al., 2005). But the more virulent the strain, the greater the chance a host will die, again putting the pathogen in another fitness snafu. And as if this weren’t complicated enough, it gets tougher.
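The same escalation falls out of the sketch above. Maximizing R₀ with respect to virulence gives the condition

β′(α*) = β(α*) / (μ + γ + α*),

and for a transmission rate β(α) that saturates at high virulence, the optimal virulence α* shifts upward as the recovery rate γ increases. In other words (again as a generic illustration rather than a result taken from the cited studies), the better hosts become at clearing infections, the harder selection pushes pathogens to squeeze transmission out of a shrinking window.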
If the transmission rate of a pathogen were solely dependent on its density in the host, then optimal virulence would simply vary between species and circumstances based on the balancing of these factors. However, transmission rate is also related to a pathogen’s ability to evade the host’s immune system (Dybdahl and Storfer, 2003). A pathogen that manages to invade a new host without triggering an immediate immune response should be able to allocate the majority of its resources to replication, while a pathogen that is easily recognized by the host’s immune system may be more strongly selected for immune resistance, possibly compromising its reproductive rate. Furthermore, a pathogen that minimizes immune detection minimizes the immune response of the host, which in some cases can be more taxing on immediate host fitness than the presence of the pathogen or antigen itself (Tovey and Kemp, 2005). The trade-off for the host, then, is between minimizing its immune response and clearing the infection. But contrary to the eliminate-all-bugs attitude of Western medicine, from an evolutionary perspective letting a pathogen stick around can sometimes work out. In contrast to the downward spiral of selecting for fast recovery and, in turn, high virulence, a bug can also maximize its fitness with a low clearance rate and lower virulence, as long as it does not impose large fitness costs and contact between hosts remains frequent and consistent.
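The allocation side of this argument can be put in rough symbols (a purely hypothetical sketch, not a model from the cited work): if a pathogen devotes a fraction x of its resources to evading or resisting the immune response and the remaining 1 − x to replication, its net within-host growth rate is roughly

r(x) = ρ · (1 − x) − c · (1 − e(x)),

where ρ is the maximum replication rate, c the clearance pressure the host’s immune system exerts, and e(x) the fraction of that pressure neutralized by the investment in evasion. A pathogen that is hard to detect in the first place faces a small c and does best with x near zero, pouring everything into replication; one that is easily recognized faces a large c and is pushed toward costly resistance at the expense of its reproductive rate.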
Host-to-host contact is not the only way a pathogen keeps moving around. Transmission also depends on a pathogen’s specialization on specific hosts and transmission routes. While theory predicts that high host specificity should select for high virulence (since the narrower the range of hosts, the lower the chance a pathogen will find a new one to exploit), restricted host range has in some cases been associated with low virulence (Herre, 1993). For instance, the endosymbiotic bacterium Wolbachia pipientis has restricted movement in that it has evolved to infect hosts strictly through maternal vertical transmission: Wolbachia infects the reproductive tissues of females and is passed on to their offspring (McGraw et al., 2002). Because the bacterium is transmitted only through the successful reproduction of its host, unlike horizontally transmitted pathogens, any significant virulent effect on its host’s fitness is directly damaging to its own fitness. Theory therefore predicts that the endosymbiont will optimally balance its replication rate such that the probability of germ-line infection is as high as possible with minimal detrimental effect on host fecundity and offspring fitness. This prediction has in fact been supported in recent experimental studies (McGraw et al., 2002).
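A back-of-the-envelope version of this prediction (again an illustration, not the model used by McGraw et al.) makes it concrete: the number of infected hosts a single infected female contributes to the next generation is roughly

W ≈ F(d) · T(d),

where d is the within-host density of the symbiont, F(d) is host fecundity, which eventually declines as density (and hence virulence) rises, and T(d) is the probability that an offspring inherits the infection, which rises with density. With no horizontal route to reward extra exploitation, the product is maximized at the modest density that secures near-complete germ-line transmission at minimal cost to fecundity, which is exactly the attenuation the theory predicts.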
Virulence is clearly a difficult biological trait to predict and remains a pressing issue to resolve. Not only is it of interest in theoretical biology; its minimization is also a driving force in health science as a way to improve quality of life in a densely populated, resource-strained world. Studies of optimal virulence demonstrate that traditional short-term interventions aimed at eradicating pathogens can have surprising long-term effects on virulence (Porco et al., 2005), as pathogens rapidly evolve in response to artificially imposed constraints on transmission and replication. Further investigation into the long-term evolutionary behaviour of pathogens will hopefully continue to illuminate our understanding of disease transmission and guide the establishment of epidemiological control measures.
References:
Dybdahl, M.F. and Storfer, A. (2003). Parasite local adaptation: Red Queen versus Suicide King. Trends in Ecology and Evolution, 18(10): 523-530.
Gilchrist, M.A., and Coombs, D. (2006). Evolution of virulence: Interdependence, constraints and selection using nested models. Theoretical Population Biology, 69: 145-153.
Herre, E. A. (1993). Population structure and the evolution of virulence in nematode parasites of fig wasps. Science, 259: 1442-1445.
McGraw, E.A., Merritt, D.J., Droller, J.N., and O’Neill, S.L. (2002). Wolbachia density and virulence attenuation after transfer into a novel host. Proceedings of the National Academy of Sciences of the United States of America, 99: 2918-2923.
Porco, T.C., Lloyd-Smith, J.O., Gross, K.L., and Galvani, A.P. (2005). The effect of treatment on pathogen virulence. Journal of Theoretical Biology, 233: 91-102.
Tovey, E.R. and Kemp, A.S. (2005). Allergens and allergy prevention: where to next? Journal of Allergy and Clinical Immunology, 116(1): 119-121.