Malaria: The Forgotten Killer
In April 1995, an outbreak of Ebola virus infection transfixed the world. Of the 296 people infected in Kikwit, Zaire between January and May, approximately 250 died. Both the rapid response by local and international health authorities, and the nature of the disease itself, helped bring this deadly event quickly under control.
However, residents of Kikwit and most other regions of sub-Saharan Africa still contend with many other potentially fatal diseases day after day. One infectious disease in particular kills well over ten times more Africans each day than the Ebola virus did in a five-month period. This other disease is malaria.
Malaria is a blood disease resulting from infection with a protozoan parasite known as Plasmodium. Over 100 species of the malaria parasite exist, capable of infecting various hosts such as reptiles, birds, rodents, and primates. However, only four species of Plasmodium can infect humans, the most virulent being Plasmodium falciparum.
The global malaria situation is far worse today than at any time in the past century. From 200 to 300 million people become infected with malaria parasites every year, and more people die of malaria each year than have ever died of AIDS. World-wide, infections caused by Plasmodium falciparum probably result in over 4,000 deaths each day, and over 1.5 million deaths each year. Ninety percent of these deaths occur in sub-Saharan Africa. Only one other microbial species kills more people each year: Mycobacterium tuberculosis, the causative agent of TB.
Malaria and History
Surprisingly, malaria was common in the U.S. and Europe until around 1950. First described by Hippocrates, it crippled historical figures ranging from Alexander the Great to Oliver Cromwell. Until very recently, infectious diseases debilitated more soldiers than battlefield injuries did. Malaria played an important role in early American history, especially in the swampy marshland of tidewater Virginia. In 1781, British General Charles Cornwallis surrendered to George Washington at Yorktown largely because more than half the men in some companies of the British force were deathly ill with malaria. Malaria accounted for 20 percent of all hospitalizations during the American Civil War. During World War II, the U.S. military organized an intensive project involving hundreds of scientists at universities and drug companies to find a new treatment for malaria.
Even today, the threat of malaria transmission is not completely limited to tropical countries. In the late 1980s, 30 malaria cases were identified in the San Diego area. Many of the afflicted had never traveled outside the U.S., becoming the first documented cases of mosquito-to-human malaria transmission within the U.S. since the early 1950s.
The malaria parasite is transmitted to humans through the bite of an infectious female Anopheles mosquito. While male mosquitoes feed exclusively on plant nectar, female mosquitoes instinctively seek a blood meal about every two to three days. Nutrients in human blood serve as a required source of iron for the successful maturation of eggs inside the female.
Unfortunately, the consequences of female feeding behavior are not limited to an itchy short-term bump on the skin. When an uninfected female anopheline ingests a minute amount of blood from a person infected with P. falciparum parasites, she also inadvertently ingests a few of the parasites circulating in the blood of that individual. After a two-week period of parasite development inside the mosquito, she may then serve as a vector when she feeds on another human. After probing and piercing the human skin, the mosquito injects a tiny quantity of saliva into the person's capillary to prevent the blood from clotting as she feeds. This microscopic drop of saliva contains the deadly parasite, completing the successful transmission of the parasite.
“Transmission intensity” refers to the frequency with which an individual is exposed to infective mosquito bites. In certain African regions where transmission intensity is most extreme, residents are exposed to as many as 500 infective mosquito bites each year. During the high season, people can be exposed to five or six infective bites each day for several consecutive weeks. In certain regions even a 95 percent reduction in exposure would still leave residents susceptible to 25 infective bites per year.
Many health experts believe the malaria situation is intensifying, and not just in sub-Saharan Africa where malaria has long been a leading cause of death among children under five years of age. In parts of Thailand, the malaria parasite has developed resistance to all commonly used anti-malaria drugs. The clearing and settlement of mosquito-infested rain forest has resulted in increased transmission of the parasite throughout Central and South America. Yet, many countries lack the resources to deploy approaches developed in recent years to combat malaria.
Consequently, world health experts do not emphasize eradication of the P. falciparum infection. Instead, funds and personnel are allocated to prevention and intervention methods proven to reduce the extent of severe illness and death from P. falciparum infection. While these efforts have been partially successful, malaria remains a leading public health concern in many parts of the world.
The Asembo Bay study
Only one to two percent of African children infected with P. falciparum develop severe clinical malaria and die. What explains the 98 percent of children who do not experience a life-threatening malaria attack? Data suggest certain genetic traits protect against malarial death, but these do not explain all of the difference. The key to reducing malaria mortality is a better understanding of the risk factors for severe malaria illness.
Is it possible to limit the extent of severe disease and death due to malaria by reducing the intensity of P. falciparum transmission? What would malaria control activities accomplish if transmission intensity were only reduced, but not entirely eliminated? One of the world’s largest and longest-running malaria field study sites is located at Asembo Bay in western Kenya, where the U.S. Centers for Disease Control and Prevention (CDC) and the Kenya Medical Research Institute (KEMRI) have conducted extensive investigations of malaria epidemiology and immunology for more than ten years. Obtaining malaria data is a complex and costly endeavor, requiring collaboration among numerous agencies, experts, and field investigators. In preparing to conduct his doctoral dissertation work on risk factors for severe malaria illness and death, Peter McElroy recently traveled to Kenya to discuss such collaboration with the CDC-KEMRI team, which has been pursuing these questions in one of the world’s largest community-based malaria research field sites. The study currently includes over 600 women of child-bearing age and 1,200 infants and young children from 15 villages.
Data collection begins when a woman becomes pregnant. Each newborn is then closely monitored from birth through age three. In addition to detailed clinical data, parasitologic, epidemiologic, and entomologic data are collected every two weeks.
“It is a very big challenge for us to keep all the mothers enrolled in the study [for the three-year period],” says Michael Onyango, a native of the Asembo Bay, Kenya study area who now oversees the activities of over 100 field workers employed by the project. “Mothers often associate their child’s sickness with the fact that a tiny blood specimen is collected every two weeks. Convincing them otherwise often takes a great deal of patience.”
Data from such studies, where mothers and children are monitored continuously over time rather than only at a single point, offer crucial information regarding risk factors for severe malaria episodes and death. Peter McElroy will soon assist in the analysis and interpretation of the data collected from these large-scale field studies. The future direction of malaria control activities, including drug development and vaccine design and evaluation, will be strongly influenced by the results from western Kenya.
Vaccines and new drugs for malaria
Developing vaccines against P. falciparum is a complicated scientific endeavor because the organism’s antigens change several times during the course of an infection. However, progress has been made toward development of a vaccine. In the U.S., research teams from the CDC, Department of Defense, National Institutes of Health, and several universities are working together in pursuit of a multivalent malaria vaccine that will target several stages of the parasite’s life-cycle.
Several potential vaccines for different populations are currently at different stages of testing and development. A vaccine that completely protects against P. falciparum infection (an anti-infection vaccine) during brief visits to a particular region may be appropriate for travelers and military personnel, but inappropriate for life-long residents. A vaccine that does not prevent infection, but reduces the severity of clinical malaria episodes (an anti-disease vaccine), will benefit children and adults living in malaria-endemic areas.
Also on the scene is a potential vaccine that will interrupt the malaria transmission cycle by blocking the development of the parasite inside mosquitoes that have fed on infected humans. The recipient of a so-called “transmission blocking” vaccine may still become infected, and possibly become sick and die; thus such a vaccine may seem counter-intuitive or even unethical. However, if development is successful, this vaccine strategy has tremendous potential to reduce P. falciparum transmission pressure in some areas, and ultimately eliminate it from many others.
In 1991, preliminary South American trials of a synthetic protein appeared very promising, with over 70 percent less malaria illness among vaccinated subjects. However, a subsequent large-scale trial of this vaccine in 1992 in Tanzania, where the transmission pressure is far more intense, suggested that the efficacy of the vaccine was only around 32 percent. Epidemiologists await the results of several other studies currently under way.
Because malaria parasites develop resistance to existing drugs, researchers perpetually search for new anti-malarials. One of the most promising “new” classes of drugs, the artemisinin derivatives, is new only to Western medicine. Artemisinin was first discovered by Chinese scientists investigating an ancient herbal remedy, made from leaves of the Artemisia annua plant, which proved to have remarkable anti-malarial qualities. Derivatives of artemisinin have been used to treat over three million people, primarily in China and Southeast Asia. Several of the derivatives should be licensed in the West over the next few years.
Artemisinin can be extracted from plants quite inexpensively; thus under-developed countries can manufacture their own anti-malarials for domestic consumption. Vietnam currently produces millions of doses of artemisinin for internal use. Leigh Pearce and Anne Castles, two master’s students in the School of Public Health, spent a summer in Vietnam analyzing the efficacy and cost-benefits of this treatment, while other School of Public Health researchers are studying artemisinin use in Thailand.
At the University of Michigan, Steven Meshnick has elucidated artemisinin’s mechanism of action. Artemisinin derivatives react with iron inside the parasite to generate free radicals that damage specific parasite proteins. The genes encoding these proteins are now being cloned. This information should aid in the development of new and more effective derivatives.
Sadly, major changes in the pharmaceutical industry have had profoundly negative effects on malaria research. Major drug companies are doing little or no work on developing new anti-malarials. While artemisinin derivatives should work for some years, there is nothing “in the pipeline” to replace them when resistance develops.
The ability of the P. falciparum parasite and the Anopheles mosquitoes to continuously adapt to new environments keeps them ahead of scientific advances. Successful attempts to reduce malaria mortality will only be accomplished with carefully planned strategies that integrate many facets of control.
Some public health researchers believe malaria control may only be achieved after a population has developed a certain level of economic prosperity; others argue that economic development cannot be realized until a population has achieved a certain level of health. While this debate remains unresolved, it is clear that successful biomedical advances are often not implemented in economically depressed areas.
Most developing countries face extremely difficult decisions regarding the allocation of financial and personnel resources. Unless future measures assure improved protection at low cost, officials will hesitate to embrace them, regardless of their promise. Even if scientific advances lead to highly effective control strategies, whether drug or vaccine, their suitability for developing countries must be considered.
Peter McElroy is a doctoral student in Epidemiology, School of Public Health; he traveled to Kenya during the summer of 1995 with support from the International Institute. Steven Meshnick is associate professor of Epidemiology.