On a Sunday morning in a decaying and dangerous inner-city barrio in Lima, Peru, an unmarked white van carrying nearly a dozen bodies rumbles to a stop on the grounds of the National Institute of Neurological Sciences. Seated in a small waiting area to the rear of the building, a throng of well-dressed researchers and government officials watches intently. As the driver clambers out, an assistant hustles off in search of a hospital gurney. Within minutes, two men wheel the first body into the institute's imaging unit.

Onlooker Caleb Finch, a biologist at the University of Southern California, has been waiting for this moment for months. Tall, gaunt and graying, with a Father Time–style beard, the 75-year-old scientist has devoted his career to the study of human aging. Our kind is remarkably long-lived compared with other primates. Our nearest surviving relatives, the chimpanzees, have a life expectancy at birth of about 13 years. In contrast, babies born in the U.S. in 2009 had a life expectancy at birth of 78.5 years. Finch has come to Lima to find out why—by peering into the distant past. The cadavers in the van belong to men, women and children who perished along this stretch of coastal desert as much as 1,800 years ago, long before the Spanish conquest. Cocooned in dusty textiles and interred in arid desert tombs, their naturally mummified bodies preserve critical new clues to the mystery of human longevity. As envoys from an era long before modern health care, they will offer case studies of aging in the past. Finch walks over to the van, grinning as he surveys the cargo. “That's a pack of mummies,” he says.

Most researchers chalk up our supersized life span to the advent of vaccines, antibiotics and other medical advances, the development of efficient urban sanitation systems, and the availability of fresh, nutritious vegetables and fruit year-round. Indeed, much demographic evidence shows that these factors greatly extended human life over the past 200 years. But critical as they were, they are only part of the longevity puzzle, Finch contends. Marshaling data from fields as diverse as physical anthropology, primatology, genetics and medicine, he now proposes a controversial new hypothesis: that the trend toward slower aging and longer lives began much, much earlier, as our human ancestors evolved an increasingly powerful defense system to fight off the many pathogens and irritants in ancient environments. If Finch is right, future research on the complex links among infection, host defense and the chronic diseases of the elderly may revolutionize scientists' understanding of aging and how to cope with the challenges it brings.

And many more
Hints that modern health practices might not be solely responsible for our long life span have come from studies of contemporary hunter-gatherer groups. In 1985 Nicholas Blurton-Jones, a biological anthropologist at the University of California, Los Angeles, set off by Land Rover across the trackless bush in Tanzania's Lake Eyasi basin. With field assistant Gudo Mahiya, Blurton-Jones traveled to the isolated camps of the Hadza, hunter-gatherers who lived much as their ancestors had, hunting baboons and wildebeest, digging starchy tubers and collecting honey during the rainy season from hives of the African honeybee. Journeying from one camp to another, the two researchers collected basic demographic data, checking each Hadza household and recording the names and ages of the inhabitants. Then the pair updated this census information six times in the 15 years that followed, noting down the names of all who had died and the causes of their death. In addition, Blurton-Jones obtained some earlier census data on the Hadza from two other researchers.

The Hadza lived—as ancient humans and chimpanzees did—in a natural environment teeming with pathogens and parasites. They lacked running water and sewage systems, defecating in a zone 20 to 40 meters away from their camps, and they rarely sought out medical care. Yet as Blurton-Jones and Mahiya discovered, the Hadza enjoyed much longer lives than chimpanzees did. Indeed, the Hadza had a life expectancy at birth of 32.7 years. And if they reached adulthood, they could expect to live 40 more years, nearly three times as long as a chimpanzee that reached adulthood. Some Hadza elders survived into their 80s. Clearly, their relatively long lives owed little to medical and technological advances.

Moreover, the Hadza were not alone. In 2007 two anthropologists, Michael Gurven of U.C. Santa Barbara and Hillard Kaplan of the University of New Mexico, analyzed data from all five modern hunter-gatherer societies that researchers had studied demographically. Infections accounted for 72 percent of the deaths, and each group showed a very similar J-shaped mortality curve—with child mortality as high as 30 percent, low death rates in early adulthood and exponentially rising mortality after the age of 40. Then Gurven and Kaplan compared these curves with those of both wild and captive chimpanzees: the simians experienced the sharp uptick of adult mortality at least 10 years earlier than human hunter-gatherers. “It appears that chimpanzees age much faster than humans,” concluded Gurven and Kaplan in their paper detailing the findings, “and die earlier, even in protected environments.”
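
The shape of that curve is worth a closer look. Below is a minimal Python sketch of a Gompertz hazard, the standard model demographers use for the exponential rise of adult mortality. The parameters and the simple ten-year shift for chimpanzees are illustrative assumptions, not Gurven and Kaplan's fitted values.

```python
import math

def gompertz_hazard(age: float, a: float = 0.0005, b: float = 0.08) -> float:
    """Annual risk of death at a given adult age: h(age) = a * e^(b * age).
    The parameters a and b here are invented for illustration."""
    return a * math.exp(b * age)

for age in (30, 40, 50, 60):
    human = gompertz_hazard(age)
    chimp = gompertz_hazard(age + 10)  # the chimp uptick arrives ~10 years sooner
    print(f"age {age}: human hazard {human:.4f}, chimp hazard {chimp:.4f}")
```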

Yet when, exactly, did humans begin living longer? To obtain clues, anthropologists Rachel Caspari of Central Michigan University and Sang-Hee Lee of U.C. Riverside examined the remains of 768 individuals from four ancestral human groups spanning millions of years. By assessing the degree of dental wear, which accumulates at a constant pace from chewing, they estimated the ratio of older adults around age 30 (old enough to be a grandparent) to young adults around 15 years of age in each of the four groups. Their studies revealed that living to 30 and beyond became common only recently in our prehistoric past. Among the australopithecines, which emerged in Africa around 4.4 million years ago, most individuals died before their 30th birthday. Moreover, the ratio of thirtysomethings to 15-year-olds was just 0.12. In contrast, Homo sapiens who roamed Europe between 44,000 and 10,000 years ago often lived to 30 or more, achieving a ratio of 2.08 [see “The Evolution of Grandparents,” by Rachel Caspari].
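
For readers who want the arithmetic spelled out, the measure behind those numbers is a simple census ratio. The counts in this sketch are invented to show the calculation; only the published ratios of 0.12 and 2.08 come from Caspari and Lee's study.

```python
def oy_ratio(older_adults: int, young_adults: int) -> float:
    """Older-to-younger adult ratio, used as a proxy for longevity:
    individuals judged roughly 30+ from dental wear, per young adult
    judged roughly 15 to 30."""
    return older_adults / young_adults

print(oy_ratio(12, 100))   # 0.12, like the australopithecine sample
print(oy_ratio(208, 100))  # 2.08, like the Upper Paleolithic sample
```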

Calculating the life expectancy of early H. sapiens populations is challenging, however: detailed demographic data, such as those supplied by both census records and death registrations, are lacking for much of our long past. So Finch and his colleague Eileen Crimmins, a gerontologist at the University of Southern California, analyzed the earliest virtually complete data set of that kind—records first gathered in Sweden in 1751, decades before the advent of modern medicine and hygiene. The study revealed that mid-18th-century Swedes had a life expectancy at birth of 35. But those who survived bacterial infections and contagious diseases such as smallpox during childhood and reached the age of 20 could reasonably look forward to another 40 years.
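
Those two figures are not contradictory: heavy childhood mortality drags down life expectancy at birth while leaving the outlook at age 20 nearly untouched. The toy life table below, with invented survival numbers chosen only to roughly mimic the Swedish pattern, shows the standard calculation.

```python
# Survivors out of 1,000 births still alive at each exact age.
# These numbers are hypothetical, not Finch and Crimmins's data.
ages      = [0,    1,   5,   20,  40,  60,  80, 100]
survivors = [1000, 800, 650, 600, 480, 300, 80,   0]

def life_expectancy(start_age: int) -> float:
    """Expected further years of life at start_age, approximating
    person-years lived in each interval by the trapezoid rule."""
    i = ages.index(start_age)
    person_years = sum(
        (survivors[j] + survivors[j + 1]) / 2 * (ages[j + 1] - ages[j])
        for j in range(i, len(ages) - 1)
    )
    return person_years / survivors[i]

print(round(life_expectancy(0), 1))   # ~36 years at birth
print(round(life_expectancy(20), 1))  # ~39 more years for a 20-year-old
```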

To Finch, these findings raised a major question. The 18th-century Swedes lived cheek by jowl in large, permanent villages, towns and cities, where they were exposed to serious health risks unknown to small communities of mobile chimpanzees. So why did the Swedes live longer? The answer, it turns out, may lie in the meaty diets of their early human ancestors and the evolution of genes that protected them from the many hazards of carnivory.

Meat-eating genes
Chimpanzees spend most of their waking hours in a sweet pursuit: foraging for figs and other ripe fruits. In search of fructose-rich fare, they range over large territories, only occasionally using the same night nest twice in a row. They are skilled at hunting small mammals such as the red colobus monkey, but they do not deliberately set out searching for these prey. Nor do they consume large quantities of meat. Primatologists studying wild chimpanzees in Tanzania have calculated that meat makes up 5 percent or less of the simians' annual diet there, whereas research in Uganda shows that animal fat constitutes only 2.5 percent of their yearly fare by dry weight.

In all likelihood, Finch says, the earliest members of the human family consumed a similar plant-based diet. Yet sometime between 3.4 million and 2.5 million years ago, our ancestors incorporated a major new source of animal protein into their diet. As sites in Ethiopia show, they began butchering the remains of large, hoofed mammals such as antelopes with simple stone tools, smashing the bones to get at the fat-rich marrow, slicing off strips of meat, and leaving behind telltale cut marks on femurs and ribs. And by 1.8 million years ago, if not earlier, humans began actively hunting large game and bringing entire carcasses back to camp. The new abundance of calories and protein most likely helped to fuel brain growth but also increased exposure to infections. Finch suggests that this risk favored the rise and spread of adaptations that allowed our predecessors to survive attacks by pathogens and thus live longer.

The trend toward increasing carnivory would have exposed our ancestors to pathogens in several ways. Early humans who scavenged the carcasses of dead animals, and who dined on raw meat and viscera, boosted their chances of ingesting infectious microbes. Moreover, as humans took up hunting large animals, they faced greater risks of lacerations and fractured bones when closing in on their prey: such injuries could lead to deadly infections. Even cookery, which may have emerged a million years ago or more, posed perils. Inhaling wood smoke daily exposes humans to high levels of endotoxins and soot particles. Roasting and charring meat improves both taste and digestibility but creates chemical modifications known as advanced glycation end products, which contribute to serious diseases such as diabetes. Our ancestors' later embrace of agriculture and animal husbandry, which began some 11,500 years ago, added new dangers. The daily proximity of humans to domesticated goats, sheep, pigs, cattle and chickens, for example, elevated the risk of contracting bacterial and viral infections from animals. And as families settled permanently in villages, sewage from humans and livestock contaminated local water supplies. Pathogenic bacteria thrived.

Even so, the Swedes exposed to such health risks in 1751 lived longer than their simian relatives. To tease out clues to this longevity, Finch began studying the scientific literature on chimpanzee and human genomes. Previously published studies by others showed that the two genomes were around 99 percent identical. But in the uniquely human 1 percent, evolutionary biologist Hernán Dopazo, then at the Prince Felipe Research Center in Valencia, Spain, and his colleagues discerned a disproportionately high number of genes that had undergone positive selection and that played key roles in host defense and immunity—specifically in a part of the defense system known as the inflammatory response. Positive selection favors genes that hone our ability to survive and reproduce, which allows them to become more frequent in populations over time, a process that leaves a distinctive “signature” in the DNA sequence. Dopazo's findings added new weight to an idea growing in Finch's mind. He wondered if natural selection had endowed ancient humans with a souped-up system for fighting off the microbial threats and other health hazards posed by increased meat consumption, thereby extending our life span.
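
One classic signature of positive selection is an excess of amino-acid-changing (nonsynonymous) differences over silent (synonymous) ones in a gene's coding sequence. The sketch below tallies raw codon differences between two invented aligned snippets to convey the intuition; genome scans such as Dopazo's rely on maximum-likelihood codon models that also normalize by the number of possible sites.

```python
# Standard genetic code, packed by codon position (bases ordered TCAG).
BASES = "TCAG"
AMINO = ("FFLLSSSSYY**CC*W"   # T-- codons
         "LLLLPPPPHHQQRRRR"   # C-- codons
         "IIIMTTTTNNKKSSRR"   # A-- codons
         "VVVVAAAADDEEGGGG")  # G-- codons
CODON_TABLE = {a + b + c: AMINO[16 * i + 4 * j + k]
               for i, a in enumerate(BASES)
               for j, b in enumerate(BASES)
               for k, c in enumerate(BASES)}

def difference_counts(seq1: str, seq2: str) -> tuple[int, int]:
    """Return (nonsynonymous, synonymous) codon differences between
    two aligned, same-length coding sequences."""
    nonsyn = syn = 0
    for i in range(0, len(seq1), 3):
        c1, c2 = seq1[i:i + 3], seq2[i:i + 3]
        if c1 == c2:
            continue
        if CODON_TABLE[c1] == CODON_TABLE[c2]:
            syn += 1      # different codon, same amino acid
        else:
            nonsyn += 1   # amino acid changed

    return nonsyn, syn

# Hypothetical aligned snippets standing in for two species' genes.
human_like = "ATGGCTCGTTTA"
chimp_like = "ATGGCACGGTCA"
print(difference_counts(human_like, chimp_like))  # (1, 2): one amino-acid change, two silent
```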

In the war against bacteria, viruses and other microbes that seek to invade our tissues, the human host defense system brandishes two powerful weapons: the innate immune system and the adaptive immune system. The innate system is the first responder. It mobilizes immediately at the scene of an attack or injury to eliminate pathogens and heal damaged tissue, and it essentially responds in the same way to all threats. The adaptive system, in contrast, kicks into gear more slowly, customizing its response to particular pathogens. In doing so, it creates an immunological memory that confers lifelong protection against the invader.

The inflammatory response is part of the innate immune system. It goes to work when tissues suffer damage from microbes, traumatic wounds, injuries or toxins, and, as Finch points out, physicians have long recognized its hallmarks. Some 2,000 years ago Aulus Cornelius Celsus, a Roman medical author, described four cardinal signs of inflammation—heat, redness, swelling and pain. The heat and redness come from a swift and marked increase in the flow of warm blood to the damaged tissue. Swelling then results from increased vascular permeability, which causes blood cells and plasma to leak into the affected area, carrying proteins that can assist in preventing the spread of infection and in initiating wound healing.

Finch began examining the human-specific changes in genes related to host defense. He was quickly struck by the changes that had affected the apolipoprotein E (APOE) gene. This important gene strongly influences the transport and metabolism of lipids, the development of the brain and the workings of the immune system. It has three primary, uniquely human variants (alleles), of which APOE e4 and APOE e3 are the most prevalent.

APOE e4's DNA sequences closely resemble those in chimpanzee APOE, strongly suggesting that it is the ancestral human variant that emerged near the beginning of the Homo genus more than two million years ago and thus may have had the earliest effect on our longevity. Differing in several critical amino acids from the chimp version, APOE e4 vigorously ramps up the acute phase of inflammation. It boosts the production of proteins such as interleukin-6, which helps to increase body temperature, and tumor necrosis factor–alpha, which induces fever and inhibits viral replication. Equipped with this supercharged defense system, children in ancient human families had a better chance of fighting off harmful microbes that they unwittingly ingested in food and encountered in their surroundings. “When humans left the canopy and went out onto the savanna,” Finch notes, “they had a much higher exposure to infectious stimuli. The savanna is knee-deep in herbivore dung, and humans were out there in bare feet.”

Moreover, early humans who carried APOE e4 most likely profited in another key way. This variant facilitates both the intestinal absorption of lipids and the efficient storage of fat in body tissue. During times when game was scarce and hunting poor, early APOE e4 carriers could draw on this banked fat, upping the odds of their survival.

Even today children who carry APOE e4 enjoy an advantage over those who do not. In one study of youngsters from impoverished families living in a Brazilian shantytown, APOE e4 carriers suffered fewer bouts of diarrheal disease brought on by Escherichia coli or Giardia infections than noncarriers did. And they scored higher on cognitive tests, most likely as a result of their greater absorption of cholesterol—a dietary requirement for neurons to develop in the brain. “So this would have been adaptive, we think,” Finch remarks.

A deferred cost
All told, APOE e4 seems to be a key part of the puzzle of human longevity. Ironically, now that we live longer, this gene variant appears to double-cross us later in life. Its debilitating effects became apparent only as our human ancestors increasingly survived to middle age and beyond. In Lima, Finch and an international team of cardiologists, radiologists, biologists and anthropologists are searching for traces of these afflictions in the preserved cardiovascular tissues of ancient adult mummies.

Inside the crowded imaging unit in Lima, Finch hovers over a technician's computer. It has been a long, trying morning. Several of the mummy bundles transported to the unit are too large to fit into the CT scanner. Others, when scanned, reveal little more than skeletal remains, raising doubts that the preservation of human tissue in the bundles will be adequate for the study.

But no one is giving up. On the screen is a crisp, three-dimensional CT scan of a bundle just wheeled in from the van. Hunching forward, cardiologists Gregory Thomas of Long Beach Memorial Medical Center in California and Randall C. Thompson of the University of Missouri–Kansas City School of Medicine scrutinize an anatomical landscape rendered strangely foreign by centuries of decay and desiccation. As the technician scrolls up and down the image, the two cardiologists gradually pick out preserved soft tissue and the snaking trails of major arteries. The relief in the room is palpable. Then, unable to resist, the pair take a quick preliminary look along the arteries for small, dense, white patches—calcified plaque that signals an advanced stage of atherosclerosis, or hardening of the arteries, the leading cause of fatal heart attacks and strokes. This individual's arteries, they see, are clearly calcified.

Cardiologists have traditionally regarded atherosclerosis as a disease of modern civilization. Contemporary behaviors such as smoking cigarettes, eschewing exercise, dining on high-calorie diets and packing on the pounds are all known to increase the risk of this disease. Moreover, several recent studies point to an emerging atherosclerosis epidemic in the developing world, as societies there grow more affluent and increasingly embrace a modern, Western lifestyle. Yet in 2010 Thomas and a group of his colleagues decided to test the idea that atherosclerosis is a disease of modern, affluent life by taking CT scans of ancient human mummies and examining their arteries.

The team started in Egypt, with 52 mummies dating to between 3,500 and 2,000 years ago. Biological anthropologist Muhammad Al-Tohamy Soliman of the National Research Center in Giza estimated the age at death for each individual, based on an examination of dental and skeletal development. Then the medical team pored over the scans. Discussing the images during weekly Skype calls, they identified cardiovascular tissue in nearly 85 percent of the mummies. To their surprise, 45 percent of these had definite or probable atherosclerosis—clear evidence that one ancient population suffered from the disease. “We were [also] a bit surprised by just how much atherosclerosis we found in ancient Egyptians who were young,” recalls team member James Sutherland, a radiologist at the South Coast Radiological Medical Group in Laguna Hills, Calif. “The average age of death was around 40.”

When their paper came out in the Journal of the American College of Cardiology in the spring of 2011, Finch contacted the team immediately, proposing a new explanation for the high levels of atherosclerosis detected in the study. The ancient Egyptians, Finch noted, were no strangers to pestilence and infection. Previous studies showed that many ancient Egyptians were exposed to a wide range of infectious diseases, including malaria, tuberculosis and schistosomiasis (an ailment caused by tiny parasitic worms found in contaminated water). APOE e4 carriers, with their enhanced immune systems, tended to survive many childhood infections. But they endured decades of chronically high inflammation in that pathogen-rich environment—inflammation now linked to several deadly diseases of old age, including atherosclerosis and Alzheimer's. Indeed, the arterial plaques that characterize atherosclerosis seem to accumulate during inflammation and wound healing in the vascular wall. “And while it might be pushing it to say the senile plaques of Alzheimer's are some form of scab, like the plaques on artery vessels, they have many of the same components,” Finch suggests.

Thomas and his colleagues asked Finch to join their team. Together they decided to gather more data, examining the cardiovascular tissues of ancient mummies from a wide range of cultures. The Egyptians in their first study likely came from affluent upper classes that could afford mummification: such individuals may have exercised rarely and dined frequently on high-calorie foods. So the team expanded the study to other, very different cultures. They examined existing CT scans of ancestral Puebloan mummies from Utah and century-old Unangan mummies from Alaska. In addition, they analyzed the scans they had taken of pre-Hispanic mummies from coastal Peru. Those individuals dated to as early as 1500 b.c.

In March 2013 the team published its findings in the Lancet. Among the 137 examined mummies, 34 percent had probable or definite atherosclerosis. Significantly, the scans revealed the disease in all four ancient populations, including the hunting-and-gathering Unangan people, who ate a largely marine diet. The findings clearly challenged the idea that atherosclerosis was a modern disease and pointed to another explanation. “The high level of chronic infection and inflammation in premodern conditions might have promoted the inflammatory aspects of atherosclerosis,” the team wrote.

Perhaps, Finch says, the ancient gene variant that ramped up our inflammatory response and boosted the chances of our survival to the age of reproduction—APOE e4—came with a steep, deferred cost: heart attacks, strokes, and other chronic diseases of aging. In fact, APOE e4 appears to be a classic case of something biologists call antagonistic pleiotropy, in which a gene has a strong positive effect on the young and an adverse impact on the old. “I think these are very intriguing ideas,” says Steven N. Austad, a biologist and gerontologist at the University of Alabama at Birmingham. “And what evidence we have supports them.”
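
A toy calculation makes the logic of antagonistic pleiotropy concrete. In the sketch below (all numbers invented, not drawn from any APOE study), fitness is approximated as expected lifetime offspring, and a hypothetical "e4-like" allele that lifts childhood survival while raising late-life mortality still comes out ahead, because its cost falls almost entirely after reproduction has ended.

```python
AGES = range(0, 80, 5)  # life modeled in five-year steps

def fecundity(age: int) -> float:
    """Offspring expected per five-year step while reproductive."""
    return 2.0 if 15 <= age < 45 else 0.0

def survival_to(age: int, child_surv: float, late_hazard: float) -> float:
    """Probability of surviving from birth to `age`."""
    p = 1.0
    for a in range(0, age, 5):
        if a < 15:
            p *= child_surv          # juvenile survival per step
        elif a < 45:
            p *= 0.97                # low adult mortality
        else:
            p *= 1.0 - late_hazard   # late-life mortality
    return p

def fitness(child_surv: float, late_hazard: float) -> float:
    """Expected lifetime offspring for a given survival schedule."""
    return sum(survival_to(a, child_surv, late_hazard) * fecundity(a)
               for a in AGES)

print(round(fitness(child_surv=0.90, late_hazard=0.10), 2))  # baseline allele
print(round(fitness(child_surv=0.95, late_hazard=0.30), 2))  # "e4-like" scores higher
# Note that late_hazard never affects fitness here, because fecundity is
# zero after age 45: exactly why late-acting costs escape selection.
```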

Refining immune response
Research also points to other gene variants that contributed to our longevity. At roughly the same time that H. sapiens emerged in Africa some 200,000 years ago, a second major APOE variant arose. This allele, known as APOE e3, enhanced health among adults in the 40- to 70-year-old range and helped to slow the aging process, and today it has a prevalence of between 60 and 90 percent in human populations. As Finch points out, APOE e3 carriers produce a less vigorous inflammatory response than those with the ancestral variant. Moreover, they appear better adapted to meat- and fat-rich diets. Generally speaking, they have lower blood cholesterol and are less prone to the diseases that strip the old of their vitality: coronary heart disease, cognitive decline and Alzheimer's. Indeed, carriers of the more recent variant enjoy life expectancies as much as six years longer than their APOE e4 neighbors. “APOE e3,” Finch notes, “may have been a factor in the evolution of long life spans.”

APOE is not the only gene linked to the evolution of human longevity, however. At U.C. San Diego, Ajit Varki, a professor of medicine, and his colleagues are investigating several other genes that may have undergone changes that boosted our chances of survival and extended our lives. Varki's research focuses on SIGLEC genes, which play key roles in host defense. These genes express proteins that straddle our cell membranes and act a little like sentries. Their function “is to recognize friends, not foes,” Varki explains. It is no easy matter. To fool these sentries, infectious pathogens evolve camouflage consisting of proteins that mimic those borne by “friends.”

In 2012 Varki and his team published a study in the Proceedings of the National Academy of Sciences USA that identified two key changes in these genes that date back 100,000 to 200,000 years or more and that honed our ability to fight off pathogens. One change produced a new human variant of the ancestral primate gene SIGLEC 17. This variant, however, was nonfunctional. A second event deleted the ancestral gene SIGLEC 13 entirely. To better understand these changes, Varki and his colleagues experimentally resurrected the proteins once expressed by SIGLEC 13 and 17. Both ancestral proteins, they discovered, had been “hacked” by pathogens responsible for two life-threatening infections in babies: group B Streptococcus and E. coli K1. So as natural selection began weeding out these compromised genes from our genome, the odds of survival rose in human infants.

Such findings add new fuel to the hypothesis that pumped-up immune systems played a key role in lengthening human lives. “Our immune systems went through a lot of changes,” Varki says. And as geneticists and biologists continue to investigate the uniquely human part of our genome, many are starting to look for other genetic variants and events that contributed to our long lives today.

Yet already the findings are giving some researchers pause for thought. Public health messages have long warned that lifestyle choices such as couch-potato evenings and calorie-rich diets are largely to blame for the high incidence of atherosclerosis, heart attacks and strokes. But the new research—particularly the studies on ancient mummies—suggests that the picture may not be quite so simple. Our DNA and an overcharged immune system may well contribute to the development of such diseases. “So maybe we have a little less control over atherosclerosis than we thought,” muses cardiologist Thompson. “Maybe our mental framework should be shifted.” And perhaps, he adds, researchers should be looking for undiscovered risk factors.

The new findings are also raising a fundamental question about human longevity. Can we, or should we, expect the trend toward longer lives to continue? Some scientists have predicted that babies born after 2000 in countries where life expectancy had already been high—including the U.S., Canada, the U.K. and Japan—will live to 100 years of age. Finch is quietly skeptical, however. The emerging trend toward obesity in many human populations and toward environmental deterioration brought about by climate change, he says, could well undercut human longevity and throw a major wrench into the works. “I think there is a reason to be cautious about that,” Finch concludes. “But time will tell.”