More stories

    Neanderthals were probably maggot-munchers, not hyper-carnivores

    Maggots in rotting meat could have been an important part of ancient diets (Chronicle/Alamy)
    Neanderthals may not have been the hyper-carnivores we thought they were. It has been claimed, based on the nitrogen isotope ratios in their bones, that our ancient relatives ate little besides meat. But these ratios can also be explained by a more balanced omnivorous diet that included a lot of maggots, as well as plant-based food.
    “Masses of maggots are these easily scoopable, collectible, nutrient-rich resource,” says Melanie Beasley at Purdue University, Indiana.
    There is lots of evidence that they were routinely eaten in many societies in the past, and they are still consumed in some places today, she says. Some reindeer hunters regard certain maggots as a treat that they actively cultivate, for instance, while casu martzu, a cheese that contains live maggots, is a delicacy in Sardinia.
    Nitrogen has two stable isotopes, nitrogen-14 and nitrogen-15. The lighter isotope is more likely to be lost from living organisms than the heavy one, so, as matter moves up food chains, the ratio of nitrogen-15 to nitrogen-14 increases.
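    In practice, such studies report these ratios using the standard delta notation, measured against the fixed isotope ratio of atmospheric nitrogen. A minimal statement of that convention is below; the enrichment figure quoted after it is a general rule of thumb from isotope ecology, not a value from this study.

    \[
    \delta^{15}\mathrm{N} = \left( \frac{\left(^{15}\mathrm{N}/^{14}\mathrm{N}\right)_{\text{sample}}}{\left(^{15}\mathrm{N}/^{14}\mathrm{N}\right)_{\text{air}}} - 1 \right) \times 1000
    \]

    Each step up a food chain typically raises δ15N by roughly 3 to 5 parts per thousand, which is why carnivore bone collagen reads “heavier” than herbivore collagen.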
    Looking at the isotope ratios in collagen inside fossil bones can therefore tell us about the diet of those animals, with carnivores having higher ratios than herbivores. But when researchers started looking at the ratios in the bones of Neanderthals, they found something surprising: even higher ratios than those seen in lions and hyenas. “So there became this narrative that Neanderthals were these hyper-carnivores very focused on big game hunting,” says Beasley.

    But many researchers don’t buy this idea. For one thing, the bones of Homo sapiens living in prehistoric times have similar ratios – and these humans couldn’t have survived on lean meat alone. “It’s actually physically not possible,” says Beasley. “You’ll die of what early explorers called ‘rabbit starvation’.”
    The issue is that if a person’s diet is too rich in protein, their body can’t mop up all the toxic breakdown products, such as ammonia.
    There is also now plenty of direct evidence that Neanderthals did eat plants, too, for instance from studies of their dental calculus. So why were their nitrogen-15 ratios so high?
    Back in 2017, John Speth at the University of Michigan suggested it could be because Neanderthals stored meat and ate it later in a rotten state. As meat rots, gases like ammonia are given off, which should result in nitrogen-15 enrichment.
    At the time, Beasley was applying to do research at the “body farm” at the University of Tennessee where human cadavers are studied as they decay to help with crime scene analysis. She realised she could test Speth’s idea alongside the forensic research – and while she was at it, she also looked at the maggots in the bodies.
    Together with Speth and Julie Lesnik at Wayne State University, Michigan, Beasley found that nitrogen isotope ratios do increase as muscle tissue rots, but only by a modest amount. There is, however, a much bigger jump seen in the maggots of various kinds feeding on the corpses.

    These are just initial results, but they show that eating a diet very high in meat isn’t the only possible explanation for the isotope ratios in Neanderthals and ancient Homo sapiens, says Beasley. She thinks those ratios are probably due to a combination of factors – the storage, processing and cooking of meat, as well as the consumption of maggots.
    “This is an exciting new study, and I think it goes a long way toward making sense of the strange results that have come out of isotope studies in Neanderthals and other Stone Age hominins over the last couple of decades,” says Herman Pontzer at Duke University in North Carolina.
    “I find the evidence here pretty convincing, that consumption of maggots and similar larvae explains the ‘hyper-carnivore’ signal we’ve been seeing in previous fossil isotope work,” he says.
    The work also adds to the evidence that a so-called palaeo diet should include rotten meat and maggots, says Beasley. “All the people who want to go true ‘palaeo’, they need to start thinking about fermenting their meat and letting the flies access them.”

    Topics: Neanderthals

    Homo naledi’s burial practices could change what it means to be human

    Shutterstock/vyasphoto
    From a young age, the inevitability and finality of death become a shaping force in our lives. Indeed, it could be said that our ability to recognise our eventual demise and the grief that comes with losing those close to us are core elements of what it means to be human. They have also led to symbolic practices that have deep roots in human culture.
    We have long assumed that Homo sapiens was the only human species to have gained an awareness of the mortality of living things. But as we report in “What were ancient humans thinking when they began to bury their dead?”, archaeologists are eager to question the idea that a deep and emotional response to death is our sole preserve.
    The most challenging of their claims is that ancient humans who were very unlike us developed death rituals. But evidence is mounting that Homo naledi, an ancient human from southern Africa with a brain one-third the size of your own, buried its dead at least 245,000 years ago. Exactly why these small-brained humans may have felt compelled to develop a culture of death is unclear, but one intriguing – if speculative – idea is that they did so to help youngsters come to terms with the loss of a group member.
    Much controversy surrounds the claim that H. naledi buried its dead, largely concerning the quality of the evidence. But since the mid-20th century, researchers have been busily narrowing the behavioural gap between our species and others, spearheaded by research showing that many animals have emotionally rich lives. Some even develop their own rituals when confronted with the death of community members. Throw in evidence that our ancestors were developing their own artistic culture at least 500,000 years ago and it is easier to accept that H. naledi was capable of developing its own burial traditions.

    Archaeologists are questioning whether a deep response to death is our sole preserve

    The provocative image of a grief-stricken H. naledi helping its young deal with loss forces as much of a rethinking about these ancient relatives as it does a reckoning of what it means to be human – and whether we are as special as we like to think.


    Triumphant images of women who climbed to new heights

    Ines Papert in Kyrgyzstan (Ines Papert)
    Most people would find a 1200-metre wall of ice on a mountain peak intimidating. But for decorated ice climber Ines Papert, scaling the peak of Kyzyl Asker – a remote mountain on the border between China and Kyrgyzstan – was a dream. It took three attempts before she and fellow climber Luka Lindič summited it in 2016 (pictured above), becoming the first known people to climb a precipitous route the pair dubbed “Lost in China”.
    Papert is one of more than a dozen female mountaineers whose daring expeditions to the world’s greatest peaks are featured in Mountaineering Women: Climbing through history by Joanna Croston.
    Elizabeth “Lizzie” Le Blond (The Martin and Osa Johnson Safari Museum, Chanute, Kansas)
    Another is mountaineer Elizabeth “Lizzie” Le Blond, photographed climbing a mountain in the Swiss Alps in 1889 in a full skirt (pictured above). Le Blond, who made 20 record-breaking ascents, also helped form the Ladies’ Alpine Club in 1907 to offer support to female mountaineers in this male-dominated sport.
    Lydia Bradey on the first female ascent of Zenith, Half Dome, Yosemite National Park (Steve Monks)
    Croston’s book also features Lydia Bradey, who was the first woman to climb several routes in California’s Yosemite National Park in the 1980s. She is pictured above midway up a route on the iconic face of Half Dome. In 1988, she became the first woman to summit Mount Everest without supplementary oxygen. The Tibetan name for Everest is Qomolangma, which means “goddess mother of the world”.
    Mountaineering Women: Climbing through history will be released in the UK on 7 August and internationally on 16 September.

    Topics: photography

    AI helps reconstruct damaged Latin inscriptions from the Roman Empire

    A Roman temple in Ankara, Turkey (PE Forsberg/Alamy Stock Photo)
    Latin inscriptions from the ancient world can tell us about Roman emperors’ decrees and enslaved people’s thoughts – if we can read them. Now an artificial intelligence tool is helping historians reconstruct the often fragmentary texts. It can even accurately predict when and where in the Roman Empire a given inscription came from.
    “Studying history through inscriptions is like solving a gigantic jigsaw puzzle, only this is tens of thousands of pieces more than normal,” said Thea Sommerschield at the University of Nottingham in the UK, during a press event. “And 90 per cent of them are missing because that’s all that survived for us over the centuries.”
    The AI tool developed by Sommerschield and her colleagues can predict a Latin inscription’s missing characters, while also highlighting the existence of inscriptions that are written in a similar linguistic style or refer to the same people and places. They named the tool Aeneas in honour of the mythical hero, who, according to legend, escaped the fall of Troy and became a forebear of the Romans.
    “We enable Aeneas to actually restore gaps in text where the missing length is unknown,” said Yannis Assael at Google DeepMind, a co-leader in developing Aeneas, during the press event. “This makes it a more versatile tool for historians, especially when they’re dealing with very heavily damaged materials.”
    The team trained Aeneas on the largest ever combined database of ancient Latin texts that machines can interact with, including more than 176,000 inscriptions and nearly 9000 accompanying images. This training allows Aeneas to suggest missing text. What’s more, by testing it on a subset of inscriptions of known provenance, the researchers found that Aeneas could estimate the chronological date of inscriptions to within 13 years – and even achieve 72 per cent accuracy in identifying which Roman province an inscription came from.

    “Inscriptions are one of our most important sources for understanding the lives and experiences of people living in the Roman world,” says Charlotte Tupman at the University of Exeter in the UK, who wasn’t involved in the research. “They cover a vast number of subject areas, from law, trade, military and political life to religion, death and domestic matters.”
    Such AI tools also have “high potential to be applied to the study of inscriptions from other time periods and to be adapted for use with other languages,” says Tupman.
    During testing with inscriptions that were deliberately corrupted to simulate damage, Aeneas achieved 73 per cent accuracy in restoring gaps of up to 10 Latin characters. That fell to 58 per cent accuracy when the total missing length was unknown – but the AI tool shows the logic behind the suggestions it makes, so researchers can assess the validity of the results.
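    The evaluation described above can be pictured with a short sketch. The code below is illustrative only: it is not the Aeneas code or its API, and the restore() stub is a placeholder for whatever restoration model a researcher actually plugs in. It simply shows how deliberately corrupting known inscriptions and scoring character-level accuracy on the hidden spans could work.

    import random

    def corrupt(text, max_gap=10):
        """Blank out a random run of up to max_gap characters, returning the
        damaged text, the hidden ground-truth characters and the gap position."""
        gap_len = random.randint(1, min(max_gap, len(text)))
        start = random.randint(0, len(text) - gap_len)
        truth = text[start:start + gap_len]
        damaged = text[:start] + "#" * gap_len + text[start + gap_len:]
        return damaged, truth, start

    def restore(damaged, start, gap_len):
        """Placeholder restorer: a real system would predict the missing
        characters from the surrounding context; here we simply guess 'e'."""
        return "e" * gap_len

    def evaluate(inscriptions, trials=1000):
        """Character-level restoration accuracy over randomly corrupted samples."""
        correct = total = 0
        for _ in range(trials):
            text = random.choice(inscriptions)
            damaged, truth, start = corrupt(text)
            guess = restore(damaged, start, len(truth))
            correct += sum(g == t for g, t in zip(guess, truth))
            total += len(truth)
        return correct / total

    if __name__ == "__main__":
        sample = ["imp caesar divi f augustus pontifex maximus",
                  "senatus populusque romanus"]
        print(f"character accuracy: {evaluate(sample):.2%}")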
    When nearly two dozen historians tested the AI tool’s ability to restore and attribute deliberately corrupted inscriptions, those working with the AI outperformed both historians working alone and the AI on its own. Historians also reported that comparative inscriptions identified by Aeneas were helpful as potential research starting points 90 per cent of the time.
    “I think it will speed up the work of anyone who works with inscriptions, and especially if you’re trying to do the equivalent of constructing wider conclusions about local or even empire-wide patterns and epigraphic habits,” says Elizabeth Meyer at the University of Virginia. “At the same time, a human brain has to look at the results to make sure that they are plausible for that time and place.”

    “Asking a general-purpose AI model to assist with tasks in ancient history often leads to unsatisfactory results,” says Chiara Cenati at the University of Vienna in Austria. “Therefore, the development of a tool specifically designed to support research in Latin epigraphy is very welcome.”
    The “dream scenario” is to enable historians “to have Aeneas at your side in a museum or at an archaeological site”, said Sommerschield at the press event. Aeneas is now freely available online.


    What were ancient humans thinking when they began to bury their dead?

    Westend61 GmbH/Alamy
    Some people will tell you that Homo naledi was a small-brained hominin with some big thoughts. Two years ago, a team led by Lee Berger at the University of the Witwatersrand, South Africa, concluded that H. naledi – a species that lived around 335,000 to 245,000 years ago and had a brain about one-third the size of yours – invented a complex ritual that involved burying its dead in a deep and difficult-to-access cave chamber.  
    This idea didn’t go down well: all four of the anonymous researchers asked to assess its merit were sceptical. But Berger and his colleagues were undeterred. Earlier this year, they published an updated version of their study, offering a deeper dive into the evidence they had gathered from the Rising Star cave system in South Africa. The approach paid off: two of the original reviewers agreed to reassess the science – and one was won over. “You rarely see that in peer review,” says John Hawks at the University of Wisconsin-Madison, a member of Berger’s team.  
    Many other researchers, however, are still wary. “I’m just not convinced by any of it,” says Paul Pettitt at Durham University, UK. To appreciate why, it is necessary to explore how other ancient hominins interacted with the dead. Doing so can help us figure out which species carried out burials, how ancient the practice is and what it says about the minds and motivations of those doing it. Considering this also reveals why, if H. naledi really did bury its dead, that would fundamentally challenge our understanding of early hominin cognition and behaviour.  

    There is one archaeological site that has much in common with Rising Star: Sima de los Huesos (the “pit of bones”) in northern Spain. There, researchers have uncovered the remains of 29 hominins, from a population thought to be ancestral to Neanderthals, at the bottom of a vertical shaft within a cave. The consensus is that the Sima hominins, who lived between 430,000 and 300,000 years ago, died elsewhere and that their bodies were then dropped into the pit. If so, this represents the oldest clear evidence for some sort of funerary behaviour.
    Such an early date may seem surprising, but in context, it makes sense. We know that chimpanzees show an interest in dead group members, grooming their fur and even cleaning their teeth. “If we have chimpanzees behaving this way, then we might expect similar behaviour deep in our evolutionary past,” says Pettitt. However, the funerary behaviour on show at Sima appears more sophisticated than anything chimps do, says María Martinón-Torres at the National Human Evolution Research Centre (CENIEH) in Spain. “They have chosen a place to put the dead.” What’s more, the excavation also unearthed a stone hand axe, which is sometimes interpreted as a funerary offering – although it could simply have been in a pouch worn by one of the hominins found there, says Pettitt.  

    Such elaborate treatment of the dead may have been evolutionarily beneficial. At some point in prehistory – perhaps when brains reached a certain size – hominins must have become aware of their own mortality, says Pettitt. In a 2023 paper, he suggested that complex funerary behaviour might then have arisen to mitigate personal anxiety about death by bringing the community together when a group member died. This scenario could explain what happened at Sima, given that the average brain size of these hominins was 1237 cubic centimetres – only about 100 cubic centimetres less than the average modern human.  
    The idea that members of Homo naledi buried their dead is contentious because their brains were so small (Imago/Alamy)
    Others see something more sinister at Sima. Mary Stiner at the University of Arizona points out that many of the skeletons are from adolescents or young adults. “That’s an age group in which individuals choose to take risks and are more vulnerable due to low experience,” she says. Moreover, there are signs on the bones that some of the Sima hominins died violently. Stiner thinks the skeletons may represent youngsters who left their family group, strayed into hostile territory and came to a grisly end – their bodies tossed into the pit by their killers, perhaps to hide the evidence. But as Pettitt points out, that would require an unusually large number of adolescents making the same mistakes and meeting a similar fate.   
    For now, it is difficult to know exactly how to interpret the Sima site. Fortunately, more evidence may soon be available. Since 2021, Nohemi Sala at CENIEH and her colleagues have been exploring the archaeological record of funerary behaviour through a project known as DEATHREVOL. Sala says the research suggests that there are other similarly ancient sites in Europe that may preserve evidence of the same behaviour recorded at Sima – although she won’t name them until the work is published. “There are four or five candidates to explore these patterns,” she says. “It’s more than just Sima.”  
    Neanderthal burials
    Eventually, hominins like those at Sima gave rise to the Neanderthals, who had different ways of treating the dead. Some of the clearest evidence for this comes from Shanidar cave, a site in northern Iraq where, since the mid-20th century, the remains of at least 10 Neanderthals have been discovered. The oldest dates back about 75,000 years, making it among the oldest known Neanderthal burials.
    Another set of remains was pivotal to us recognising in the late 20th century that Neanderthals shared our humanity, because pollen around this individual’s bones suggested that they had been buried with flowers. Today, while nobody doubts Neanderthals’ humanity, few archaeologists buy the “flower burial” idea. Recent excavations at Shanidar point to an alternative explanation for the pollen. Chris Hunt at Liverpool John Moores University, UK, and his colleagues think the body may have been placed in the ground and then buried under a pile of brushwood rather than dirt. They note that some of the pollen around the skeleton comes from plants with prominent spikes, possibly added to deter scavengers.
    Nevertheless, the Shanidar burials are revealing. One was of a man who managed to live with severe injuries to his face, shoulder and arm. Stiner is among several researchers who think he would have required help to do so, suggesting Neanderthals cared for and valued each other as individuals. If they did, then death wasn’t merely the loss of a pair of hands for sourcing food; it was the loss of someone with a unique personality who would be missed – leading to a new motivation behind funerary behaviour. “These societies were bound by love and affection,” says Stiner.  
    Five of the skeletons at Shanidar hint at something else. They were all buried in the same spot in the shadow of a prominent landmark – a 2-metre-high rock inside the cave – over the course of a few decades to a few millennia. Hunt and his colleagues think this might be a sign that Neanderthals tied meaning to landmarks in their environment. More speculatively, burying the dead here may even have played a role in legitimising the right of the living to the nearby land and its resources. Our species can have that sort of relationship with land, says Emma Pomeroy at the University of Cambridge, who was also involved in the recent excavations at Shanidar. “I think it’s very interesting to think about whether Neanderthals had a similar attitude to the landscape.”  
    Shanidar cave in Iraq contains some of the oldest and most convincing Neanderthal burials (Matyas Rehak/Alamy)
    Mysteries remain. A big one is why only a few of the Neanderthals who lived around Shanidar were buried in the cave. “If this was something that hominins did a lot, the caves would be chock-a-block with bodies,” says Hawks. Evidence from elsewhere indicates that other Neanderthal deaths may have been honoured with different funerary treatments, including ritual cannibalism – but for some as-yet-unfathomable reason, very few Neanderthals ended up interred in the cave. Another question is whether Neanderthals devised the idea of burial themselves or learned it from our species, Homo sapiens, whom they met around the time of the Shanidar burials.  
    What we do know is that our species began burying its dead around 120,000 to 100,000 years ago. And some early H. sapiens burials appear to differ from those of Neanderthals by the inclusion of grave goods. For instance, a body in Qafzeh cave in Israel appears to have been buried with red deer antlers clasped to its chest – although other interpretations are possible. “Perhaps the antler was used to dig the grave and it’s just a fortuitous association,” says Pettitt. We don’t know how common early grave goods were, in part because human burials were so rare before 28,000 years ago. Neither do we know their exact significance, although in later burials they are generally seen as reflecting things like the status and occupation of the deceased.  
    The graves of young children
    Rare though they are, early human burials reveal intriguing signs of a pattern. In 2021, a team including Martinón-Torres and Michael Petraglia, now at Griffith University, Australia, described an excavation at Panga ya Saidi cave in Kenya in which they had unearthed the 78,000-year-old burial of a toddler they named Mtoto. The researchers noted that Mtoto is the earliest of three potential H. sapiens burials in Africa, which date to between 78,000 and 68,000 years ago. All three involved young children.   
    Childhood mortality was probably relatively high in these early communities, says Petraglia. “We don’t have the evidence to say for sure, but we suspect so because childhood mortality is pretty high in hunter-gatherer societies.” Even so, some children’s deaths might have been “particularly painful”, says Martinón-Torres, motivating early communities to commemorate them with what was, at the time, an unusual funerary ritual: burial. Pettitt has explored this idea. He distinguishes “good deaths”, which usually occur in old age, from “bad deaths”, which occur unexpectedly and often involve children. The latter may have provided an impetus for people to perform special funerary rites, he suggests, which might help explain burials like Mtoto’s.  
    Another clue to the thinking of these Stone Age people comes from the fact that Panga ya Saidi cave was a place of human habitation on and off for thousands of years. This suggests a decision was made to inter Mtoto’s small body in close proximity to the community’s living space. “If you bury someone you love, in a way, you don’t want them to go,” says Martinón-Torres. Placing them in an easy-to-visit location may help maintain a close connection, she adds.  

    So, what does all this tell us about whether H. naledi buried its dead?   
    There are certainly echoes of other sites in Rising Star. The idea that, hundreds of thousands of years ago, hominins placed their dead deep inside a cave draws parallels with Sima de los Huesos. The suggestion that H. naledi repeatedly returned to the same site to inter bodies seems to mirror the situation at Shanidar cave. And the discovery of a crescent-shaped stone near the fossilised hand of one H. naledi skeleton – a possible grave good – looks like behaviour seen at sites like Qafzeh.  
    But the burial hypothesis also seems all wrong. The biggest stumbling block is the size of H. naledi’s brain, which, at an average of 513 cubic centimetres, was tiny. For a start, it raises doubts about whether individuals really were aware of their own mortality, inventing elaborate funerary rituals to come to terms with this revelation. There is also no evidence yet that the species cared for its sick, a potential sign that group members were valued as individuals whose deaths were mourned. And although youngsters are overrepresented in the Rising Star cave – potentially consistent with Pettitt’s “bad death” idea – the chamber in which the bones were found doesn’t seem to be an easy-to-visit location that would allow the living to maintain a connection with the dead. “It’s quite anomalous, but also fascinating,” says Stiner.  
    Red deer antlers in a grave at Qafzeh, Israel, may have a symbolic meaning (Universal History Archive/Shutterstock)
    There are two ways to interpret this puzzle. One is to look for non-burial scenarios that could explain the accumulation of the H. naledi skeletons. For instance, in 2021, researchers reported finding the remains of 30 baboons, nine of them mummified, in a cave chamber in South Africa. It seems that the primates had used the cave as a sleeping site over many years, with some occasionally dying there and their bodies gradually accumulating. Perhaps H. naledi used Rising Star in a similar way. “We need to consider whether that might be a factor,” says Pomeroy.   
    The other, more radical, option is to ask whether our understanding of how and why hominins developed funerary traditions requires a rethink. “Spirituality, the idea of self-awareness and mortality – all could have arisen many times independently,” says Berger. Hawks points out that analysis of H. naledi skeletons suggests that, like us, they had a long childhood – and that could be the key. “Extended childhoods have an adaptive purpose: they enable kids to integrate into social groups in a way that isn’t sexually competitive,” he says. They may also have encouraged members of H. naledi to develop funerary customs to help their youngsters understand the death of group members. “We have funerals to explain to kids what just happened,” says Hawks.   
    Unfortunately, gathering evidence to confirm the burial idea is more difficult than it might seem. Talk of burial may conjure up images of modern cemeteries, but Stone Age graves aren’t like that. “They’re not 6 feet under in well-constructed holes,” says Hawks: the oldest burial pits were usually shallow depressions in the floor. If hominins then returned to inter more dead, they could easily disturb earlier graves and create a jumble of bones that is difficult to interpret as a set of burials.   

    The good news, say Berger, Hawks and their colleagues, is that there is plenty more untouched material at Rising Star, which could, in the future, strengthen their burial hypothesis. If they can do that, they may well find a surprisingly receptive audience. As we have seen, ancient burials are open to interpretation, conclusions are provisional and many of the archaeologists working on these sites would like nothing more than new discoveries that challenge their ideas about the prehistory of funerary behaviour.   
    “It’s sometimes suggested that the scientific community just doesn’t want to believe that a small-brained hominin would be capable of symbolic treatment of the dead,” says Pomeroy. “That couldn’t be further from the truth. We’d be so excited – if there was good evidence.” 

    Topics: human evolution / ancient humans

    Neanderthal groups had their own local food culture

    An illustration of a Neanderthal group preparing food (Luis Montanya/Marta Montanya/Science Photo Library)
    Neanderthals may have had traditional ways of preparing food that were particular to each group. Discoveries from two caves in what is now northern Israel suggest that the residents there butchered the same kinds of prey in their own distinctive ways.
    Modern humans, or Homo sapiens, weren’t the first hominins to prepare and cook food. There is evidence that Neanderthals, for example, which inhabited Europe and Asia until about 40,000 years ago, used flint knives to butcher what they caught, cooked a wide range of animals and spiced up their menu with wild herbs.

    To learn more about Neanderthal food culture, Anaëlle Jallon at the Hebrew University of Jerusalem and her colleagues examined evidence at the caves of Amud and Kebara in northern Israel.
    These sites, which are just some 70 kilometres apart, provide a unique opportunity to examine local cultural differences. Stone tools, food remains and hearths found at each site reveal that Neanderthals occupied both caves during the same time period, probably in winter.
    “You find the same species of animals to hunt and it’s more or less the same landscape,” says Jallon. “It will be the same kind of weather, and Neanderthals at both ate mostly gazelles and some fallow deer that they complemented with a few bigger animals like boar or aurochs.”

    There are a few differences, though. For example, bones reveal that more large prey was hunted at Kebara, and more kills were carried back to that cave to be butchered.
    Jallon and her colleagues used microscopes to inspect bones from layers of sediment at the two sites from between 50,000 and 60,000 years ago, examining the cuts slashed in them with stone tools.
    They found that even though the flint tools used were similar at both sites, the patterns of cuts were different. “The cuts tend to be more variable in their width and depth in Kebara, and in Amud they are more concentrated in big clusters and they overlap each other more often,” says Jallon.
    To assess whether the differences could be down to butchering different prey, the researchers also looked specifically at long bones from gazelles found at both sites. These showed the same differences.
    “We are talking about two groups who live very close and, let’s say, both cutting up some beef – but in one site they seem to be cutting closer to the bone, getting all the meat off,” says Ceren Kabukcu at the University of Liverpool, UK.
    Previous research that looked at cut marks on bones from more recent societies suggests that the kind of variation seen in Neanderthal butchery isn’t down to a lack of expertise, but to a difference in technique.

    Jallon thinks the contrast is best explained by deliberate butchery choices. It could be that Neanderthals at Amud made their meat harder to process by, for example, drying it or letting it hang before cooking, she says, which would have meant they needed more cuts to get through it or a larger team of people to butcher the meat.
    “In behaviour that is as opportunistic as butchering, you would expect to find the most efficient way to butcher something to get the most out of it, but apparently, it was more determined by social or cultural factors,” says Jallon. “It could be due to group organisation or practices that are learned and transmitted from generation to generation.”
    “The fact that there might be differences and some nuance on how technology is used in daily life is not entirely shocking,” says Kabukcu. “I think as this question is investigated, we might see more and more nuance at several sites of the Middle Palaeolithic.”
    It isn’t known whether the caves were occupied at the same time or if disparate groups might have been in contact with each other. “It is a possibility that it was at the same exact time, but it’s also possible it was hundreds of years apart or more. We don’t have the resolution to know that,” says Jallon.
    But the pattern of tightly clustered cut marks found at Amud is similar in the oldest layer and in the younger layers, she says, so the cave might have been used by returning groups that maintained the same butchery traditions for centuries.

    Topics: Neanderthals / ancient humans

    Provocative new book says we must persuade people to have more babies

    A large population may enable innovation and economies of scale (Philippe Montigny/iStockphoto/Getty Images)
    After the Spike by Dean Spears and Michael Geruso (Bodley Head (UK); Simon & Schuster (US))
    Four-fifths of all the humans who will ever be born may already have been born. The number of children being born worldwide each year peaked at 146 million in 2012 and has been falling overall ever since. This means that the world’s population will peak and start to fall around the 2080s.
    This fall won’t be gradual. With birth rates already well below replacement levels in many countries including China and India, the world’s population will plummet as fast as it rose. In three centuries, there could be fewer than 2 billion people on Earth, claims a controversial new book.
    “No future is more likely than that people worldwide choose to have too few children to replace their own generation. Over the long run, this would cause exponential population decline,” write economists Dean Spears and Michael Geruso in After the Spike: The risks of global depopulation and the case for people.
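    To put the book’s claim of exponential decline into round numbers (an illustrative back-of-the-envelope calculation, not a figure taken from the authors): if global fertility settled at about 1.8 births per woman against a replacement level of roughly 2.1, each generation would be about 0.86 times the size of the one before, and ten 30-year generations span three centuries:

    \[
    N_{300\,\mathrm{yr}} \approx N_{\mathrm{peak}} \times \left( \frac{1.8}{2.1} \right)^{10} \approx 0.21\, N_{\mathrm{peak}}
    \]

    On that arithmetic, a peak of around 10 billion people would shrink to roughly 2 billion, the same order of magnitude as the scenario described above.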
    This, you might think, could be a good thing. Won’t it help solve many environmental issues facing us today? No, say the authors. Take climate change: their argument isn’t that population size doesn’t matter, but that it changes so slowly that other factors such as how fast the world decarbonises matter far more. The window of opportunity for lowering carbon dioxide emissions by reducing population has largely passed, they write.
    Spears and Geruso also make the case that there are many benefits to having a large population. For instance, there is more innovation, and economies of scale make the manufacture of things like smartphones feasible. “We get to have nice phones only because we have a lot of neighbors on this planet,” they write.
    So, in their view, our aim should be to stabilise world population rather than letting it plummet. The problem is we don’t know how, even with the right political will.

    As we grow richer, we are more reluctant to abandon career and leisure opportunities to have children

    While some government policies have had short-term effects, no country has successfully changed long-term population trends, argue the authors. Take China’s one-child policy. It is widely assumed to have helped reduce population growth – but did it? Spears and Geruso show unlabelled graphs of the populations of China and its neighbours before, during and after the policy was in place, and ask the reader which is China. There is no obvious difference.
    Attempts to boost falling fertility rates have been no more successful, they say. Birth rates jumped after Romania banned abortion in 1966, but they soon started to fall again. Sweden has tried the carrot rather than the stick by heavily subsidising day care. But the fertility rate there has been falling even further below the replacement rate.
    All attempts to boost fertility by providing financial incentives are likely to fail, Spears and Geruso argue. While people might say they are having fewer children because they cannot afford larger families, the global pattern is, in fact, that as people become richer they have fewer children.
    Rather than affordability being the issue, it is more about people deciding that they have better things to do, the authors say. As we grow richer, we are more reluctant to abandon career and leisure opportunities to have children. Even technological advances are unlikely to reverse this, they say.
    On everything other than the difficulty of stabilising the population, this is a relentlessly optimistic book. For instance, say the authors, dire predictions of mass starvation as the world’s population grew have been shown to be completely wrong. The long-term trend of people living longer and healthier lives can continue, they suggest. “Fears of a depleted, overpopulated future are out of date,” they write.
    Really? Spears and Geruso also stress that the price of food is key to determining how many go hungry, but fail to point out that food prices are now climbing, with climate change an increasing factor. I’m not so sure things are going to keep getting better for most people.
    This book is also very much a polemic: with Spears and Geruso labouring their main points, it wasn’t an enjoyable read. That said, if you think that the world’s population isn’t going to fall, or that it will be easy to halt its fall, or that a falling population is a good thing, you really should read it.


    Evolution has made humans both Machiavellian and born socialists

    David Oliete
    Nearly 2 million years ago, one of our hominin ancestors developed bone cancer in their foot. The fate of this individual is unknown, but their fossilised remains leave no doubt that cancer has been a part of our story for a very long time. It is also clear that, when threatened by our own normally cooperative cells turning against us, we evolved an immune system to help us identify and deal with the enemy within.
    But treacherous cancer cells weren’t the only internal threat our ancestors faced. As hypersocial beings, their survival was also jeopardised by selfish individuals attempting to subvert the group – and capable of unravelling society, just as a cancer eventually kills its host. I am interested in understanding how we adapted to this threat. At the heart of the story is this question: is human nature selfish or altruistic, competitive or cooperative? Are we essentially cancers, tamed by culture, or more like healthy cells in the human body, working together for mutual success?
    People have debated this for centuries and continue to do so, citing research in primatology, anthropology, psychology and economics to defend their points. The answer has profound implications for how we aim to structure society. If we are born altruists, then institutions and hierarchies are unnecessary. But if selfishness prevails, strong control is essential. To me, both extremes are unconvincing. To understand why, we must appreciate the circumstances under which humanity has evolved. Determining how our ancestors confronted brutish selfishness doesn’t just tell us about our own social past – it can also help us inoculate today’s societies against the threat from within.

    Look at the animals to which we are most closely related and you see that each species has its own distinct set of social structures. Among gorillas, for example, groups of unmated males are typically led by aggressive alpha males. Mated gorillas sometimes live in groups of males and females, but more often it is the stereotypical silverback with a harem of females – a group that has its own hierarchy. Chimpanzees and bonobos also display dominance hierarchies, with a lot of emphasis placed on female social rank, particularly among bonobos. Despite the wide variation in sociality among these species, one thing is consistent: where there is social rank, aggressive dominance is the winning attribute. If an alpha can successfully defend resources, whether territory, food or mates, it can dominate a primate society. Access to more resources translates into having more surviving offspring than others, which is the only measure of success for evolution by natural selection.
    Human self-domestication
    Among our own ancestors – members of the genus Homo – the story is different. Research in anthropology and primatology suggests that, as early people evolved more complex social structures, they did something unseen in other primates: they domesticated themselves. Once they had the cognitive sophistication necessary to create weapons, along with the intelligence required to form alliances, they could fight the large, angry dominants that ruled over their social groups. The primatologist Richard Wrangham at Harvard University argues that this profoundly shaped human society because, along with eliminating the alphas, our ancestors also selected against the human trait of aggression. As a result, humans became more cooperative, and their societies became more egalitarian.

    But if that is the case, how do we explain the undeniable and growing inequality in today’s societies, where huge amounts of power and money are concentrated among a small number of people, with the problem particularly pronounced in major economies such as the US, the UK and China? Some researchers argue that humans are not egalitarian by nature, but that living in small, close-knit groups of hunter-gatherers – as people did before the dawn of agriculture – suppressed our tendencies to form dominance-based hierarchies. They see a U-shaped curve of human egalitarianism. The point we started from – which looked a lot like the social structures we see in other great apes – is where we have ended up again, with the middle of the U showing a brief flirtation with social equality.
    If human nature were entirely cooperative then state control wouldn’t be required to prevent freeloading (Zoonar GmbH/Alamy)
    I agree that we aren’t naturally egalitarian. In fact, I am not convinced that human societies were ever egalitarian. As anthropologists point out, even living hunter-gatherers have some brutal practices, such as infanticide. But, for me, the explanation for our current unequal circumstances lies not in our ancestors having selected against aggression, but in how the elimination of despotic alpha males allowed other, arguably more insidious people to succeed.
    Once humanity was free of the strong grip of strict dominance hierarchies led by alpha males, success in social groups would have become more about skilful manoeuvring within communities. This led to the rise of a particular kind of social intelligence called Machiavellian intelligence, which entails the cunning manipulation of others. In the language of evolutionary biology, we have a cooperation dilemma: there are situations where it is in our interest to work with others, and there are situations where it is not. And, as anyone who has watched an episode of The Traitors will be aware, the need to pull together and the need to betray can come into conflict. As a result, overt rivalry was superseded by what I call “invisible rivalry” – the ability to hide selfish, competitive or exploitative intentions while maintaining the appearance of a cooperative nature. In other words, we evolved to compete in a cooperative world.
    The social brain
    Support for this idea comes from the size of the human brain. All primates have large brains relative to their body size, and ours is exceptionally big. The social brain hypothesis suggests that these large brains evolved to help individuals manage their unusually complex social systems. Of course, cooperation is part of this, but it can’t be the whole story. Consider ants, which, in terms of numbers and pervasiveness, are probably the most successful group of species on Earth. They are eusocial, which means they cooperate so fully that they seem to act as a single organism. Yet their brains are tiny, and everything they need to work together is programmed within them genetically. So, you don’t necessarily need a big brain to cooperate – but you do need one to compete strategically. Indeed, research suggests that social competition is what best explains the evolution of our enormous brain compared with the big brains of other primates.
    To paraphrase Aristotle, we are political animals – not merely social ones. We strategise within our societies to maximise our success, whether that is defined in terms of money, power, mating success, hunting prowess or any of the other qualities associated with prestige around the world. To do so effectively, we evolved to not just be smart enough to cooperate, but to cooperate selectively – and to betray others when it suits us, or even just when we can get away with it.

    Studies by economists and psychologists illustrate this. For example, in one set of experiments, participants were paired in a cooperation game in which one person was given $10 and the choice to share it with the other (or not). A lot of research shows that in these circumstances, people generally give some money to their partner, often splitting the pot equally, even when there is no obvious punishment for betraying them. But this time, the researchers gave some participants another option: they could take less money and leave the game without their partner ever knowing that they had been involved in a cooperation game. About one-third of participants took this option. It was as if they were happy to pay to have their betrayal left unknown.
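    A rough sketch of the payoff structure in that experiment may help make the choice concrete. The exact exit payment is not given above, so the $9 below is a hypothetical placeholder, and the code is an illustration rather than a reconstruction of the researchers’ design.

    ENDOWMENT = 10  # dollars given to the first player

    def payoff(choice, amount_shared=0, exit_payment=9):
        """Return (dictator, partner) payoffs for each possible move."""
        if choice == "share":   # give some of the $10 to the partner
            return ENDOWMENT - amount_shared, amount_shared
        if choice == "keep":    # keep everything; the partner learns a game was played
            return ENDOWMENT, 0
        if choice == "exit":    # take less money; the partner never knows
            return exit_payment, 0
        raise ValueError(choice)

    # Exiting pays strictly less than keeping ($9 < $10), yet about a third of
    # participants chose it -- effectively paying to keep their non-sharing hidden.
    for move in ("share", "keep", "exit"):
        print(move, payoff(move, amount_shared=5))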
    Experiments like this tell us a lot about the human psyche. In social interactions, we often need to be able to predict what others around us are going to do – to learn where to place trust effectively, to win trust when we need it and to hide betrayals of trust on our own part. These abilities require empathy, emotion, language and, perhaps, as some of my colleagues argue, consciousness. Yet those same abilities, and that same intelligence, have a dangerous downside. Our evolved proclivity for maximising resources leads us to exploit those around us – and some people are so effective at deception that they risk damaging their societies. Modern, extreme inequality is an example of this process in action. So too are past political upheavals leading to degradation of the rule of law – and sometimes the fall of civilisations. The Roman Republic, for example, collapsed because of a tremendous internal struggle for power, culminating in Julius Caesar’s Machiavellian machinations, eventually leading to autocracy.
    Religion is one institution that societies use to promote cooperation (Adam Guz/Getty Images Poland/Getty Images)
    So, our dual cooperative-competitive nature means that we face an enemy within that may bring down society. And this is where the analogy with cancer arises. Humanity’s long history of living with the disease means we have evolved biological mechanisms to reduce the risk it poses. Many reactions at the cellular level, including attacks by immune cells and programmed cell death, evolved to help our bodies fight off cancers, as well as other, external threats to our survival. It is this stalwart immune system that explains why, although mutations occur all the time and we are frequently exposed to viruses and bacteria, these often don’t lead to symptoms or illness. Similarly, the threats to our social groups posed by the evolution of invisible rivalry led us to develop practices, behaviours and institutions to maximise cooperation and thwart our Machiavellian tendencies. In my new book, Invisible Rivals: How we evolved to compete in a cooperative world, I call this our cultural immune system.
    Religion is one institution that can function in this way. Religious teaching can promote cooperation among everyone who practises it – and this is one possible reason that the Golden Rule, often summed up as “treat others as you would like to be treated”, is found independently in scriptures across the world. People who believe these scriptures – who internalise them, as anthropologists say – are more likely to help fellow members of their group.
    Everywhere anthropologists look, they find other practices and institutions that bolster cooperation at the local level. In cultures that rely on fishing, there are strong norms against over-fishing, which would deplete the stock for everyone. Where people are dependent on hunting, there are strict rules about how meat is shared and who gets credit for a kill. The Maasai people of Kenya and Tanzania have a relationship framework called osotua, rooted in need-based sharing partnerships and relying on mutual help in hard times. For example, if someone needs cattle because theirs have all died, another member of the group will help, not because they get anything directly in return, but simply because their neighbour’s needs are greater at that time. This creates a special bond – osotua translates as “umbilical cord” – and treachery is rare because the bond is seen as sacred.
    The Maasai people have a system called osotua whereby they give cattle to others in need (Siegfried Modola/Getty Images)
    Across the world, social norms that guide behaviours have evolved, and they have been refined over thousands of years of trial and error through cultural evolution. However, just as cancers find ways to evade our immune systems, so some individuals use their Machiavellian intelligence to subvert the group’s social norms for their own benefit. This is trickier to do in small-scale societies where people all know each other, making rule-breakers easier to detect and punish. But as societies grew over the past 10,000 years, so did opportunities to act selfishly. Agricultural networks, cities and, finally, nation states made deception much easier to pull off, because it is easy to cheat more people without getting caught in a group where it is impossible to know everyone personally.
    Taming our Machiavellian nature
    It is this lethal combination of opportunity and invisible rivalry that makes the question of whether humans are cooperative or competitive so relevant today. To fight the enemy within society, we need to understand that both traits are in our nature, and that we evolved to apply whichever suits us best. Thinking that we are either one or the other leaves us vulnerable to facile arguments about how we should structure society. If we are purely selfish, it follows that society should focus on heavy policing and punishment of freeloaders, including those in power. But believing that we are intrinsically altruistic is equally detrimental because it risks ignoring the threat posed by rampant self-interest.
    Suppressing humanity’s Machiavellian side is certainly harder in large-scale societies. But there are basic ways that we can boost the cultural immune system, much like how we can improve our biological immune systems through healthy lifestyles and vaccination. The key, I believe, is to learn more about the social norms that small-scale societies have evolved to help them thrive and stave off opportunistic cheaters and then use this knowledge to create policies that promote cooperation at a higher level. For example, within our own communities, we can look to cultures that promote systems like need-based transfers and others that have found ways to share resources more equitably.

    But this isn’t going to happen until we first recognise the problem that invisible rivalry poses. In my view, the best way to do that is through education. We are all part of the cultural immune system. If we understand our evolutionary heritage, we will be alert to the danger that freeloaders pose to society and place our trust more discerningly – much as the body’s defence system learns to recognise the agents associated with cancers and other diseases to deal with them. Crucially, we also need to recognise that cooperation is best for everyone in the long term.
    A small proportion of people at the most competitive end of the spectrum will always try to game society. We must work together to stay one step ahead of humanity’s opportunistic nature. Without beliefs, norms and a proper understanding of human nature, we are at the mercy of our selfish biological heritage. Evolution has made us this way, but we can learn to overcome it.

    Topics: psychology / human evolution