More stories


    What were ancient humans thinking when they began to bury their dead?

(Image: Westend61 GmbH/Alamy)
    Some people will tell you that Homo naledi was a small-brained hominin with some big thoughts. Two years ago, a team led by Lee Berger at the University of the Witwatersrand, South Africa, concluded that H. naledi – a species that lived around 335,000 to 245,000 years ago and had a brain about one-third the size of yours – invented a complex ritual that involved burying its dead in a deep and difficult-to-access cave chamber.  
    This idea didn’t go down well: all four of the anonymous researchers asked to assess its merit were sceptical. But Berger and his colleagues were undeterred. Earlier this year, they published an updated version of their study, offering a deeper dive into the evidence they had gathered from the Rising Star cave system in South Africa. The approach paid off: two of the original reviewers agreed to reassess the science – and one was won over. “You rarely see that in peer review,” says John Hawks at the University of Wisconsin-Madison, a member of Berger’s team.  
    Many other researchers, however, are still wary. “I’m just not convinced by any of it,” says Paul Pettitt at Durham University, UK. To appreciate why, it is necessary to explore how other ancient hominins interacted with the dead. Doing so can help us figure out which species carried out burials, how ancient the practice is and what it says about the minds and motivations of those doing it. Considering this also reveals why, if H. naledi really did bury its dead, that would fundamentally challenge our understanding of early hominin cognition and behaviour.  

There is one archaeological site that has much in common with Rising Star: Sima de los Huesos (the “pit of bones”) in northern Spain. There, researchers have uncovered the remains of 29 hominins, thought to belong to a species ancestral to Neanderthals, at the bottom of a vertical shaft within a cave. The consensus is that the Sima hominins, who lived between 430,000 and 300,000 years ago, died elsewhere and that their bodies were then dropped into the pit. If so, this represents the oldest clear evidence for some sort of funerary behaviour.
    Such an early date may seem surprising, but in context, it makes sense. We know that chimpanzees show an interest in dead group members, grooming their fur and even cleaning their teeth. “If we have chimpanzees behaving this way, then we might expect similar behaviour deep in our evolutionary past,” says Pettitt. However, the funerary behaviour on show at Sima appears more sophisticated than anything chimps do, says María Martinón-Torres at the National Human Evolution Research Centre (CENIEH) in Spain. “They have chosen a place to put the dead.” What’s more, the excavation also unearthed a stone hand axe, which is sometimes interpreted as a funerary offering – although it could simply have been in a pouch worn by one of the hominins found there, says Pettitt.  

Such elaborate treatment of the dead may have been evolutionarily beneficial. At some point in prehistory – perhaps when brains reached a certain size – hominins must have become aware of their own mortality, says Pettitt. In a 2023 paper, he suggested that complex funerary behaviour might then have arisen to mitigate personal anxiety about death by bringing the community together when a group member died. This scenario could explain what happened at Sima, given that the average brain size of these hominins was 1237 cubic centimetres – only about 100 cubic centimetres less than the modern human average.
The idea that members of Homo naledi buried their dead is contentious because their brains were so small (Image: Imago/Alamy)
    Others see something more sinister at Sima. Mary Stiner at the University of Arizona points out that many of the skeletons are from adolescents or young adults. “That’s an age group in which individuals choose to take risks and are more vulnerable due to low experience,” she says. Moreover, there are signs on the bones that some of the Sima hominins died violently. Stiner thinks the skeletons may represent youngsters who left their family group, strayed into hostile territory and came to a grisly end – their bodies tossed into the pit by their killers, perhaps to hide the evidence. But as Pettitt points out, that would require an unusually large number of adolescents making the same mistakes and meeting a similar fate.   
    For now, it is difficult to know exactly how to interpret the Sima site. Fortunately, more evidence may soon be available. Since 2021, Nohemi Sala at CENIEH and her colleagues have been exploring the archaeological record of funerary behaviour through a project known as DEATHREVOL. Sala says the research suggests that there are other similarly ancient sites in Europe that may preserve evidence of the same behaviour recorded at Sima – although she won’t name them until the work is published. “There are four or five candidates to explore these patterns,” she says. “It’s more than just Sima.”  
    Neanderthal burials
Eventually, hominins like those at Sima gave rise to the Neanderthals, who had different ways of treating the dead. Some of the clearest evidence for this comes from Shanidar cave, a site in northern Iraq where, since the mid-20th century, the remains of at least 10 Neanderthals have been discovered. The oldest dates back about 75,000 years, making it among the oldest known Neanderthal burials. Another set of remains was pivotal to the recognition, in the late 20th century, that Neanderthals shared our humanity, because pollen around this individual’s bones suggested that they had been buried with flowers. Today, while nobody doubts Neanderthals’ humanity, few archaeologists buy the “flower burial” idea. Recent excavations at Shanidar point to an alternative explanation for the pollen. Chris Hunt at Liverpool John Moores University, UK, and his colleagues think the body may have been placed in the ground and then buried under a pile of brushwood rather than dirt. They note that some of the pollen around the skeleton comes from plants with prominent spikes, possibly added to deter scavengers.
    Nevertheless, the Shanidar burials are revealing. One was of a man who managed to live with severe injuries to his face, shoulder and arm. Stiner is among several researchers who think he would have required help to do so, suggesting Neanderthals cared for and valued each other as individuals. If they did, then death wasn’t merely the loss of a pair of hands for sourcing food; it was the loss of someone with a unique personality who would be missed – leading to a new motivation behind funerary behaviour. “These societies were bound by love and affection,” says Stiner.  
    Five of the skeletons at Shanidar hint at something else. They were all buried in the same spot in the shadow of a prominent landmark – a 2-metre-high rock inside the cave – over the course of a few decades to a few millennia. Hunt and his colleagues think this might be a sign that Neanderthals tied meaning to landmarks in their environment. More speculatively, burying the dead here may even have played a role in legitimising the right of the living to the nearby land and its resources. Our species can have that sort of relationship with land, says Emma Pomeroy at the University of Cambridge, who was also involved in the recent excavations at Shanidar. “I think it’s very interesting to think about whether Neanderthals had a similar attitude to the landscape.”  
Shanidar cave in Iraq contains some of the oldest and most convincing Neanderthal burials (Image: Matyas Rehak/Alamy)
    Mysteries remain. A big one is why only a few of the Neanderthals who lived around Shanidar were buried in the cave. “If this was something that hominins did a lot, the caves would be chock-a-block with bodies,” says Hawks. Evidence from elsewhere indicates that other Neanderthal deaths may have been honoured with different funerary treatments, including ritual cannibalism – but for some as-yet-unfathomable reason, very few Neanderthals ended up interred in the cave. Another question is whether Neanderthals devised the idea of burial themselves or learned it from our species, Homo sapiens, whom they met around the time of the Shanidar burials.  
    What we do know is that our species began burying its dead around 120,000 to 100,000 years ago. And some early H. sapiens burials appear to differ from those of Neanderthals by the inclusion of grave goods. For instance, a body in Qafzeh cave in Israel appears to have been buried with red deer antlers clasped to its chest – although other interpretations are possible. “Perhaps the antler was used to dig the grave and it’s just a fortuitous association,” says Pettitt. We don’t know how common early grave goods were, in part because human burials were so rare before 28,000 years ago. Neither do we know their exact significance, although in later burials they are generally seen as reflecting things like the status and occupation of the deceased.  
    The graves of young children
    Rare though they are, early human burials reveal intriguing signs of a pattern. In 2021, a team including Martinón-Torres and Michael Petraglia, now at Griffith University, Australia, described an excavation at Panga ya Saidi cave in Kenya in which they had unearthed the 78,000-year-old burial of a toddler they named Mtoto. The researchers noted that Mtoto is the earliest of three potential H. sapiens burials in Africa, which date to between 78,000 and 68,000 years ago. All three involved young children.   
    Childhood mortality was probably relatively high in these early communities, says Petraglia. “We don’t have the evidence to say for sure, but we suspect so because childhood mortality is pretty high in hunter-gatherer societies.” Even so, some children’s deaths might have been “particularly painful”, says Martinón-Torres, motivating early communities to commemorate them with what was, at the time, an unusual funerary ritual: burial. Pettitt has explored this idea. He distinguishes “good deaths”, which usually occur in old age, from “bad deaths”, which occur unexpectedly and often involve children. The latter may have provided an impetus for people to perform special funerary rites, he suggests, which might help explain burials like Mtoto’s.  
    Another clue to the thinking of these Stone Age people comes from the fact that Panga ya Saidi cave was a place of human habitation on and off for thousands of years. This suggests a decision was made to inter Mtoto’s small body in close proximity to the community’s living space. “If you bury someone you love, in a way, you don’t want them to go,” says Martinón-Torres. Placing them in an easy-to-visit location may help maintain a close connection, she adds.  

    So, what does all this tell us about whether H. naledi buried its dead?   
    There are certainly echoes of other sites in Rising Star. The idea that, hundreds of thousands of years ago, hominins placed their dead deep inside a cave draws parallels with Sima de los Huesos. The suggestion that H. naledi repeatedly returned to the same site to inter bodies seems to mirror the situation at Shanidar cave. And the discovery of a crescent-shaped stone near the fossilised hand of one H. naledi skeleton – a possible grave good – looks like behaviour seen at sites like Qafzeh.  
    But the burial hypothesis also seems all wrong. The biggest stumbling block is the size of H. naledi’s brain, which, at an average of 513 cubic centimetres, was tiny. For a start, it raises doubts about whether individuals really were aware of their own mortality, inventing elaborate funerary rituals to come to terms with this revelation. There is also no evidence yet that the species cared for its sick, a potential sign that group members were valued as individuals whose deaths were mourned. And although youngsters are overrepresented in the Rising Star cave – potentially consistent with Pettitt’s “bad death” idea – the chamber in which the bones were found doesn’t seem to be an easy-to-visit location that would allow the living to maintain a connection with the dead. “It’s quite anomalous, but also fascinating,” says Stiner.  
Red deer antlers in a grave at Qafzeh, Israel, may have a symbolic meaning (Image: Universal History Archive/Shutterstock)
    There are two ways to interpret this puzzle. One is to look for non-burial scenarios that could explain the accumulation of the H. naledi skeletons. For instance, in 2021, researchers reported finding the remains of 30 baboons, nine of them mummified, in a cave chamber in South Africa. It seems that the primates had used the cave as a sleeping site over many years, with some occasionally dying there and their bodies gradually accumulating. Perhaps H. naledi used Rising Star in a similar way. “We need to consider whether that might be a factor,” says Pomeroy.   
    The other, more radical, option is to ask whether our understanding of how and why hominins developed funerary traditions requires a rethink. “Spirituality, the idea of self-awareness and mortality – all could have arisen many times independently,” says Berger. Hawks points out that analysis of H. naledi skeletons suggests that, like us, they had a long childhood – and that could be the key. “Extended childhoods have an adaptive purpose: they enable kids to integrate into social groups in a way that isn’t sexually competitive,” he says. They may also have encouraged members of H. naledi to develop funerary customs to help their youngsters understand the death of group members. “We have funerals to explain to kids what just happened,” says Hawks.   
    Unfortunately, gathering evidence to confirm the burial idea is more difficult than it might seem. Talk of burial may conjure up images of modern cemeteries, but Stone Age graves aren’t like that. “They’re not 6 feet under in well-constructed holes,” says Hawks: the oldest burial pits were usually shallow depressions in the floor. If hominins then returned to inter more dead, they could easily disturb earlier graves and create a jumble of bones that is difficult to interpret as a set of burials.   

    The good news, say Berger, Hawks and their colleagues, is that there is plenty more untouched material at Rising Star, which could, in the future, strengthen their burial hypothesis. If they can do that, they may well find a surprisingly receptive audience. As we have seen, ancient burials are open to interpretation, conclusions are provisional and many of the archaeologists working on these sites would like nothing more than new discoveries that challenge their ideas about the prehistory of funerary behaviour.   
    “It’s sometimes suggested that the scientific community just doesn’t want to believe that a small-brained hominin would be capable of symbolic treatment of the dead,” says Pomeroy. “That couldn’t be further from the truth. We’d be so excited – if there was good evidence.” 

Topics: human evolution/ancient humans


    Neanderthal groups had their own local food culture

An illustration of a Neanderthal group preparing food (Image: Luis Montanya/Marta Montanya/Science Photo Library)
    Neanderthals may have had traditional ways of preparing food that were particular to each group. Discoveries from two caves in what is now northern Israel suggest that the residents there butchered the same kinds of prey in their own distinctive ways.
Modern humans, or Homo sapiens, weren’t the first hominins to prepare and cook food. There is evidence that Neanderthals, for example, who inhabited Europe and Asia until about 40,000 years ago, used flint knives to butcher what they caught, cooked a wide range of animals and spiced up their menu with wild herbs.

    To learn more about Neanderthal food culture, Anaëlle Jallon at the Hebrew University of Jerusalem and her colleagues examined evidence at the caves of Amud and Kebara in northern Israel.
These sites, only about 70 kilometres apart, provide a unique opportunity to examine local cultural differences. Stone tools, food remains and hearths found at each site reveal that Neanderthals occupied both caves, probably during winters, during the same time period.
    “You find the same species of animals to hunt and it’s more or less the same landscape,” says Jallon. “It will be the same kind of weather, and Neanderthals at both ate mostly gazelles and some fallow deer that they complemented with a few bigger animals like boar or aurochs.”

There are a few differences, though. For example, bones reveal that more large prey was hunted at Kebara, and more kills were carried back to that cave to be butchered.
    Jallon and her colleagues used microscopes to inspect bones from layers of sediment at the two sites from between 50,000 and 60,000 years ago, examining the cuts slashed in them with stone tools.
    They found that even though the flint tools used were similar at both sites, the patterns of cuts were different. “The cuts tend to be more variable in their width and depth in Kebara, and in Amud they are more concentrated in big clusters and they overlap each other more often,” says Jallon.
To assess whether the differences could be down to butchering different prey, the researchers also looked specifically at long bones from gazelles found at both sites. The same differences appeared.
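To make the contrast concrete, here is a minimal sketch of the kind of variability comparison such a study might run. The measurements are invented and the coefficient of variation is my illustrative choice of statistic, not necessarily the one the team used.

```python
# Toy comparison of cut-mark variability between two sites; all numbers are
# invented for illustration.
import statistics

def cv(values):
    # Coefficient of variation: spread relative to the mean.
    return statistics.stdev(values) / statistics.mean(values)

# Hypothetical cut-mark widths (mm) measured under the microscope.
kebara_widths = [0.21, 0.45, 0.30, 0.62, 0.18, 0.51]
amud_widths = [0.33, 0.35, 0.31, 0.36, 0.34, 0.32]

print(f"Kebara: {cv(kebara_widths):.2f}")  # higher value: more variable cuts
print(f"Amud:   {cv(amud_widths):.2f}")    # lower value: more uniform cuts
```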
    “We are talking about two groups who live very close and, let’s say, both cutting up some beef – but in one site they seem to be cutting closer to the bone, getting all the meat off,” says Ceren Kabukcu at the University of Liverpool, UK.
    Previous research that looked at cut marks on bones from more recent societies suggests that the kind of variation seen in Neanderthal butchery isn’t down to a lack of expertise, but to a difference in technique.

    Jallon thinks the contrast is best explained by deliberate butchery choices. It could be that Neanderthals at Amud made their meat harder to process by, for example, drying it or letting it hang before cooking, she says, which would have meant they needed more cuts to get through it or a larger team of people to butcher the meat.
    “In behaviour that is as opportunistic as butchering, you would expect to find the most efficient way to butcher something to get the most out of it, but apparently, it was more determined by social or cultural factors,” says Jallon. “It could be due to group organisation or practices that are learned and transmitted from generation to generation.”
    “The fact that there might be differences and some nuance on how technology is used in daily life is not entirely shocking,” says Kabukcu. “I think as this question is investigated, we might see more and more nuance at several sites of the Middle Palaeolithic.”
    It isn’t known whether the caves were occupied at the same time or if disparate groups might have been in contact with each other. “It is a possibility that it was at the same exact time, but it’s also possible it was hundreds of years apart or more. We don’t have the resolution to know that,” says Jallon.
But she also notes that the pattern of very clustered cut marks found at Amud is similar in the oldest and younger layers, so the cave might have been used by returning groups that maintained the same butchery traditions for centuries.

Topics: Neanderthals/ancient humans


    Provocative new book says we must persuade people to have more babies

A large population may enable innovation and economies of scale (Image: Philippe Montigny/iStockphoto/Getty Images)
After the Spike by Dean Spears and Michael Geruso (Bodley Head (UK); Simon & Schuster (US))
Four-fifths of all the humans who will ever be born may already have been born. The number of children being born worldwide each year peaked at 146 million in 2012 and has been falling overall ever since. This means that the world’s population will peak and start to fall around the 2080s.
    This fall won’t be gradual. With birth rates already well below replacement levels in many countries including China and India, the world’s population will plummet as fast as it rose. In three centuries, there could be fewer than 2 billion people on Earth, claims a controversial new book.
    “No future is more likely than that people worldwide choose to have too few children to replace their own generation. Over the long run, this would cause exponential population decline,” write economists Dean Spears and Michael Geruso in After the Spike: The risks of global depopulation and the case for people.
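To see why sub-replacement fertility compounds the way the authors describe, consider a back-of-the-envelope sketch. The peak population, fertility rate and generation length below are my illustrative assumptions, not figures from the book, and mortality and age structure are ignored.

```python
# Each generation, population scales by roughly (fertility / replacement).
PEAK_POPULATION = 10e9   # assumed peak, in people
FERTILITY = 1.6          # assumed long-run children per woman
REPLACEMENT = 2.1        # approximate replacement-level fertility
GENERATION_YEARS = 30    # assumed generation length

population = PEAK_POPULATION
for years in range(0, 301, GENERATION_YEARS):
    print(f"year +{years:3d}: {population / 1e9:.2f} billion")
    population *= FERTILITY / REPLACEMENT  # exponential decline, step by step
```

Run over three centuries, this toy model ends well under 2 billion, which is the flavour of the book’s claim.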
    This, you might think, could be a good thing. Won’t it help solve many environmental issues facing us today? No, say the authors. Take climate change: their argument isn’t that population size doesn’t matter, but that it changes so slowly that other factors such as how fast the world decarbonises matter far more. The window of opportunity for lowering carbon dioxide emissions by reducing population has largely passed, they write.
    Spears and Geruso also make the case that there are many benefits to having a large population. For instance, there is more innovation, and economies of scale make the manufacture of things like smartphones feasible. “We get to have nice phones only because we have a lot of neighbors on this planet,” they write.
    So, in their view, our aim should be to stabilise world population rather than letting it plummet. The problem is we don’t know how, even with the right political will.

As we grow richer, we are more reluctant to abandon career and leisure opportunities to have children

    While some government policies have had short-term effects, no country has successfully changed long-term population trends, argue the authors. Take China’s one-child policy. It is widely assumed to have helped reduce population growth – but did it? Spears and Geruso show unlabelled graphs of the populations of China and its neighbours before, during and after the policy was in place, and ask the reader which is China. There is no obvious difference.
    Attempts to boost falling fertility rates have been no more successful, they say. Birth rates jumped after Romania banned abortion in 1966, but they soon started to fall again. Sweden has tried the carrot rather than the stick by heavily subsidising day care. But the fertility rate there has been falling even further below the replacement rate.
    All attempts to boost fertility by providing financial incentives are likely to fail, Spears and Geruso argue. While people might say they are having fewer children because they cannot afford larger families, the global pattern is, in fact, that as people become richer they have fewer children.
    Rather than affordability being the issue, it is more about people deciding that they have better things to do, the authors say. As we grow richer, we are more reluctant to abandon career and leisure opportunities to have children. Even technological advances are unlikely to reverse this, they say.
    On everything other than the difficulty of stabilising the population, this is a relentlessly optimistic book. For instance, say the authors, dire predictions of mass starvation as the world’s population grew have been shown to be completely wrong. The long-term trend of people living longer and healthier lives can continue, they suggest. “Fears of a depleted, overpopulated future are out of date,” they write.
    Really? Spears and Geruso also stress that the price of food is key to determining how many go hungry, but fail to point out that food prices are now climbing, with climate change an increasing factor. I’m not so sure things are going to keep getting better for most people.
    This book is also very much a polemic: with Spears and Geruso labouring their main points, it wasn’t an enjoyable read. That said, if you think that the world’s population isn’t going to fall, or that it will be easy to halt its fall, or that a falling population is a good thing, you really should read it.




    Evolution has made humans both Machiavellian and born socialists

    David Oliete
    Nearly 2 million years ago, one of our hominin ancestors developed bone cancer in their foot. The fate of this individual is unknown, but their fossilised remains leave no doubt that cancer has been a part of our story for a very long time. It is also clear that, when threatened by our own normally cooperative cells turning against us, we evolved an immune system to help us identify and deal with the enemy within.
    But treacherous cancer cells weren’t the only internal threat our ancestors faced. As hypersocial beings, their survival was also jeopardised by selfish individuals attempting to subvert the group – and capable of unravelling society, just as a cancer eventually kills its host. I am interested in understanding how we adapted to this threat. At the heart of the story is this question: is human nature selfish or altruistic, competitive or cooperative? Are we essentially cancers, tamed by culture, or more like healthy cells in the human body, working together for mutual success?
    People have debated this for centuries and continue to do so, citing research in primatology, anthropology, psychology and economics to defend their points. The answer has profound implications for how we aim to structure society. If we are born altruists, then institutions and hierarchies are unnecessary. But if selfishness prevails, strong control is essential. To me, both extremes are unconvincing. To understand why, we must appreciate the circumstances under which humanity has evolved. Determining how our ancestors confronted brutish selfishness doesn’t just tell us about our own social past – it can also help us inoculate today’s societies against the threat from within.

    Look at the animals to which we are most closely related and you see that each species has its own distinct set of social structures. Among gorillas, for example, groups of unmated males are typically led by aggressive alpha males. Mated gorillas sometimes live in groups of males and females, but more often it is the stereotypical silverback with a harem of females – a group that has its own hierarchy. Chimpanzees and bonobos also display dominance hierarchies, with a lot of emphasis placed on female social rank, particularly among bonobos. Despite the wide variation in sociality among these species, one thing is consistent: where there is social rank, aggressive dominance is the winning attribute. If an alpha can successfully defend resources, whether territory, food or mates, it can dominate a primate society. Access to more resources translates into having more surviving offspring than others, which is the only measure of success for evolution by natural selection.
    Human self-domestication
    Among our own ancestors – members of the genus Homo – the story is different. Research in anthropology and primatology suggests that, as early people evolved more complex social structures, they did something unseen in other primates: they domesticated themselves. Once they had the cognitive sophistication necessary to create weapons, along with the intelligence required to form alliances, they could fight the large, angry dominants that ruled over their social groups. The primatologist Richard Wrangham at Harvard University argues that this profoundly shaped human society because, along with eliminating the alphas, our ancestors also selected against the human trait of aggression. As a result, humans became more cooperative, and their societies became more egalitarian.

    But if that is the case, how do we explain the undeniable and growing inequality in today’s societies, where huge amounts of power and money are concentrated among a small number of people, with the problem particularly pronounced in major economies such as the US, the UK and China? Some researchers argue that humans are not egalitarian by nature, but that living in small, close-knit groups of hunter-gatherers – as people did before the dawn of agriculture – suppressed our tendencies to form dominance-based hierarchies. They see a U-shaped curve of human egalitarianism. The point we started from – which looked a lot like the social structures we see in other great apes – is where we have ended up again, with the middle of the U showing a brief flirtation with social equality.
If human nature were entirely cooperative then state control wouldn’t be required to prevent freeloading (Image: Zoonar GmbH/Alamy)
    I agree that we aren’t naturally egalitarian. In fact, I am not convinced that human societies were ever egalitarian. As anthropologists point out, even living hunter-gatherers have some brutal practices, such as infanticide. But, for me, the explanation for our current unequal circumstances lies not in our ancestors having selected against aggression, but in how the elimination of despotic alpha males allowed other, arguably more insidious people to succeed.
    Once humanity was free of the strong grip of strict dominance hierarchies led by alpha males, success in social groups would have become more about skilful manoeuvring within communities. This led to the rise of a particular kind of social intelligence called Machiavellian intelligence, which entails the cunning manipulation of others. In the language of evolutionary biology, we have a cooperation dilemma: there are situations where it is in our interest to work with others, and there are situations where it is not. And, as anyone who has watched an episode of The Traitors will be aware, the need to pull together and the need to betray can come into conflict. As a result, overt rivalry was superseded by what I call “invisible rivalry” – the ability to hide selfish, competitive or exploitative intentions while maintaining the appearance of a cooperative nature. In other words, we evolved to compete in a cooperative world.
    The social brain
    Support for this idea comes from the size of the human brain. All primates have large brains relative to their body size, and ours is exceptionally big. The social brain hypothesis suggests that these large brains evolved to help individuals manage their unusually complex social systems. Of course, cooperation is part of this, but it can’t be the whole story. Consider ants, which, in terms of numbers and pervasiveness, are probably the most successful group of species on Earth. They are eusocial, which means they cooperate so fully that they seem to act as a single organism. Yet their brains are tiny, and everything they need to work together is programmed within them genetically. So, you don’t necessarily need a big brain to cooperate – but you do need one to compete strategically. Indeed, research suggests that social competition is what best explains the evolution of our enormous brain compared with the big brains of other primates.
    To paraphrase Aristotle, we are political animals – not merely social ones. We strategise within our societies to maximise our success, whether that is defined in terms of money, power, mating success, hunting prowess or any of the other qualities associated with prestige around the world. To do so effectively, we evolved to not just be smart enough to cooperate, but to cooperate selectively – and to betray others when it suits us, or even just when we can get away with it.

    Studies by economists and psychologists illustrate this. For example, in one set of experiments, participants were paired in a cooperation game in which one person was given $10 and the choice to share it with the other (or not). A lot of research shows that in these circumstances, people generally give some money to their partner, often splitting the pot equally, even when there is no obvious punishment for betraying them. But this time, the researchers gave some participants another option: they could take less money and leave the game without their partner ever knowing that they had been involved in a cooperation game. About one-third of participants took this option. It was as if they were happy to pay to have their betrayal left unknown.
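For concreteness, here is a rough payoff sketch of that exit option. The $10 endowment is from the study as described above; the exact exit amount is an assumption for illustration.

```python
# Payoffs facing the person holding the money; the exit figure is assumed.
ENDOWMENT = 10.0

keep_all = ENDOWMENT         # keep the full $10, but the partner sees the choice
fair_split = ENDOWMENT / 2   # $5 each, the common choice in the standard game
exit_quietly = 9.0           # assumed: leave with less; the partner never knows

# A pure money-maximiser never exits (9 < 10). Choosing to exit anyway means
# paying about $1 to keep a selfish choice invisible, which roughly a third
# of participants did.
print(keep_all, fair_split, exit_quietly)
```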
    Experiments like this tell us a lot about the human psyche. In social interactions, we often need to be able to predict what others around us are going to do – to learn where to place trust effectively, to win trust when we need it and to hide betrayals of trust on our own part. These abilities require empathy, emotion, language and, perhaps, as some of my colleagues argue, consciousness. Yet those same abilities, and that same intelligence, have a dangerous downside. Our evolved proclivity for maximising resources leads us to exploit those around us – and some people are so effective at deception that they risk damaging their societies. Modern, extreme inequality is an example of this process in action. So too are past political upheavals leading to degradation of the rule of law – and sometimes the fall of civilisations. The Roman Republic, for example, collapsed because of a tremendous internal struggle for power, culminating in Julius Caesar’s Machiavellian machinations, eventually leading to autocracy.
Religion is one institution that societies use to promote cooperation (Image: Adam Guz/Getty Images Poland/Getty Images)
    So, our dual cooperative-competitive nature means that we face an enemy within that may bring down society. And this is where the analogy with cancer arises. Humanity’s long history of living with the disease means we have evolved biological mechanisms to reduce the risk it poses. Many reactions at the cellular level, including attacks by immune cells and programmed cell death, evolved to help our bodies fight off cancers, as well as other, external threats to our survival. It is this stalwart immune system that explains why, although mutations occur all the time and we are frequently exposed to viruses and bacteria, these often don’t lead to symptoms or illness. Similarly, the threats to our social groups posed by the evolution of invisible rivalry led us to develop practices, behaviours and institutions to maximise cooperation and thwart our Machiavellian tendencies. In my new book, Invisible Rivals: How we evolved to compete in a cooperative world, I call this our cultural immune system.
    Religion is one institution that can function in this way. Religious teaching can promote cooperation among everyone who practises it – and this is one possible reason that the Golden Rule, often summed up as “treat others as you would like to be treated”, is found independently in scriptures across the world. People who believe these scriptures – who internalise them, as anthropologists say – are more likely to help fellow members of their group.
    Everywhere anthropologists look, they find other practices and institutions that bolster cooperation at the local level. In cultures that rely on fishing, there are strong norms against over-fishing, which would deplete the stock for everyone. Where people are dependent on hunting, there are strict rules about how meat is shared and who gets credit for a kill. The Maasai people of Kenya and Tanzania have a relationship framework called osotua, rooted in need-based sharing partnerships and relying on mutual help in hard times. For example, if someone needs cattle because theirs have all died, another member of the group will help, not because they get anything directly in return, but simply because their neighbour’s needs are greater at that time. This creates a special bond – osotua translates as “umbilical cord” – and treachery is rare because the bond is seen as sacred.
The Maasai people have a system called osotua whereby they give cattle to others in need (Image: Siegfried Modola/Getty Images)
Across the world, social norms that guide behaviours have evolved, and they have been refined over thousands of years of trial and error through cultural evolution. However, just as cancers find ways to evade our immune systems, so some individuals use their Machiavellian intelligence to subvert the group’s social norms for their own benefit. This is trickier to do in small-scale societies where people all know each other, making rule-breakers easier to detect and punish. But as societies grew over the past 10,000 years, so did opportunities to act selfishly. Agricultural networks, cities and, finally, nation states made deception much easier to pull off: in a group where it is impossible to know everyone personally, it is easy to cheat many people without getting caught.
    Taming our Machiavellian nature
    It is this lethal combination of opportunity and invisible rivalry that makes the question of whether humans are cooperative or competitive so relevant today. To fight the enemy within society, we need to understand that both traits are in our nature, and that we evolved to apply whichever suits us best. Thinking that we are either one or the other leaves us vulnerable to facile arguments about how we should structure society. If we are purely selfish, it follows that society should focus on heavy policing and punishment of freeloaders, including those in power. But believing that we are intrinsically altruistic is equally detrimental because it risks ignoring the threat posed by rampant self-interest.
    Suppressing humanity’s Machiavellian side is certainly harder in large-scale societies. But there are basic ways that we can boost the cultural immune system, much like how we can improve our biological immune systems through healthy lifestyles and vaccination. The key, I believe, is to learn more about the social norms that small-scale societies have evolved to help them thrive and stave off opportunistic cheaters and then use this knowledge to create policies that promote cooperation at a higher level. For example, within our own communities, we can look to cultures that promote systems like need-based transfers and others that have found ways to share resources more equitably.

    But this isn’t going to happen until we first recognise the problem that invisible rivalry poses. In my view, the best way to do that is through education. We are all part of the cultural immune system. If we understand our evolutionary heritage, we will be alert to the danger that freeloaders pose to society and place our trust more discerningly – much as the body’s defence system learns to recognise the agents associated with cancers and other diseases to deal with them. Crucially, we also need to recognise that cooperation is best for everyone in the long term.
    A small proportion of people at the most competitive end of the spectrum will always try to game society. We must work together to stay one step ahead of humanity’s opportunistic nature. Without beliefs, norms and a proper understanding of human nature, we are at the mercy of our selfish biological heritage. Evolution has made us this way, but we can learn to overcome it.

Topics: psychology/human evolution


    70,000 years ago humans underwent a major shift – that’s why we exist

Ancient humans adapted to deeper forests as they migrated out of Africa and away from the savannah (Image: Lionel Bret/Eurelios/Science Photo Library)
    This is an extract from Our Human Story, our newsletter about the revolution in archaeology. Sign up to receive it in your inbox every month.
    Humans come from Africa. This wasn’t always obvious, but today it seems as close to certain as anything about our origins.
    There are two senses in which this is true. The oldest known hominins, creatures more closely related to us than to great apes, are all from Africa, going back as far as 7 million years ago. And the oldest known examples of our species, Homo sapiens, are also from Africa.
    It’s the second story I’m focusing on here, the origin of modern humans in Africa and their subsequent expansion all around the world. With the advent of DNA sequencing in the second half of the 20th century, it became possible to compare the DNA of people from different populations. This revealed that African peoples have the most variety in their genomes, while all non-African peoples are relatively similar at the genetic level (no matter how superficially different we might appear in terms of skin colour and so forth).
    In genetic terms, this is what we might call a dead giveaway. It tells us that Africa was our homeland and that it was populated by a diverse group of people – and that everyone who isn’t African is descended from a small subset of the peoples, who left this homeland to wander the globe. Geneticists were confident about this as early as 1995, and the evidence has only accumulated since.
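The logic behind that “dead giveaway” is the founder effect: a small emigrating group can carry only a sample of its source population’s variation. A toy simulation, with entirely invented numbers, shows the pattern:

```python
# Founder-effect sketch: emigrants carry less variation than the homeland.
import random

random.seed(1)

# Hypothetical homeland population: 10,000 people, each carrying one of 200
# possible variants at a single genetic marker.
source = [random.randrange(200) for _ in range(10_000)]

# A small founding group leaves to populate the rest of the world.
founders = random.sample(source, 50)

print(len(set(source)))    # ~200: nearly every variant is present at home
print(len(set(founders)))  # far fewer: the emigrants carry only a sample
```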
    And yet, the physical archaeology and the genetics don’t match – at least, not on the face of it.

    Genetics tells us that all living non-African peoples are descended from a small group that left the continent around 50,000 years ago. Barring some wobbles about the exact date, that has been clear for two decades. But archaeologists can point to a great many instances of modern humans living outside Africa much earlier than that.
    At Apidima cave in Greece, there is a single skull of a modern human from 210,000 years ago. A jawbone from Misliya cave in Israel is at least 177,000 years old. There are some contentious remains from China that might be modern humans. “And there are debates swirling around the earliest colonisation of Australia,” says Eleanor Scerri at the Max Planck Institute of Geoanthropology in Germany. Some researchers claim people were on the continent 65,000 years ago.
    What is going on? Is our wealth of genetic data somehow misleading us? Or is it true that we are all descended from that last big migration – and the older bones represent populations that didn’t survive?
    Scerri and her colleagues have tried to find an explanation.
    African environments
    The team was discussing where modern humans lived in Africa. “Were humans simply moving into contiguous regions of African grasslands, or were they living in very different environments?” says Scerri.
    To answer that, they needed a lot of data.
    “We started with looking at all of the archaeological sites in Africa that date to 120,000 years ago to 14,000 years ago,” says Emily Yuko Hallett at Loyola University Chicago in Illinois. She and her colleagues built a database of sites and then determined the climates at specific places and times: “It was going through hundreds and hundreds of archaeological site reports and publications.”

    There was an obvious shift around 70,000 years ago. “Even if you just look at the data without any fancy modelling, you do see that there is this change in the conditions,” says Andrea Manica at the University of Cambridge, UK. The range of temperatures and rainfalls where humans were living expanded significantly. “They start getting into the deeper forests, the drier deserts.”
    However, it wasn’t enough to just eyeball the data. The archaeological record is incomplete, and biased in many ways.
    “In some areas, you have no sites,” says Michela Leonardi at the Natural History Museum in London – but that could be because nothing has been preserved, not because humans were absent. “And for more recent periods, you have more data just because it’s more recent, so it’s easier for it to be conserved.”
    Leonardi had developed a statistical modelling technique that could determine whether animals had changed their environmental niche: that is, whether they had started living under different climatic conditions or in a different type of habitat like a rainforest instead of a grassland. The team figured that applying this to the human archaeological record would be a two-week job, says Leonardi. “That was five and a half years ago.”
    However, the statistics eventually did confirm what they initially saw: about 70,000 years ago, modern humans in Africa started living in a much wider range of environments. The team published their results on 18 June.
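For a flavour of what a bias-aware niche comparison involves, here is a minimal sketch. It is not the team’s model: the climate values are invented, and the bounding-box breadth measure and equal-size resampling are my illustrative choices.

```python
# Compare the spread of site climates before and after 70,000 years ago,
# resampling equal numbers of sites so that the later, better-preserved
# record doesn't win simply by having more data points.
import random

random.seed(0)

def breadth(sites):
    # Bounding-box "niche breadth" in (temperature, rainfall) space.
    temps = [t for t, _ in sites]
    rains = [r for _, r in sites]
    return (max(temps) - min(temps)) * (max(rains) - min(rains))

def resampled_breadth(sites, n=30, trials=1000):
    # Average breadth over random n-site subsets, controlling for sample size.
    return sum(breadth(random.sample(sites, n)) for _ in range(trials)) / trials

# Hypothetical (temperature degC, rainfall mm/yr) pairs for dated sites.
before_70k = [(random.gauss(24, 2), random.gauss(700, 100)) for _ in range(60)]
after_70k = [(random.gauss(22, 5), random.gauss(800, 350)) for _ in range(200)]

print(resampled_breadth(before_70k))  # narrower climate envelope
print(resampled_breadth(after_70k))   # wider envelope: an expanded niche
```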
    Jacks of all trades
    “What we’re seeing at 70,000 [years ago] is almost kind of our species becoming the ultimate generalist,” says Manica. From this time forwards, modern humans moved into an ever-greater range of habitats.
    It would be easy to misunderstand this. The team absolutely isn’t saying that earlier H. sapiens weren’t adaptable. On the contrary: one of the things that has emerged from the study of extinct hominins is that the lineage that led to us became increasingly adaptable as time went on.
    “People are in different environments from an early stage,” says Scerri. “We know they’re in mangrove forests, they’re in rainforest, they’re in the edges of deserts. They’re going up into highland regions in places like Ethiopia.”
    This adaptability seems to be how early Homo survived environmental changes in Africa, while our Paranthropus cousins didn’t: Paranthropus was too committed to a particular lifestyle and was unable to change.

    Instead, what seems to have happened in our species 70,000 years ago is that this existing adaptability was turned up to 11.
    Some of this isn’t obvious until you consider just how diverse habitats are. “People have an understanding that there’s one type of desert, one type of rainforest,” says Scerri. “There aren’t. There are many different types. There’s lowland rainforest, montane rainforest, swamp forest, seasonally inundated forest.” The same kind of range is seen in deserts.
    Earlier H. sapiens groups were “not exploiting the full range of potential habitats available to them”, says Scerri. “Suddenly, we see the beginnings of that around 70,000 years ago, where they’re exploiting more types of woodland, more types of rainforest.”
    This success story struck me, because recently I’ve been thinking about the opposite.

    Splendid isolation
    Last week, I published a story about local human extinctions: groups of H. sapiens that seem to have died out without leaving any trace in modern populations. I focused on some of the first modern humans to enter Europe after leaving Africa, who seem to have struggled with the cold climate and unfamiliar habitats, and ultimately succumbed. These lost groups fascinated me: why did they fail, when another group that entered Europe just a few thousand years later succeeded so enormously?
    The finding that humans in Africa expanded their niche from 70,000 years ago seems to offer a partial explanation. If these later groups were more adaptable, that would have given them a better chance of coping with the unfamiliar habitats of northern Europe – and for that matter, South-East Asia, Australia and the Americas, where their descendants would ultimately travel.
    One quick note of caution: this doesn’t mean that from 70,000 years ago, human populations were indestructible. “It’s not like all humans suddenly developed into some massive success stories,” says Scerri. “Many of these populations died out, within and beyond Africa.”
And like all the best findings, the study raises as many questions as it answers. In particular: how and why did modern humans become more adaptable 70,000 years ago?
    Manica points out that we can also see a shift in the shapes of our skeletons. Older fossils classed as H. sapiens don’t have all the features we associate with humans today, just some of them. “From 70,000 [years ago] onwards, roughly speaking, suddenly you see all these traits present as a package,” he says.

    Manica suggests that the expansion into new niches may have enabled this, by bringing previously separate populations into more regular contact. For instance, if two populations were separated by a desert, they would never have met, never exchanged ideas and genes – until someone figured out how to live in the desert.
    “There might also be almost a positive feedback,” says Manica. “You connect a bit more, you become more flexible… You break down some of those barriers, you become even more connected.”
    With apologies, here is a pat conclusion. In that story about lost populations, I said that one of the biggest threats to human groups is isolation: if you don’t have neighbours you can call on and your group is small, even a minor misfortune can mean apocalypse. If Manica is right, the exact opposite played out in Africa. Populations grew and became more connected, and that enabled an explosion of creativity that sent our species all over the planet.
    In which case, the reason the last out-of-Africa migration succeeded so wildly is: people need people. Without other people, we’re stupid and doomed. Any doomsday preppers hoping to ride out the apocalypse alone in a well-provisioned bunker: you may have the wrong approach.



Researchers re-enact a 30,000-year-old sea voyage

    Archaeological evidence shows that 30,000 years ago, Palaeolithic people travelled from the island now known as Taiwan to the southern islands of Japan. This voyage would have included crossing the Kuroshio, one of the world’s strongest ocean currents.
    Yousuke Kaifu at The University Museum of the University of Tokyo wanted to put this journey to the test, so his team built a dugout canoe using tools available to people at the time and set out from Taiwan. The journey spanned 225 kilometres and took the crew 45 hours before they reached Yonaguni Island. This trip came after previous failed attempts that used rafts made of reeds and bamboo.
    The success of the voyage gives some insight into how Palaeolithic people might have made the treacherous crossing.

Topics: archaeology


    ‘Hybrid’ skull may have been a child of Neanderthal and Homo sapiens

The cranium of a girl thought to be the offspring of Neanderthal and Homo sapiens parents (Image: Israel Hershkovitz)
    A 140,000-year-old hominin skull from Israel probably belonged to a hybrid child of Neanderthal and Homo sapiens parents, according to an analysis of its anatomy. The 5-year-old girl was buried within the earliest known cemetery, possibly reshaping what we know about the first organised burials and the humans behind them.
    The skull was originally unearthed from Skhul Cave on Mount Carmel in 1929. In total, these early excavations uncovered seven adults, three children and an assortment of bones that belonged to 16 hominins – all later assigned to Homo sapiens.

    The classification of the child’s skull, however, has been contested for nearly a century, partly because the jaw looks dissimilar to typical Homo sapiens mandibles. Original work hypothesised that it belonged to a transitional hominin called Paleoanthropus palestinensis, but later research concluded it most likely belonged to Homo sapiens.
    Anne Dambricourt Malassé at the Institute of Human Paleontology in France and her colleagues have now used CT scanning on the skull and compared it with other known Neanderthal children.
    “This study is maybe the first that has put the Skhul child’s remains on a scientific basis,” says John Hawks at the University of Wisconsin-Madison, who wasn’t involved in the new research. “The old reconstruction and associated work, literally set in plaster, did not really enable anyone to compare this child with a broader array of recent children to understand its biology.”

    Malassé and her colleagues found the mandible had distinct Neanderthal characteristics, while the rest of the skull was anatomically consistent with Homo sapiens. They conclude that this combination of features suggests that the child was a hybrid whose parents were different species.
    “I have long thought that hybridisations were not viable and I continue to think that they were mostly abortive,” says Malassé. “This skeleton reveals that they were nevertheless possible, even though this little girl lived only 5 years.”
    While the new work significantly advances our understanding of the important Skhul child skull, we can’t definitively identify the child as a hybrid without extracting its DNA, which researchers have not been able to do, says Hawks. “Human populations are variable and there can be a lot of variability in their appearance and physical form even without mixing with ancient groups like Neanderthals,” he says.
    We know from analyses of ancient and modern genomes that Homo sapiens and Neanderthals have swapped genes many times during the past 200,000 years. In 2018, a 90,000-year-old bone fragment found in Russia was identified as a hybrid between Neanderthals and Denisovans, another ancient hominin, using DNA analysis.

The Levant may have been a particularly important region for mixing among hominin species, due to its position between Africa, Asia and Europe. The region has been characterised as a “central bus station” for humans living in the Pleistocene, says Dany Coutinho Nogueira at the University of Coimbra in Portugal.
The new study forces us to call into question the attribution of the earliest grave site to Homo sapiens, says Malassé. This ritualised behaviour may have come from Neanderthals, Homo sapiens or interactions between the two.
    “We do not know who buried this child, whether this place chosen to bury the corpse was that of a single community, or whether communities from different lineages, but which coexisted and established contacts or even unions, shared rites and emotions,” says Malassé.

Topics: Neanderthals/ancient humans


    Prehistoric Spanish people transported 2-tonne stone by boat

The Matarrubilla stone at Valencina in Spain was transported more than 5300 years ago (Image: L. García Sanjuán)
    A 2-tonne megalith in southern Spain was transported to its present location by a hitherto-unknown group of ancient seafarers over 5300 years ago.
The Matarrubilla stone is a solid slab of gypsum about 1.7 metres long by 1.2 metres wide, sitting within a tomb-like structure at the Copper Age site of Valencina, near Seville.

    It is located within a circular chamber called a tholos, with just enough room to stand around it. Given its unique composition and size, it is thought that the stone was used in rituals, but its provenance has been a mystery until now.
    Luis Cáceres Puro at the University of Huelva in Spain and his colleagues performed chemical analysis on the slab and optically stimulated luminescence dating – which approximates the last time light struck sediments – on the soil beneath it to better determine its age and site of origin.
    The results suggest the megalith was dragged to its current location between 4544 and 3277 BC, which is hundreds of years – possibly even 1000 years – earlier than previously thought. The new dates also suggest the rock was moved to Valencina long before the tunnel structure was built around it.
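Optically stimulated luminescence dating rests on a simple age equation: the radiation dose that buried grains have absorbed since they were last exposed to light, divided by the local dose rate, gives the burial age. A worked toy example, with invented numbers rather than the study’s measurements:

```python
# OSL age = equivalent dose / environmental dose rate.
equivalent_dose_gy = 16.5    # Gy, hypothetical laboratory measurement
dose_rate_gy_per_kyr = 3.0   # Gy per thousand years, hypothetical sediment

age_kyr = equivalent_dose_gy / dose_rate_gy_per_kyr
print(f"grains last saw light about {age_kyr:.1f} thousand years ago")  # ~5.5
```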

    The stone’s composition most closely matches a quarry 55 kilometres away on the other side of the Guadalquivir river. At the time, there was a wide estuary between the two sites, suggesting the stone must have been transported by boat.
    This is the first evidence of a megalithic stone being transported by boat in the Iberian peninsula, but large stones at other megalithic sites in Europe, such as Stonehenge in the UK and Carnac in France, are also thought to have been transported this way.
    “The 4th millennium BC saw rapid evolution in coastal navigation,” says team member Leonardo García Sanjuán at the University of Seville. “The Matarrubilla stone basin is a good piece of indirect evidence, which, in our opinion, proves that these people had advanced raft, canoe or sailing-boat technology.”
    Archaeological discoveries from other sites show that communities in the Mediterranean were already building sophisticated, seaworthy boats, he adds.

    “The crossing of the formerly existing sea with such a huge stone proves once again the technical savoir-faire of the Matarrubilla builders,” says Ramón Fabregas Valcarce at the University of Santiago de Compostela in Spain, who wasn’t involved in the study.
    Valencina is one of the largest prehistoric sites in Europe, covering an area of more than 460 hectares. Among the site’s rarer artefacts are materials imported from far-flung regions, including amber, flint, cinnabar, ivory and ostrich egg.
    “[Valencina] contains megalithic monuments, massive ditches, extensive burial records and refined material culture that reveals connections across Iberia, North Africa and the Mediterranean,” says Cáceres Puro.
    Prior work in the area has unearthed numerous details indicating the site’s historical significance, including a centuries-long period from 2900 to 2650 BC when it was largely ruled by women.
    “The current study adds intriguing further detail for one of Valencina’s major monuments,” says Alasdair Whittle at Cardiff University, UK.

Topics: archaeology