More stories

  • Neanderthal groups had their own local food culture

    An illustration of a Neanderthal group preparing food (LUIS MONTANYA/MARTA MONTANYA/SCIENCE PHOTO LIBRARY)
    Neanderthals may have had traditional ways of preparing food that were particular to each group. Discoveries from two caves in what is now northern Israel suggest that the residents there butchered the same kinds of prey in their own distinctive ways.
    Modern humans, or Homo sapiens, weren’t the first hominins to prepare and cook food. There is evidence that Neanderthals, for example, who inhabited Europe and Asia until about 40,000 years ago, used flint knives to butcher what they caught, cooked a wide range of animals and spiced up their menu with wild herbs.

    To learn more about Neanderthal food culture, Anaëlle Jallon at the Hebrew University of Jerusalem and her colleagues examined evidence at the caves of Amud and Kebara in northern Israel.
    These sites, just 70 kilometres or so apart, provide a rare opportunity to examine local cultural differences. Stone tools, food remains and hearths found at each site reveal that Neanderthals occupied both caves during the same time period, probably in winter.
    “You find the same species of animals to hunt and it’s more or less the same landscape,” says Jallon. “It will be the same kind of weather, and Neanderthals at both ate mostly gazelles and some fallow deer that they complemented with a few bigger animals like boar or aurochs.”

    There are a few differences, though. For example, bones reveal that more large prey was hunted at Kebara, and more kills were carried back to that cave to be butchered.
    Jallon and her colleagues used microscopes to inspect bones from layers of sediment at the two sites from between 50,000 and 60,000 years ago, examining the cuts slashed in them with stone tools.
    They found that even though the flint tools used were similar at both sites, the patterns of cuts were different. “The cuts tend to be more variable in their width and depth in Kebara, and in Amud they are more concentrated in big clusters and they overlap each other more often,” says Jallon.
    To assess whether the differences could be down to butchering different prey, the researchers also looked specifically at long bones from gazelles found at both sites. These showed the same differences.
    “We are talking about two groups who live very close and, let’s say, both cutting up some beef – but in one site they seem to be cutting closer to the bone, getting all the meat off,” says Ceren Kabukcu at the University of Liverpool, UK.
    Previous research that looked at cut marks on bones from more recent societies suggests that the kind of variation seen in Neanderthal butchery isn’t down to a lack of expertise, but to a difference in technique.

    Jallon thinks the contrast is best explained by deliberate butchery choices. It could be that Neanderthals at Amud made their meat harder to process by, for example, drying it or letting it hang before cooking, she says, which would have meant they needed more cuts to get through it or a larger team of people to butcher the meat.
    “In behaviour that is as opportunistic as butchering, you would expect to find the most efficient way to butcher something to get the most out of it, but apparently, it was more determined by social or cultural factors,” says Jallon. “It could be due to group organisation or practices that are learned and transmitted from generation to generation.”
    “The fact that there might be differences and some nuance on how technology is used in daily life is not entirely shocking,” says Kabukcu. “I think as this question is investigated, we might see more and more nuance at several sites of the Middle Palaeolithic.”
    It isn’t known whether the caves were occupied at the same time or if disparate groups might have been in contact with each other. “It is a possibility that it was at the same exact time, but it’s also possible it was hundreds of years apart or more. We don’t have the resolution to know that,” says Jallon.
    But the pattern of tightly clustered cut marks found at Amud is similar in the oldest and younger layers, she says, so the cave might have been used by returning groups that maintained the same butchery traditions for centuries.


  • In a first, an image shows a dying star exploded twice to become a supernova

    For the first time, astronomers have spotted a star that exploded not once, but twice. A new image of a roughly 300-year-old supernova provides visual evidence that some dying stars undergo a double explosion, researchers report July 2 in Nature Astronomy.

    Supernovas usually mark the death of massive stars. But medium-sized ones, like the sun, can also go out with a bang. When midsize stars exhaust their hydrogen fuel, they shed everything but their core, leaving behind small inert objects called white dwarfs. These incredibly dense remnants are about the size of Earth with roughly the mass of the sun.

  • A newly discovered interstellar object might predate the solar system

    The solar system’s newest visitor, 3I/ATLAS, may be 3 billion years older than the sun and its planets.

    First discovered on July 1, 3I/ATLAS is a rare interstellar object — only the third ever spotted. Since then, astronomers have been racing to uncover its origins. A new calculation predicts that 3I/ATLAS originated from a part of the Milky Way called the thick disk. If so, there’s a two-thirds chance that it’s a comet over 7 billion years old. That would make it the oldest comet known, researchers reported July 11 at the Royal Astronomical Society’s National Astronomy Meeting 2025 in Durham, England.

  • Provocative new book says we must persuade people to have more babies

    A large population may enable innovation and economies of scale (PHILIPPE MONTIGNY/iStockphoto/Getty Images)
    After the Spike by Dean Spears and Michael Geruso (Bodley Head (UK); Simon & Schuster (US))
    Four-fifths of all the humans who will ever be born may already have been born. The number of children being born worldwide each year peaked at 146 million in 2012 and has been falling overall ever since. This means that the world’s population will peak and start to fall around the 2080s.
    This fall won’t be gradual. With birth rates already well below replacement levels in many countries including China and India, the world’s population will plummet as fast as it rose. In three centuries, there could be fewer than 2 billion people on Earth, claims a controversial new book.
    “No future is more likely than that people worldwide choose to have too few children to replace their own generation. Over the long run, this would cause exponential population decline,” write economists Dean Spears and Michael Geruso in After the Spike: The risks of global depopulation and the case for people.
    This, you might think, could be a good thing. Won’t it help solve many environmental issues facing us today? No, say the authors. Take climate change: their argument isn’t that population size doesn’t matter, but that it changes so slowly that other factors such as how fast the world decarbonises matter far more. The window of opportunity for lowering carbon dioxide emissions by reducing population has largely passed, they write.
    Spears and Geruso also make the case that there are many benefits to having a large population. For instance, there is more innovation, and economies of scale make the manufacture of things like smartphones feasible. “We get to have nice phones only because we have a lot of neighbors on this planet,” they write.
    So, in their view, our aim should be to stabilise world population rather than letting it plummet. The problem is we don’t know how, even with the right political will.

    As we grow richer, we are more reluctant to abandon career and leisure opportunities to have children

    While some government policies have had short-term effects, no country has successfully changed long-term population trends, argue the authors. Take China’s one-child policy. It is widely assumed to have helped reduce population growth – but did it? Spears and Geruso show unlabelled graphs of the populations of China and its neighbours before, during and after the policy was in place, and ask the reader which is China. There is no obvious difference.
    Attempts to boost falling fertility rates have been no more successful, they say. Birth rates jumped after Romania banned abortion in 1966, but they soon started to fall again. Sweden has tried the carrot rather than the stick by heavily subsidising day care. But the fertility rate there has been falling even further below the replacement rate.
    All attempts to boost fertility by providing financial incentives are likely to fail, Spears and Geruso argue. While people might say they are having fewer children because they cannot afford larger families, the global pattern is, in fact, that as people become richer they have fewer children.
    Rather than affordability being the issue, it is more about people deciding that they have better things to do, the authors say. As we grow richer, we are more reluctant to abandon career and leisure opportunities to have children. Even technological advances are unlikely to reverse this, they say.
    On everything other than the difficulty of stabilising the population, this is a relentlessly optimistic book. For instance, say the authors, dire predictions of mass starvation as the world’s population grew have been shown to be completely wrong. The long-term trend of people living longer and healthier lives can continue, they suggest. “Fears of a depleted, overpopulated future are out of date,” they write.
    Really? Spears and Geruso also stress that the price of food is key to determining how many go hungry, but fail to point out that food prices are now climbing, with climate change an increasing factor. I’m not so sure things are going to keep getting better for most people.
    This book is also very much a polemic: with Spears and Geruso labouring their main points, it wasn’t an enjoyable read. That said, if you think that the world’s population isn’t going to fall, or that it will be easy to halt its fall, or that a falling population is a good thing, you really should read it.



  • Evolution has made humans both Machiavellian and born socialists

    (Image: David Oliete)
    Nearly 2 million years ago, one of our hominin ancestors developed bone cancer in their foot. The fate of this individual is unknown, but their fossilised remains leave no doubt that cancer has been a part of our story for a very long time. It is also clear that, when threatened by our own normally cooperative cells turning against us, we evolved an immune system to help us identify and deal with the enemy within.
    But treacherous cancer cells weren’t the only internal threat our ancestors faced. As hypersocial beings, their survival was also jeopardised by selfish individuals attempting to subvert the group – and capable of unravelling society, just as a cancer eventually kills its host. I am interested in understanding how we adapted to this threat. At the heart of the story is this question: is human nature selfish or altruistic, competitive or cooperative? Are we essentially cancers, tamed by culture, or more like healthy cells in the human body, working together for mutual success?
    People have debated this for centuries and continue to do so, citing research in primatology, anthropology, psychology and economics to defend their points. The answer has profound implications for how we aim to structure society. If we are born altruists, then institutions and hierarchies are unnecessary. But if selfishness prevails, strong control is essential. To me, both extremes are unconvincing. To understand why, we must appreciate the circumstances under which humanity has evolved. Determining how our ancestors confronted brutish selfishness doesn’t just tell us about our own social past – it can also help us inoculate today’s societies against the threat from within.

    Look at the animals to which we are most closely related and you see that each species has its own distinct set of social structures. Among gorillas, for example, groups of unmated males are typically led by aggressive alpha males. Mated gorillas sometimes live in groups of males and females, but more often it is the stereotypical silverback with a harem of females – a group that has its own hierarchy. Chimpanzees and bonobos also display dominance hierarchies, with a lot of emphasis placed on female social rank, particularly among bonobos. Despite the wide variation in sociality among these species, one thing is consistent: where there is social rank, aggressive dominance is the winning attribute. If an alpha can successfully defend resources, whether territory, food or mates, it can dominate a primate society. Access to more resources translates into having more surviving offspring than others, which is the only measure of success for evolution by natural selection.
    Human self-domestication
    Among our own ancestors – members of the genus Homo – the story is different. Research in anthropology and primatology suggests that, as early people evolved more complex social structures, they did something unseen in other primates: they domesticated themselves. Once they had the cognitive sophistication necessary to create weapons, along with the intelligence required to form alliances, they could fight the large, angry dominants that ruled over their social groups. The primatologist Richard Wrangham at Harvard University argues that this profoundly shaped human society because, along with eliminating the alphas, our ancestors also selected against the human trait of aggression. As a result, humans became more cooperative, and their societies became more egalitarian.

    But if that is the case, how do we explain the undeniable and growing inequality in today’s societies, where huge amounts of power and money are concentrated among a small number of people, with the problem particularly pronounced in major economies such as the US, the UK and China? Some researchers argue that humans are not egalitarian by nature, but that living in small, close-knit groups of hunter-gatherers – as people did before the dawn of agriculture – suppressed our tendencies to form dominance-based hierarchies. They see a U-shaped curve of human egalitarianism. The point we started from – which looked a lot like the social structures we see in other great apes – is where we have ended up again, with the middle of the U showing a brief flirtation with social equality.
    If human nature were entirely cooperative then state control wouldn’t be required to prevent freeloading (Zoonar GmbH/Alamy)
    I agree that we aren’t naturally egalitarian. In fact, I am not convinced that human societies were ever egalitarian. As anthropologists point out, even living hunter-gatherers have some brutal practices, such as infanticide. But, for me, the explanation for our current unequal circumstances lies not in our ancestors having selected against aggression, but in how the elimination of despotic alpha males allowed other, arguably more insidious people to succeed.
    Once humanity was free of the strong grip of strict dominance hierarchies led by alpha males, success in social groups would have become more about skilful manoeuvring within communities. This led to the rise of a particular kind of social intelligence called Machiavellian intelligence, which entails the cunning manipulation of others. In the language of evolutionary biology, we have a cooperation dilemma: there are situations where it is in our interest to work with others, and there are situations where it is not. And, as anyone who has watched an episode of The Traitors will be aware, the need to pull together and the need to betray can come into conflict. As a result, overt rivalry was superseded by what I call “invisible rivalry” – the ability to hide selfish, competitive or exploitative intentions while maintaining the appearance of a cooperative nature. In other words, we evolved to compete in a cooperative world.
    The social brain
    Support for this idea comes from the size of the human brain. All primates have large brains relative to their body size, and ours is exceptionally big. The social brain hypothesis suggests that these large brains evolved to help individuals manage their unusually complex social systems. Of course, cooperation is part of this, but it can’t be the whole story. Consider ants, which, in terms of numbers and pervasiveness, are probably the most successful group of species on Earth. They are eusocial, which means they cooperate so fully that they seem to act as a single organism. Yet their brains are tiny, and everything they need to work together is programmed within them genetically. So, you don’t necessarily need a big brain to cooperate – but you do need one to compete strategically. Indeed, research suggests that social competition is what best explains the evolution of our enormous brain compared with the big brains of other primates.
    To paraphrase Aristotle, we are political animals – not merely social ones. We strategise within our societies to maximise our success, whether that is defined in terms of money, power, mating success, hunting prowess or any of the other qualities associated with prestige around the world. To do so effectively, we evolved to not just be smart enough to cooperate, but to cooperate selectively – and to betray others when it suits us, or even just when we can get away with it.

    Studies by economists and psychologists illustrate this. For example, in one set of experiments, participants were paired in a cooperation game in which one person was given $10 and the choice to share it with the other (or not). A lot of research shows that in these circumstances, people generally give some money to their partner, often splitting the pot equally, even when there is no obvious punishment for betraying them. But this time, the researchers gave some participants another option: they could take less money and leave the game without their partner ever knowing that they had been involved in a cooperation game. About one-third of participants took this option. It was as if they were happy to pay to have their betrayal left unknown.
    Experiments like this tell us a lot about the human psyche. In social interactions, we often need to be able to predict what others around us are going to do – to learn where to place trust effectively, to win trust when we need it and to hide betrayals of trust on our own part. These abilities require empathy, emotion, language and, perhaps, as some of my colleagues argue, consciousness. Yet those same abilities, and that same intelligence, have a dangerous downside. Our evolved proclivity for maximising resources leads us to exploit those around us – and some people are so effective at deception that they risk damaging their societies. Modern, extreme inequality is an example of this process in action. So too are past political upheavals leading to degradation of the rule of law – and sometimes the fall of civilisations. The Roman Republic, for example, collapsed because of a tremendous internal struggle for power, culminating in Julius Caesar’s Machiavellian machinations, eventually leading to autocracy.
    Religion is one institution that societies use to promote cooperation (Adam Guz/Getty Images Poland/Getty Images)
    So, our dual cooperative-competitive nature means that we face an enemy within that may bring down society. And this is where the analogy with cancer arises. Humanity’s long history of living with the disease means we have evolved biological mechanisms to reduce the risk it poses. Many reactions at the cellular level, including attacks by immune cells and programmed cell death, evolved to help our bodies fight off cancers, as well as other, external threats to our survival. It is this stalwart immune system that explains why, although mutations occur all the time and we are frequently exposed to viruses and bacteria, these often don’t lead to symptoms or illness. Similarly, the threats to our social groups posed by the evolution of invisible rivalry led us to develop practices, behaviours and institutions to maximise cooperation and thwart our Machiavellian tendencies. In my new book, Invisible Rivals: How we evolved to compete in a cooperative world, I call this our cultural immune system.
    Religion is one institution that can function in this way. Religious teaching can promote cooperation among everyone who practises it – and this is one possible reason that the Golden Rule, often summed up as “treat others as you would like to be treated”, is found independently in scriptures across the world. People who believe these scriptures – who internalise them, as anthropologists say – are more likely to help fellow members of their group.
    Everywhere anthropologists look, they find other practices and institutions that bolster cooperation at the local level. In cultures that rely on fishing, there are strong norms against over-fishing, which would deplete the stock for everyone. Where people are dependent on hunting, there are strict rules about how meat is shared and who gets credit for a kill. The Maasai people of Kenya and Tanzania have a relationship framework called osotua, rooted in need-based sharing partnerships and relying on mutual help in hard times. For example, if someone needs cattle because theirs have all died, another member of the group will help, not because they get anything directly in return, but simply because their neighbour’s needs are greater at that time. This creates a special bond – osotua translates as “umbilical cord” – and treachery is rare because the bond is seen as sacred.
    The Maasai people have a system called osotua whereby they give cattle to others in need (Siegfried Modola/Getty Images)
    Across the world, social norms that guide behaviours have evolved, and they have been refined over thousands of years of trial and error through cultural evolution. However, just as cancers find ways to evade our immune systems, so some individuals use their Machiavellian intelligence to subvert the group’s social norms for their own benefit. This is trickier to do in small-scale societies where people all know each other, making rule-breakers easier to detect and punish. But as societies grew over the past 10,000 years, so did opportunities to act selfishly. Agricultural networks, cities and, finally, nation states made deception much easier to pull off, because it is easy to cheat more people without getting caught in a group where it is impossible to know everyone personally.
    Taming our Machiavellian nature
    It is this lethal combination of opportunity and invisible rivalry that makes the question of whether humans are cooperative or competitive so relevant today. To fight the enemy within society, we need to understand that both traits are in our nature, and that we evolved to apply whichever suits us best. Thinking that we are either one or the other leaves us vulnerable to facile arguments about how we should structure society. If we are purely selfish, it follows that society should focus on heavy policing and punishment of freeloaders, including those in power. But believing that we are intrinsically altruistic is equally detrimental because it risks ignoring the threat posed by rampant self-interest.
    Suppressing humanity’s Machiavellian side is certainly harder in large-scale societies. But there are basic ways that we can boost the cultural immune system, much like how we can improve our biological immune systems through healthy lifestyles and vaccination. The key, I believe, is to learn more about the social norms that small-scale societies have evolved to help them thrive and stave off opportunistic cheaters and then use this knowledge to create policies that promote cooperation at a higher level. For example, within our own communities, we can look to cultures that promote systems like need-based transfers and others that have found ways to share resources more equitably.

    But this isn’t going to happen until we first recognise the problem that invisible rivalry poses. In my view, the best way to do that is through education. We are all part of the cultural immune system. If we understand our evolutionary heritage, we will be alert to the danger that freeloaders pose to society and place our trust more discerningly – much as the body’s defence system learns to recognise the agents associated with cancers and other diseases to deal with them. Crucially, we also need to recognise that cooperation is best for everyone in the long term.
    A small proportion of people at the most competitive end of the spectrum will always try to game society. We must work together to stay one step ahead of humanity’s opportunistic nature. Without beliefs, norms and a proper understanding of human nature, we are at the mercy of our selfish biological heritage. Evolution has made us this way, but we can learn to overcome it.


  • 70,000 years ago humans underwent a major shift – that’s why we exist

    Ancient humans adapted to deeper forests as they migrated out of Africa and away from the savannah (LIONEL BRET/EURELIOS/SCIENCE PHOTO LIBRARY)
    This is an extract from Our Human Story, our newsletter about the revolution in archaeology. Sign up to receive it in your inbox every month.
    Humans come from Africa. This wasn’t always obvious, but today it seems as close to certain as anything about our origins.
    There are two senses in which this is true. The oldest known hominins, creatures more closely related to us than to great apes, are all from Africa, going back as far as 7 million years ago. And the oldest known examples of our species, Homo sapiens, are also from Africa.
    It’s the second story I’m focusing on here, the origin of modern humans in Africa and their subsequent expansion all around the world. With the advent of DNA sequencing in the second half of the 20th century, it became possible to compare the DNA of people from different populations. This revealed that African peoples have the most variety in their genomes, while all non-African peoples are relatively similar at the genetic level (no matter how superficially different we might appear in terms of skin colour and so forth).
    In genetic terms, this is what we might call a dead giveaway. It tells us that Africa was our homeland and that it was populated by a diverse group of people – and that everyone who isn’t African is descended from a small subset of those peoples who left this homeland to wander the globe. Geneticists were confident about this as early as 1995, and the evidence has only accumulated since.
    And yet, the physical archaeology and the genetics don’t match – at least, not on the face of it.

    Genetics tells us that all living non-African peoples are descended from a small group that left the continent around 50,000 years ago. Barring some wobbles about the exact date, that has been clear for two decades. But archaeologists can point to a great many instances of modern humans living outside Africa much earlier than that.
    At Apidima cave in Greece, there is a single skull of a modern human from 210,000 years ago. A jawbone from Misliya cave in Israel is at least 177,000 years old. There are some contentious remains from China that might be modern humans. “And there are debates swirling around the earliest colonisation of Australia,” says Eleanor Scerri at the Max Planck Institute of Geoanthropology in Germany. Some researchers claim people were on the continent 65,000 years ago.
    What is going on? Is our wealth of genetic data somehow misleading us? Or is it true that we are all descended from that last big migration – and the older bones represent populations that didn’t survive?
    Scerri and her colleagues have tried to find an explanation.
    African environments
    The team was discussing where modern humans lived in Africa. “Were humans simply moving into contiguous regions of African grasslands, or were they living in very different environments?” says Scerri.
    To answer that, they needed a lot of data.
    “We started with looking at all of the archaeological sites in Africa that date to 120,000 years ago to 14,000 years ago,” says Emily Yuko Hallett at Loyola University Chicago in Illinois. She and her colleagues built a database of sites and then determined the climates at specific places and times: “It was going through hundreds and hundreds of archaeological site reports and publications.”

    There was an obvious shift around 70,000 years ago. “Even if you just look at the data without any fancy modelling, you do see that there is this change in the conditions,” says Andrea Manica at the University of Cambridge, UK. The range of temperatures and rainfalls where humans were living expanded significantly. “They start getting into the deeper forests, the drier deserts.”
    However, it wasn’t enough to just eyeball the data. The archaeological record is incomplete, and biased in many ways.
    “In some areas, you have no sites,” says Michela Leonardi at the Natural History Museum in London – but that could be because nothing has been preserved, not because humans were absent. “And for more recent periods, you have more data just because it’s more recent, so it’s easier for it to be conserved.”
    Leonardi had developed a statistical modelling technique that could determine whether animals had changed their environmental niche: that is, whether they had started living under different climatic conditions or in a different type of habitat like a rainforest instead of a grassland. The team figured that applying this to the human archaeological record would be a two-week job, says Leonardi. “That was five and a half years ago.”
    However, the statistics eventually did confirm what they initially saw: about 70,000 years ago, modern humans in Africa started living in a much wider range of environments. The team published their results on 18 June.
    Jacks of all trades
    “What we’re seeing at 70,000 [years ago] is almost kind of our species becoming the ultimate generalist,” says Manica. From this time forwards, modern humans moved into an ever-greater range of habitats.
    It would be easy to misunderstand this. The team absolutely isn’t saying that earlier H. sapiens weren’t adaptable. On the contrary: one of the things that has emerged from the study of extinct hominins is that the lineage that led to us became increasingly adaptable as time went on.
    “People are in different environments from an early stage,” says Scerri. “We know they’re in mangrove forests, they’re in rainforest, they’re in the edges of deserts. They’re going up into highland regions in places like Ethiopia.”
    This adaptability seems to be how early Homo survived environmental changes in Africa, while our Paranthropus cousins didn’t: Paranthropus was too committed to a particular lifestyle and was unable to change.

    Instead, what seems to have happened in our species 70,000 years ago is that this existing adaptability was turned up to 11.
    Some of this isn’t obvious until you consider just how diverse habitats are. “People have an understanding that there’s one type of desert, one type of rainforest,” says Scerri. “There aren’t. There are many different types. There’s lowland rainforest, montane rainforest, swamp forest, seasonally inundated forest.” The same kind of range is seen in deserts.
    Earlier H. sapiens groups were “not exploiting the full range of potential habitats available to them”, says Scerri. “Suddenly, we see the beginnings of that around 70,000 years ago, where they’re exploiting more types of woodland, more types of rainforest.”
    This success story struck me, because recently I’ve been thinking about the opposite.

    Splendid isolation
    Last week, I published a story about local human extinctions: groups of H. sapiens that seem to have died out without leaving any trace in modern populations. I focused on some of the first modern humans to enter Europe after leaving Africa, who seem to have struggled with the cold climate and unfamiliar habitats, and ultimately succumbed. These lost groups fascinated me: why did they fail, when another group that entered Europe just a few thousand years later succeeded so enormously?
    The finding that humans in Africa expanded their niche from 70,000 years ago seems to offer a partial explanation. If these later groups were more adaptable, that would have given them a better chance of coping with the unfamiliar habitats of northern Europe – and for that matter, South-East Asia, Australia and the Americas, where their descendants would ultimately travel.
    One quick note of caution: this doesn’t mean that from 70,000 years ago, human populations were indestructible. “It’s not like all humans suddenly developed into some massive success stories,” says Scerri. “Many of these populations died out, within and beyond Africa.”
    And like all the best findings, the study raises as many questions as it answers. In particular: how and why did modern humans become more adaptable 70,000 years ago?
    Manica points out that we can also see a shift in the shapes of our skeletons. Older fossils classed as H. sapiens don’t have all the features we associate with humans today, just some of them. “From 70,000 [years ago] onwards, roughly speaking, suddenly you see all these traits present as a package,” he says.

    Manica suggests that the expansion into new niches may have enabled this, by bringing previously separate populations into more regular contact. For instance, if two populations were separated by a desert, they would never have met, never exchanged ideas and genes – until someone figured out how to live in the desert.
    “There might also be almost a positive feedback,” says Manica. “You connect a bit more, you become more flexible… You break down some of those barriers, you become even more connected.”
    With apologies, here is a pat conclusion. In that story about lost populations, I said that one of the biggest threats to human groups is isolation: if you don’t have neighbours you can call on and your group is small, even a minor misfortune can mean apocalypse. If Manica is right, the exact opposite played out in Africa. Populations grew and became more connected, and that enabled an explosion of creativity that sent our species all over the planet.
    In which case, the reason the last out-of-Africa migration succeeded so wildly is: people need people. Without other people, we’re stupid and doomed. Any doomsday preppers hoping to ride out the apocalypse alone in a well-provisioned bunker: you may have the wrong approach.


  • Researchers re-enact a 30,000-year-old sea voyage

    Archaeological evidence shows that 30,000 years ago, Palaeolithic people travelled from the island now known as Taiwan to the southern islands of Japan. This voyage would have included crossing the Kuroshio, one of the world’s strongest ocean currents.
    Yousuke Kaifu at The University Museum of the University of Tokyo wanted to put this journey to the test, so his team built a dugout canoe using tools available to people at the time and set out from Taiwan. The journey spanned 225 kilometres and took the crew 45 hours before they reached Yonaguni Island. This trip came after previous failed attempts that used rafts made of reeds and bamboo.
    The success of the voyage gives some insight into how Palaeolithic people might have made the treacherous crossing.
