More stories


    How a surge in ancient plagues 5000 years ago shaped humanity

    Simon Pemberton
    Disease historians have a problem. While examining samples of ancient human DNA, geneticists have come across genes belonging to the plague bacterium, Yersinia pestis, revealing that it ravaged Eurasia 5000 years ago. That’s nearly 3500 years before the “first plague”, also known as the Justinian plague, after the Roman emperor of the day. What to call this newly discovered prequel?
    The current favourite, the Late Neolithic-Bronze Age (LNBA) plague, is a bit of a mouthful. But the scientists have more to worry about. Their chance discovery is another nail in the coffin of a long-held idea about when and why humanity acquired many of the contagious diseases that now afflict us. Of late, they have uncovered a rogues’ gallery of prehistoric horrors in samples taken from ancient humans. These so-called zoonotic diseases bothered animals before they bothered people, so were thought to have jumped the species barrier soon after humans invented agriculture, around 12,000 years ago. But as geneticists peer ever further back into the past, they are finding that in many cases the leap occurred much later – with major outbreaks happening in Europe, you’ve guessed it, around 5000 years ago.
    As well as upending old ideas about disease evolution, the discovery has forced a rethink of a pivotal period in prehistory. How were diseases spreading at that time? Did the pathogens have the same effects as they do now? And might plague itself have ushered in the Bronze Age, laying the foundations of European civilisation? It’s exciting stuff, says archaeogeneticist Megan Michel at Harvard University, given that a decade ago, “we didn’t even know this plague existed”.

    The reconstruction of ancient disease landscapes has been a huge collaborative effort, but a group at the University of Copenhagen in Denmark has had a leading role. They began routinely screening ancient human remains for known pathogens about 15 years ago, having unexpectedly found microbial DNA in human samples. Armed with radiocarbon dates and information about how people in prehistoric cemeteries were related to each other, they could start to build a picture of the cultural and economic context in which the diseases spread. They could also track the evolution of pathogens over time – and investigate how the human immune system adapted in turn.
    This approach has generated a quickfire sequence of important findings, including the discovery of pathogens that cause typhoid, hepatitis B, syphilis and smallpox in historical human populations – and culminated this July in the publication of a study led by population geneticist Martin Sikora, a member of the Copenhagen group. His team re-analysed around 1300 human samples spanning more than 35,000 years in Eurasia. All the DNA came from teeth, which preserve blood-borne pathogens because they have their own blood supply in life. Among the pathogens the researchers found were Y. pestis and the bacteria that cause leprosy and leptospirosis, or Weil’s disease. To their surprise, nearly 3 per cent of samples tested positive for another pathogen, Borrelia recurrentis, the causative agent of the now-rare relapsing fever – a relative of Lyme disease characterised, as its name would suggest, by recurring fever and blinding headaches.
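    To get a feel for what that screening involves, here is a minimal sketch of the underlying idea: flag a sample when enough of its DNA fragments match sequence from a known pathogen. Everything in it is a placeholder – the reference snippets, the reads and the hit threshold are invented – and real pipelines use dedicated metagenomic classifiers plus checks for the chemical damage typical of ancient DNA.

```python
# A toy illustration of pathogen screening in ancient DNA, assuming a sample is
# flagged when enough of its sequencing reads share short k-mers with a reference
# for a known pathogen. The sequences and thresholds below are invented.

def kmers(seq, k=8):
    """Return the set of all k-letter substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

# Invented reference snippets standing in for pathogen genomes.
references = {
    "Yersinia pestis": "ATGCCGTTAGGCTTACGATCGGATCCTTAGGCATCG",
    "Borrelia recurrentis": "TTGACCGGATTACCAGGTTCAGGATCCAAGGTTACG",
}
ref_kmers = {name: kmers(seq) for name, seq in references.items()}

def screen(reads, min_hits=2):
    """Count reads sharing at least one 8-mer with each reference; keep strong hits."""
    hits = {name: 0 for name in ref_kmers}
    for read in reads:
        rk = kmers(read)
        for name, ref in ref_kmers.items():
            if rk & ref:
                hits[name] += 1
    return {name: n for name, n in hits.items() if n >= min_hits}

# Invented "reads" from one tooth sample.
sample_reads = ["CCGTTAGGCTTACGAT", "GGATCCTTAGGCATCG", "AAAATTTTCCCCGGGG"]
print(screen(sample_reads))  # -> {'Yersinia pestis': 2}
```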

    The team also looked at trends over time. These revealed that until about 6500 years ago, the vast majority of microbes in the teeth of Eurasians belonged to the oral microbiome – the diverse, usually harmless or even beneficial community of organisms that inhabits the mouth. The first zoonotic pathogens, including plague, became detectable at that date, but only at very low levels. It wasn’t until around 5000 years ago that there was a spike in infections from Y. pestis and other major pathogens.
    Arrival of the Yamnaya
    This also happens to be the date that nomadic herders called the Yamnaya began arriving in Europe from the steppe, a vast expanse of grasslands and savannas spread across much of Eurasia, bringing new ideas and new languages. Coincidence? The researchers think not. Those herders had an exceptionally high burden of infectious disease. It isn’t clear why, but it was probably linked to their lifestyle. They kept much larger herds than static farmers – of sheep, goats, horses and cattle – and they lived with their animals around the clock. Their diet consisted mainly of meat and milk. “A lot of zoonoses can be transmitted through undercooked meat, but also through milk: brucellosis, listeriosis, bovine tuberculosis, to name just a few,” says infectious disease expert Astrid Iversen at the University of Oxford.
    Plague doctors treated victims of bubonic plague during outbreaks in Europe (Science History Images/Alamy)
    Other findings appear to corroborate this hunch. For instance, by tracing how the genome of plague bacteria changed over time, archaeogeneticist Pooja Swali at University College London has been able to show that 4000-year-old cases of plague – which were the oldest known in Britain when she documented them in 2023 – were caused by strains related to those carried earlier out of the steppe. She could effectively see the disease moving from east to west.
    Then there is relapsing fever. Earlier this year, Swali reported that B. recurrentis became specialised to humans in a window centring on 5000 years ago. Before that, the bacterium infected a range of mammals via the tick, its intermediate host, but then it swapped this out for the human body louse. Swali speculates this had to do with wool clothing, another innovation – besides metal tools – brought to Europe by the steppe nomads. B. recurrentis underwent a major reduction of its genome at that time, which could reflect adaptation to a new host – one that flourished in wool garments. “Maybe this massive reduction in genome meant that it became trapped in lice,” she says.
    Meanwhile, French researchers have shown that the immune system of Europeans began adapting to infectious diseases like these around 6000 years ago, with the bulk of immunity-related genetic variants appearing around 4500 years ago. “All these pieces fit really nicely together,” says Sikora.
    But there’s one piece that doesn’t fit so well. Sikora’s July paper cites two cases of plague in Orkney, off the north coast of Scotland, that predate the arrival of people with steppe ancestry in Britain by at least 500 years. What’s more, last year, another member of the Copenhagen group, Frederik Seersholm, described three outbreaks of plague over six generations of Neolithic Swedish farmers that occurred around 5000 years ago. Those farmers carried no steppe ancestry, indicating that they had yet to interbreed with – perhaps even to meet – these populations of eastern origin. A new study from Seersholm and Ruairidh Macleod at UCL, which has yet to be peer-reviewed, describes the oldest instances of plague in the world recorded to date, from around 3500 BC, which proved fatal to hunter-gatherers living near Siberia’s Lake Baikal, east of the Yamnaya’s point of departure towards the west.
    Such cases have persuaded most people that plague was geographically widespread before the nomads arrived. One idea is that the LNBA plague got its foothold in the mega-settlements of the Trypillia culture of present-day Ukraine, beginning around 6000 years ago, and then spread through trade networks. Archaeogeneticist Nicolás Rascovan at the Pasteur Institute in Paris, who suggested this possibility in 2019, says his hypothesis remains on the table, though he admits it is difficult to test because almost no Trypillian burials have been found. Others are sceptical. A team led by anthropologist Alex Bentley at the University of Tennessee, Knoxville, has shown that the clustered layout of Trypillian megasites could have acted as effective firebreaks against contagion. Besides, the Baikal cases indicate plague was a problem for hunter-gatherers from an early date.
    Plague without the fleas
    What the disease was like back then is also unclear, but there is no doubt it could kill. “Whether it was as highly transmissible as the Black Death, I’d be more cautious,” says Sikora. It is unethical to try to revive ancient plague strains in the lab, but you can get a rough idea by comparing ancient plague genomes with later strains that have known clinical outcomes. Such analysis has revealed that LNBA strains lacked a genetic variant that allowed the bacterium to survive in the flea gut, leading researchers to conclude that they probably weren’t transmitted by flea bites, as the Black Death was in the 14th century.
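    The comparative logic behind that conclusion can be sketched in a few lines of code. This is an illustration only, not the published analysis: the strain names and gene contents are invented, and it assumes the flea-survival factor is the gene usually identified in the literature as ymt, whereas the real work starts from full ancient genome reconstructions rather than tidy presence-or-absence tables.

```python
# A minimal sketch of comparing strains by gene content: flag Y. pestis genomes
# that lack the factor needed for survival in the flea gut (assumed here to be
# the ymt gene). Strain names and gene sets are invented placeholders.

flea_survival_gene = "ymt"

strains = {
    "LNBA_sample_A":   {"pla", "caf1"},         # hypothetical ancient strain
    "LNBA_sample_B":   {"pla"},                 # hypothetical ancient strain
    "BlackDeath_1348": {"pla", "caf1", "ymt"},  # hypothetical medieval strain
}

for name, genes in strains.items():
    mode = ("flea-borne transmission plausible" if flea_survival_gene in genes
            else "probably not flea-borne")
    print(f"{name}: {mode}")
```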
    Human body lice proliferated around 5000 years ago and became a vector for relapsing fever (Martin Oeggerli/Science Photo Library)
    There are many other ways plague could have spread in the Late Neolithic, though. Macleod and Seersholm suggest it was airborne and spread through coughing. But we can’t assume it was capable of human-to-human transmission. Another possibility is that outbreaks were caused by people sharing feasts of undercooked, infected meat – in which case, each outbreak was an animal-to-human spillover event that probably fizzled out quickly. Plague has many animal reservoirs, including sheep, dogs and rodents, and researchers know very little about how prevalent it was in other species in the Late Neolithic, or how it evolved in them. “What’s missing is this huge piece of the puzzle – the animals,” says Swali.
    Amid all the uncertainty, arguably the most burning question is whether the plague caused the so-called Neolithic decline, a dramatic fall in the population of western Eurasia. If so, it might also have ushered in the Bronze Age in that part of the world, a cultural revolution that introduced a more hierarchical and warlike social model – perhaps by clearing the way for those nomadic steppe herders who organised themselves in that way.
    The Neolithic decline
    Neolithic farmers lived in denser, more permanent settlements than herders or hunter-gatherers, and having lots of people living in close proximity certainly lends itself to contagion. Seersholm thinks his study of Swedish farmers supports the idea that plague caused their decline. However, archaeological evidence – the thinning of the farmers’ cultural footprint, signs of violence and the regrowth of forests – suggests it began around 7000 years ago, 500 years before the first zoonoses appeared in Europe. “I retain my scepticism that plague is responsible for this population downturn,” says archaeologist Stephen Shennan at UCL. He thinks the root cause was an agricultural crisis – shrinking crop yields related to a cooling climate. Nevertheless, he says he might have to change his mind if earlier plague cases come to light.
    That is possible. Geneticists are confident that the prehistoric prevalence of infectious disease was much higher than is detectable, in part because a disease can kill without showing up in the patient’s blood. This is the case for tuberculosis, for example, but also for the pneumonic form of plague, which infects the lungs. RNA viruses such as flu and coronaviruses aren’t yet detectable, either. Researchers are already searching for more evidence that Neolithic communities cratered as a direct consequence of plague. And one of them, archaeologist Kristian Kristiansen at the University of Copenhagen, thinks they will find it.
    Whether or not the LNBA plague caused the decline, it could have exacerbated it – especially after the arrival of the Yamnaya. Kristiansen doubts that their expansion into Europe was driven by plague – he prefers the theory that population growth forced them to go in search of new pastures. But, he says, they might have picked up plague en route – a disease to which their lifestyle gave them at least partial immunity – and then spread those strains far and wide. Their contact networks extended much further than those of farmers. “You can see it clearly in the human DNA,” says bioarchaeologist Thomas Booth at London’s Francis Crick Institute. “Suddenly, after 3000 BC, there are biological ties stretching right across Eurasia where previously they had been more confined to smaller regional clusters.”
    And, of course, plague wasn’t the only disease to have a major impact. “One of the big takeaways for me, from the Sikora paper, is that around 10 per cent of the tested remains had positive evidence for a major infection at time of death,” says one co-author, evolutionary biologist Evan Irving-Pease at the University of Copenhagen. “The level of evolutionary pressure that would have exerted on ancient human populations is really quite substantial.” He and others believe that, in today’s more hygienic environment, variants of genes that were selected because they protected our ancestors from zoonotic disease predispose us to a different threat – autoimmune diseases such as multiple sclerosis (MS).
    Yamnaya steppe nomads spread across Europe at just the time when animal-borne diseases proliferated (Piotr Włodarczak)
    Last year, with William Barrie at the University of Cambridge and others, Irving-Pease reported that a major genetic risk factor for MS tracks with steppe ancestry in Europe, being highest in the north of the region and lowest in the south. MS can be triggered by infection with the common Epstein-Barr virus today, but a different dangerous pathogen, prevalent in the Bronze Age, might initially have driven selection for that risk factor. Irving-Pease doesn’t know what it was, but with Iversen and others, he is hot on its trail.
    And the Late Neolithic disease surge may have shaped more than the immune system. Before then, Europeans didn’t practise dairying and were mostly lactose intolerant – unable to digest the sugar in milk. One surprising discovery is that the Yamnaya were, too: they probably consumed milk in fermented form – as yoghurt, kefir or cheese – and unwittingly recruited free-living bacteria to digest the lactose for them. So they didn’t bring Europeans the genes that allow us to do this for ourselves. Instead, research hints, these variants may have increased in frequency when bouts of disease and associated famines forced Neolithic farmers to drink milk to survive.

    Disentangling these complex biological and cultural interactions has implications for the future. Researchers may be close to uncovering the origins of MS, for example, but they can’t yet explain why it is becoming more prevalent over time. And zoonoses continue to pose a threat, accounting for an estimated three-quarters of emerging human diseases, including covid-19 – often because of our industrial-scale farming practices, destruction of forests and alteration of the climate. Understanding how they shaped us in the past will help us predict what lies ahead – and, potentially, intervene with the powerful tools of modern medicine.
    For the moment, though, it is the prospect of shedding light on our past that excites researchers most. “We can start to ask more interesting questions about the role of pathogens in human prehistory,” says Michel. Infectious disease has been called “the loudest silence in the archaeological record”. Finally, we are dialling up the volume.



    Neanderthal-human hybrids may have been scuppered by a genetic mismatch

    A model of a Neanderthal woman (Joe McNally/Getty)
    Modern humans may indeed have wiped out Neanderthals – but not through war or murder alone. A new study suggests that when the two species interbred, a slow-acting genetic incompatibility increased the risk of pregnancy failure in hybrid mothers. A similar mismatch between mothers and fetuses may also help explain a subset of pregnancies that fail today.
    We know from genetic studies that there was sustained interbreeding between Homo sapiens and Neanderthals between approximately 50,000 and 45,000 years ago. The Neanderthals went extinct around 41,000 years ago, but some of their DNA has persisted in modern humans with non-African ancestry, making up around 1 to 2 per cent of the genome.

    But mysteriously, none of the mitochondrial DNA in modern humans is derived from Neanderthals. This form of DNA is carried by egg cells but not sperm, so it is always inherited from the mother.
    Patrick Eppenberger at the University of Zurich, Switzerland, and his colleagues have proposed a possible explanation for this. They suggest that women with Neanderthal and H. sapiens parents would have had a higher risk of pregnancy failure because of a mismatch between their genes and those of their fetus.
    Neanderthals and H. sapiens had different versions of PIEZO1, a gene critical to oxygen transport in the blood. The researchers analysed modern human and Neanderthal DNA and modelled the differences in the PIEZO1 protein to understand how the two variants would have interacted. They also studied human red blood cells in the lab, using a chemical treatment to simulate the effect of the Neanderthal variant.

    They found that the Neanderthal variant, V1, results in red blood cells that bind oxygen more strongly compared with the H. sapiens variant, V2. V1 is dominant, so a person who inherited both V1 and V2 would have red blood cells with this high oxygen affinity.
    This means that a fetus resulting from interbreeding between Neanderthals and H. sapiens could have developed healthily in either a Neanderthal or H. sapiens mother. But according to the study, problems would have arisen in the next generation. A hybrid mother with V1 and V2 carrying a fetus with two copies of V2 would have had higher oxygen affinity than her fetus, so she would have delivered less oxygen across the placenta. This might have impaired the growth of the fetus and increased the risk of pregnancy loss.
    Eppenberger and his colleagues declined to be interviewed, but in a paper they argue that this incompatibility would have led to the Neanderthal population experiencing a drain on its reproductive output. “Over millennia of coexistence, even low levels of gene flow from modern humans into Neanderthal populations could have introduced a gradual reproductive disadvantage, compounding over generations,” they write.
    It wouldn’t be such a problem for the H. sapiens population because it was much larger, the team suggests. Neanderthal DNA could spread through the population via fathers, but the V1 variant would quickly be eliminated by natural selection. This could explain why Neanderthal nuclear DNA persisted in modern humans, while mitochondrial DNA, inherited only through mothers, didn’t.
    The researchers also note that some mutations in PIEZO1 with a similar effect, although not derived from Neanderthal DNA, do occur today and could cause some cases of unexplained pregnancy loss through a similar mismatch between mother and fetus.

    Sally Wasef at the Queensland University of Technology in Brisbane, Australia, says the discovery of the delayed, second-generation incompatibility is a “good insight”. “Even a minor hit to reproduction can push small groups below replacement, which can start a slide in numbers and, in fragile settings, an extinction spiral,” she says.
    “That being said, I would treat this finding as one piece of the puzzle rather than the whole story,” she says. “The effect is likely to be modest and to add to other ecological and social pressures.”
    Laurits Skov at the University of Copenhagen in Denmark says there were probably multiple factors involved in the Neanderthals’ demise, including changes in climate, the arrival of modern humans, the small community size of Neanderthals, the introduction of new diseases and genetic incompatibilities.
    Skov also says he would be surprised if this difference in oxygen affinity were determined by a single mutation in the PIEZO1 gene, as the researchers suggest.
    “I think more work is needed to conclusively say what the impact of this particular mutation is – and what happens when the mother and fetus have different configurations,” he says. “Or what role, if any, did this mutation play in the extinction of Neanderthals.”




    Ancient lead exposure may have influenced how our brains evolved

    Homo sapiens may have evolved to be more tolerant of lead exposure than other hominids (frantic00/Shutterstock)
    Prehistoric hominids were exposed to poisonous lead for at least 2 million years, a study of fossil teeth suggests, and modern humans may have evolved to cope with the toxic metal better than our ancient relatives.
    Lead poisoning has long been thought to be a uniquely modern problem tied to industrialisation, poor mining practices and its use as an additive in fuel, which has been phased out since the 1980s.

    It is particularly dangerous for children, impacting their physical and mental development, but it can also cause a range of severe physical and psychological symptoms in adults.
    Renaud Joannes-Boyau at Southern Cross University in Lismore, Australia, and his colleagues wanted to find out whether our ancient relatives were also exposed to lead.
    They analysed 51 fossil teeth from hominids including Australopithecus africanus, Paranthropus robustus, Gigantopithecus blacki, Homo neanderthalensis and Homo sapiens. The fossils were from Australia, South-East Asia, China, South Africa and France.

    The scientists searched for lead signals in the teeth using laser ablation, which revealed bands of lead taken up by the teeth during periods of exposure when the hominids were still growing. This exposure could have come from environmental sources such as contaminated water, soil or volcanic activity.
    Joannes-Boyau says the team was particularly struck by the amount of lead in the teeth of Gigantopithecus blacki, an ancient giant relative of today’s orangutans that lived in what is now China. “If it was a modern human that had this amount of lead in their body, then I would say this person was facing high exposure from industry or anthropogenic activities,” he says.
    Next, the team investigated whether modern humans coped with lead differently from Neanderthals. Using lab-grown models of the brain, called organoids, they studied both the Neanderthal and modern human versions of a gene called NOVA1 and tested how neurotoxic lead was to the organoids.
    “What we see is modern NOVA1 is much less stressed by the neurotoxicity of lead,” says Joannes-Boyau.

    Most importantly, when organoids with archaic NOVA1 were exposed to lead, another gene called FOXP2 was severely disrupted.
    “These genes are linked to cognition; they’re linked to language and linked to social cohesion,” says Joannes-Boyau. “And it is less neurotoxic for modern humans than it is for Neanderthals, which would have given a very big advantage to Homo sapiens and implies lead has played a role in our evolutionary journey.”
    But Tanya Smith from Griffith University in Brisbane, Australia, isn’t convinced about the extent of lead exposure or whether work on organoids can be extrapolated to an evolutionary advantage for modern humans.
    “This is a really complex paper that makes some highly speculative claims,” says Smith. “While it is no surprise to me that wild primates and ancient hominins were exposed to lead naturally, as we’ve published in multiple papers over the last seven years, the limited distribution, number and type of fossils included simply does not demonstrate that human ancestors were consistently exposed to lead over 2 million years.”




    Early hominin had human-like dexterity and gorilla strength

    A model of Paranthropus boisei at the Museum of Human Evolution in Burgos, Spain (Cro Magnon/Alamy)
    A pair of hands belonging to an enigmatic ancient hominin that lived around 1.5 million years ago has been found for the first time, revealing that the species combined gorilla-like strength with the dexterity to make tools.
    Paranthropus boisei was first discovered by archaeologist Mary Leakey in 1959 at Olduvai Gorge, Tanzania. The skull was found alongside a type of stone tool known as Oldowan and it was claimed the species was the oldest known maker of stone tools. But because no hand fossils had been found, anthropologists couldn’t be sure that P. boisei had made them.

    Now, Leakey’s granddaughter Louise Leakey, a palaeontologist at Stony Brook University in New York, and her colleagues have reported the discovery of a partial skeleton of a P. boisei individual from a site near Lake Turkana, Kenya. It includes a pair of hands, a skull and some foot bones from an individual thought to be male.
    Compared with earlier hominin species, this hand has more human-like proportions and straighter fingers, says team member Carrie Mongle, also at Stony Brook University. The hand is pretty “similar in size to my own, but much more robust”, she says.
    It has features similar to both modern humans and gorillas: for example, the thumb and finger bones are proportioned much like those in our hands, whereas other parts of the hand have much more robust bones, indicating the strength of a gorilla.

    “Shaking hands with this individual would have been noticeably different than shaking hands with your average human,” says Mongle. “They would have been much stronger.”
    It is now known that earlier hominins made tools, meaning P. boisei wasn’t the earliest. But Mongle says there has always been hesitation to attribute any Oldowan tools to Paranthropus because the remains of Homo habilis, a closer relative of modern humans, are often found in the same area.
    “While we don’t have any tools from this new site, the hand shows Paranthropus boisei could have formed precision grips similar to ours,” she says.
    Julien Louys at Griffith University in Brisbane, Australia, says the find helps fill in some of the many holes in our knowledge of the anatomy of ancient hominins in general, and Paranthropus in particular. He says the most surprising aspect of the discovery wasn’t the similarity to human hands, but the parts of the fossil that were similar to gorilla hands.

    “Behaviourally, they must have had some parallels with gorillas, but obviously also retaining some of our own behaviours,” says Louys.
    “Getting a fairly complete hand of a hominin is incredibly rare. The evidence suggests that Paranthropus likely used tools, or at least had the right hardware to use tools, which isn’t too unexpected,” he says.
    “I’m sure some people will continue to argue that only our own genus was capable of making more sophisticated lithics [stone tools], and short of finding a hand of Paranthropus clutching an Oldowan artefact, the debate will likely continue,” says Louys.
    Even gorillas are known to bang rock-like objects together as tools, says Amy Mosig Way at the Australian Museum in Sydney, so it is no surprise that P. boisei appears to have been capable of this. The more important question is whether stone tool-making was part of their behavioural repertoire, she says.
    “What we can say at this stage is that it’s theoretically possible they did perform freehand percussion [hitting stones together],” she says.




    Who were the first humans to reach the British Isles?

    Homo heidelbergensis on the ancient banks of the river Thames in modern-day Swanscombe, UK (Natural History Museum, London/Science Photo Library)
    This is an extract from Our Human Story, our newsletter about the revolution in archaeology. Sign up to receive it in your inbox every month.
    When we think about tricky places for humans to live, our minds tend to go to the most extreme places: the Sahara, the high Arctic, the peaks of the Himalayas. The British Isles are not quite as inhospitable as these places, but they still represented a considerable challenge for ancient humans.

    This was brought home to me when I came across a study, published in September, of some of the earliest evidence of hominins living in Britain. The occupation it documents is truly ancient, over 700,000 years old. But this is comparatively recent when you consider how early hominins found their way out of Africa. These early explorers were quick to reach, for instance, Indonesia, but slow to reach Britain.
    Let’s put some concrete numbers on this. There were hominins living in Africa as early as 6 or 7 million years ago. Yet the oldest widely accepted evidence of hominins outside of Africa is from 1.8 million years ago, at Dmanisi in Georgia, where bones of Homo erectus have been found. These early members of our genus, it seems, were the first to wander more widely, ultimately reaching Java in Indonesia.
    Yet all the evidence of hominins in Britain is from the last million years. That’s a delay of hundreds of thousands of years.

    The delay may actually be even longer, because there are researchers who think hominins were living outside of Africa significantly earlier than this. At Xihoudu in China, stone tools were found in river gravels dated to 2.43 million years ago. Artefacts at Shangchen, on a Chinese plateau, were dated to 2.12 million years ago. In the last five years, I’ve written about stone tools from Jordan that may be over 2 million years old, and artefacts from India that are seemingly 2.6 million years old. All of these are disputed, the main issue being whether the objects are really human-made tools or just rocks that look like them after being bashed around by animals or carried down a fast-flowing river. But the examples are racking up, and I wouldn’t be surprised if something more definitive turns up in the near future.
    Either way, it seems it took our ancient relatives a while to settle in Britain.
    Goodbye blue skies
    Or maybe they got here really early, took one look, and turned back without leaving a trace. Britain’s climate may be mild in the sense that it rarely sees true extremes of heat or cold, but the gloominess and frequent rain are their own special kind of challenging.
    I vividly remember discussing the British climate with Nina Jablonski at Pennsylvania State University, who told me Britain has “a punishingly low and highly seasonal UV regime”. In other words, it’s incredibly cloudy. Unless you go into the far polar regions, where the sun doesn’t rise for months at a time, it’s hard to find anywhere that gets less sunlight.
    And that’s in today’s climate. There were times when it was colder. Ever since the beginning of the Pleistocene Epoch 2.58 million years ago, the climate has seesawed up and down, alternating between cold glacial periods and warmer interglacials. We’ve been in an interglacial for 11,700 years, but during the glacials the polar ice sheets expanded south and covered much of Britain.
    Our evidence of ancient humans in Britain has been primarily from the warmer interglacial periods – but the recent study changes that.
    It focuses on excavations at Old Park, next to the city of Canterbury in south-east England. In the 1920s, there was a quarry in Old Park called Fordwich Pit, where hundreds of stone tools were unearthed. Since 2020, Alastair Key at the University of Cambridge has been leading excavations in the area.
    In 2022, Key and his team published their initial findings, describing 112 artefacts that came from levels known to be at least 513,000-570,000 years old. My colleague Jason Arunn Murugesu wrote about this at the time, noting the artefacts were “the oldest of their kind known from the UK and among the earliest known in Europe”.
    Three years on, Key’s team has expanded the excavation and discovered even older sediments containing stone artefacts. Hominins seem to have been there between 773,000 and 607,000 years ago.
    For context, there was a warm interglacial between about 715,000 and 675,000 years ago. Before and after that, the climate turned cold.
    The team also found two more recent layers containing artefacts, which they dated to 542,000 and 437,000 years ago. Both fall smack into cold glacials.
    The implication is that hominins occupied and reoccupied Old Park several times, including during the glacial periods when the British climate was at its harshest.

    Ancient footprints discovered at Happisburgh in the UK (Simon Parfitt)
    Into the north
    Let’s put this into a wider context. Old Park isn’t quite the oldest evidence of hominins in the British Isles, although it’s close. And the oldest-known evidence isn’t actually there anymore.
    In 2013, researchers walking along a beach at Happisburgh in east England came upon 49 footprints. They had been preserved in layers of silt, which had been revealed by severe erosion. The footprints washed away within weeks, but archaeologists were able to document them and show that they were between 850,000 and 950,000 years old.
    Happisburgh has also yielded stone tools from over 780,000 years ago, and the nearby site of Pakefield had stone tools that are about 700,000 years old. However, the oldest-known hominin bones – as opposed to artefacts – are from Boxgrove in south-east England and are a mere 500,000 years old.
    Of course, these sites are only a sample, because the archaeological record is incomplete. In 2023, Key and his colleague Nick Ashton estimated, based on the fragmentary nature of the record, that hominins may have been in northern Europe as early as 1.16 million years ago. Given the new evidence from Old Park, perhaps that date could be pushed back a bit further.
    And this is where the mystery comes in: who were these ancient humans that managed to survive in Britain’s frequently dismal climate?
    Given that Homo erectus seem to have been the first hominins to leave Africa, we might assume it was them. But there is hardly any evidence of them in Europe. There are stone tools from Korolevo in Ukraine from 1.4 million years ago, but no hominin bones. Likewise, in March I reported on the discovery of some fragmentary face bones from a cave in northern Spain, dated to 1.1-1.4 million years ago. Their discoverers tentatively called them “Homo aff. erectus” – which means they might be H. erectus, but it’s not possible to be confident.
    Northern Spain was also home to another species called Homo antecessor. They are known from one cave, and seem to have been around between 772,000 and 949,000 years ago.
    Meanwhile, the Boxgrove hominins may have belonged to a different species called Homo heidelbergensis. Their status is a little tricky: they seem to have lived in Europe a few hundred thousand years ago, but there aren’t many remains that are unambiguously assigned to the species.
    Quite how these species are related to each other, and to us and other later groups like the Neanderthals, is honestly anyone’s guess. As a result, the earliest Britons are still hidden from us behind a thick bank of fog. Which seems appropriate.



    We’re finally reading the secrets of Herculaneum’s lost library

    Joe Wilson
    Deep within a particle accelerator, theoretical physicist Giorgio Angelotti is hard at work. He sets a black cylinder on a mount, bolts it down, then runs through some safety checks before retreating from the chamber, known as “the hatch”. “You have to be sure there’s no one in the hatch before you close the door,” he says. “So no one dies.”
    That’s because he is about to blast the sample with a super-powerful beam of X-rays. You might expect the target to be some advanced new material or delicate crystal. But, at its heart, this isn’t really a physics experiment – and the object protected inside the cylinder is far from pristine. You could easily mistake it for a misshapen lump of old charcoal.
    It is in fact a priceless relic, a 2000-year-old papyrus scroll, scorched beyond recognition in the cataclysmic eruption of Vesuvius in AD 79. It is just one of the Herculaneum papyri, a cache of hundreds of scrolls that are too fragile to be opened by hand, meaning their contents have long remained a mystery. But with the help of particle accelerators, artificial intelligence and a crack team of coders assembled online, Angelotti and his team are starting to make these charred lumps talk. They could soon be uncovering entire lost works of Greek philosophy, or texts written by the earliest Christians.

    Discovered near Angelotti’s home city of Naples, Italy, in the 1750s, the scrolls come from the library of a partly excavated, 1st-century-BC villa in Herculaneum. The town, a smaller neighbour of Pompeii, was once a seaside holiday destination for rich Romans. The luxurious villa is thought to have been owned by Roman senator Lucius Calpurnius Piso Caesoninus – none other than Julius Caesar’s father-in-law.
    At least some of the 900 scrolls originally discovered were authored by the philosopher Philodemus of Gadara, one of those credited with bringing Epicurean philosophy from Greece to Italy. Classicist David Blank at the University of California, Los Angeles, explains that Philodemus had joined Piso’s entourage, a cohort whose intellectual prowess publicly signalled the senator’s importance. In turn, Piso became a patron of Philodemus’s work, ensuring that a lot of his philosophical writings, including unique early drafts, ended up in Piso’s personal collection.

    The Herculaneum papyri
    Piso and Philodemus had been dead for decades when Mount Vesuvius blew, but the library remained. As hot mud and ash engulfed Herculaneum, heat dehydrated the scrolls, not burning them, but turning them to charcoal. “The fact they are carbonised is the only reason we have them,” says papyrologist Federica Nicolardi at the University of Naples Federico II. Papyrus normally survives only in very dry climates. Other European examples rotted away centuries ago.
    The Piso collection has since dwindled, however. The papyrus layers are tightly stuck together and early attempts to unwrap them resulted in a great many being mashed, sliced, peeled and otherwise processed in ways papyrologists would rather save for potatoes. Starting in the 1750s, the scrolls’ first curator, a man named Camillo Paderni, bashed out their insides to leave just the exterior layers. “He would take the roll, cut through it… then take the butt end of his knife and pound the middle of the roll into dust,” says Blank.
    The Herculaneum papyri were turned to charcoal in the AD 79 eruption of Mount Vesuvius. This one is known simply as “scroll 2” (The Digital Restoration Initiative, The University of Kentucky)
    A little later, Antonio Piaggio, a manuscript restorer from the Vatican Library, subjected some of the scrolls to a homemade machine. By mounting each scroll and sticking the end of the papyrus to a sheet of animal guts using glue made from fish, he was able to carefully unroll about 18 of them. These early abuses did yield several volumes’ worth of readable texts. This is how we know that at least some of the scrolls were authored by Philodemus. But most of the charcoal lumps languished unread in the National Library in Naples.
    And that was how things stood for centuries, until Brent Seales at the University of Kentucky entered the frame. Seales had lived through the early wave of digitisation, when the internet was becoming a repository for knowledge of all kinds. He wasn’t much interested in the mass scanning of ordinary books, but he became gripped by the notion that parts of this global library might be left out due to damage to the physical works. “The idea that technology could create a representation of, or even extract new information from, the damaged stuff – that really appealed to me,” he says.

    In 2000, Seales used 3D scanning and computer software to digitally uncrumple and flatten pages from fire-damaged medieval documents amassed by Sir Robert Cotton, part of the founding collection of the British Library. Some books in the trove, however, were too fragile to be opened, so couldn’t be restored using standard imaging techniques, which are based on visible light. Seales began to wonder whether the same methods we use to see inside bodies could be used to see inside books.
    The first time he fired X-rays at a book from the Cotton collection, the ink showed up much like bones do in the black and white images, he says. Immediately, he wanted to get his hands on other collections containing unopened texts, and his thoughts turned to the most famous example he knew of: the Dead Sea Scrolls. But when Seales described his plan to conservators, he was met with a “hell no”. Meanwhile, the Herculaneum scrolls came onto his radar, courtesy of a tip-off from classicist Richard Janko at the University of Michigan, who had studied the contents of some of the physically opened scrolls.
    These particular papyri, though, presented some special challenges. For one thing, unlike medieval writers, who used metallic inks, Philodemus and his contemporaries often wrote in soot-based ink. That meant the challenge was to discern an ink made mostly of carbon from a scroll that was also now mostly carbon. It wasn’t exactly easy. Sure enough, Seales failed to find any ink in initial attempts with a small CT scanner in 2009.
    Herculaneum was once a holiday destination for wealthy Roman citizens (CCinar/Shutterstock)
    Many Hebrew and Egyptian scribes used easier-to-image metallic inks. By 2015, Seales was able to read unseen text inside a charred 4th-century-AD Hebrew scroll. And not long after, a European team including Verena Lepper at Berlin’s Egyptian Museum and Papyrus Collection used X-ray-based scans to read the words “oh Lord” inside an ancient papyrus package from the island of Elephantine on the Nile river. But scans from inside the Herculaneum scrolls still hadn’t revealed a single word.
    The digital unwrapping process wasn’t straightforward, either. The papyrus layers are so jammed together that it is tricky to peel them apart, even virtually. If the software doesn’t know the difference between one layer and the next, Nicolardi explains, “you produce something that’s actually very similar to what happens with the mechanically opened scrolls”. Pieces of text get spliced between layers, mangling the narrative.
    By then, though, AI was on the rise and machines were starting to pick out features that humans couldn’t. It turned out that scans of the Herculaneum papyri were, in fact, picking up ink, but it was visible only to properly configured AI. Seales and his colleagues finally demonstrated this on unrolled Herculaneum fragments and fake scrolls inscribed with carbon ink in 2019. That was enough to help secure them use of the particle accelerator at Diamond Light Source near Oxford, UK. He used it as a supercharged CT scanner and obtained images of the insides of rolled-up, intact papyri. But still the scrolls taunted them. Seales’s student Stephen Parsons taught AI software to spot ink on these high-resolution scans, but it struggled to see anything beyond mere traces.
    That was when things changed decisively. Seales had connected with tech investor Nat Friedman, previously CEO of GitHub, hoping to pitch for more research funding. But Friedman had a different idea: put out a public challenge to see if anyone could write a program that could read the scrolls. Seales initially struggled with the proposal. This kind of cash-for-code challenge might be commonplace in the tech world, but for academic researchers it was unfamiliar territory – and it meant opening the scan data and Parsons’s algorithms to a wider community. “It wasn’t an obvious right move for me,” says Seales. “But we realised the only reason we were balking at the idea is that we might not get all the credit, and that was a really bad reason.”
    The Vesuvius Challenge
    And so, in March 2023, the Vesuvius Challenge was born. Any prize-winning solutions would become public, the code released for the team or others to build on, in the hope that this would speed things up a bit. And so it proved: by Christmas, the challenge’s Discord channel had more than 1000 users.
    Angelotti was one of them. Fresh from a doctorate in AI, he had barely heard of the Herculaneum scrolls, despite being born and bred in Naples. But the more he learned about them, the more they intrigued him. Between consultancy work and founding an AI start-up, he pored over digitised papyrus sheets online. As he knew nothing about papyrology, it was a steep learning curve, but it turned out to be time well spent, resulting in cash prizes including $20,000 for work to speed up image processing – and a job offer. Now the research project lead for the Vesuvius Challenge, Angelotti says reading the scrolls has become “a sort of quest to restore the cultural heritage of my homeland”.

    Meanwhile, students began to steal the limelight. In December 2023, ink-detection algorithms developed by Youssef Nader and Luke Farritor helped reveal around 2000 Greek characters. Nader taught AI to see ink by carefully training it on broken-off scroll fragments where the papyrus surface was already exposed. At the same time, Farritor was picking out the first word, porphyras (purple), from inside an unopened scroll by using a separate AI model trained on sections where a faint, but just visible, “crackle” pattern seemed to be associated with the inked parts.
    By pooling their code and working with Julian Schilliger, a student at ETH Zürich in Switzerland who had been successfully stitching digital papyrus sheets together from pixels, they were able to get better results, not to mention a nod in a peer-reviewed papyrology paper. The translated text uncovered ancient musings on food, music and pleasure, in which the author seemed to ponder the timeless question of what makes life worth living.
    Their efforts won them the Vesuvius Challenge’s $700,000 grand prize in 2023 – and, for Nader, a Mount Vesuvius cake (complete with scroll) baked by his family in Egypt. He, too, has since joined the challenge team, continuing to work on ink detection. This is far from a fully solved problem, because the ink varies from one scroll to another. In the long term, the team aims to build a fast, general ink-detection software that works for everything. “So that we can, at some point, just upload a scan of a scroll and download the text,” says Nader.
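    Stripped to its essentials, this kind of ink detection is supervised learning: patches of X-ray scan where the ground truth is known – for instance, on detached fragments whose surface writing is visible – are labelled “ink” or “no ink” and used to train a model that is then applied to unopened layers. The sketch below is a stand-in rather than the challenge teams’ code: random arrays take the place of real CT patches, and a simple logistic regression takes the place of their deep networks.

```python
# A minimal, illustrative sketch of supervised ink detection: train a classifier
# on labelled scan patches, then score unseen patches. Everything here is a
# stand-in; random arrays replace real CT data and the labels are synthetic.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend each training example is a flattened 8x8x8 patch of X-ray intensities.
n_samples, patch_voxels = 2000, 8 * 8 * 8
X = rng.normal(size=(n_samples, patch_voxels))
y = rng.integers(0, 2, size=n_samples)  # 1 = ink visible on the exposed fragment
# Give "ink" patches a faint systematic shift so the toy model has something to learn.
X[y == 1] += 0.05

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy on the toy data:", model.score(X_test, y_test))

# In use, the trained model would score every patch of an unopened scroll's scan,
# producing an "ink probability" image that papyrologists can then try to read.
new_patch = rng.normal(size=(1, patch_voxels))
print("ink probability for one unseen patch:", model.predict_proba(new_patch)[0, 1])
```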
    Students Youssef Nader, Luke Farritor and Julian Schilliger produced this prize-winning image of the text inside one of the scrolls (Vesuvius Challenge)
    The unrolling problem hasn’t been completely solved yet, either. Initially, the inked surfaces of the papyrus layers were painstakingly mapped to flattened sections of digital papyrus by humans. But, with help from community members like Schilliger, the team is now increasingly able to get AI to do the task, which should yield faster results.
    Could solutions to these problems help researchers read other ancient papyri too? “I don’t think there’s one solution and there doesn’t need to be,” says Lepper, whose work on the Elephantine papyri used more traditional, non-AI software. Each collection has its quirks, she explains. Elephantine papyri, for example, aren’t charred, but many are folded instead of rolled, which can make unwrapping them more complex.
    Revealing hidden text in ancient manuscripts is no trivial task. But for the Vesuvius Challenge, at least, progress continues to accelerate “as a direct result of the contest”, says Seales, his initial reservations now seemingly forgotten. Both Seales and Angelotti are optimistic that there will come a time when it is as easy as pressing a button and letting the software do the rest. Right now, though, there are still plenty of scrolls left to scan, meaning more time spent kicking around in the control rooms of particle accelerators.
    When New Scientist spoke to Angelotti in mid-July, he had just finished scanning more than 30 Herculaneum scrolls at Diamond Light Source and the European Synchrotron Radiation Facility, the particle accelerator in Grenoble, France, with “the hatch”. He had also been carrying out crucial experimental work, the early results of which suggest that scanning at a higher resolution may help AI see features common to ink across all the scrolls. If so, the whole collection could become imminently readable. The only problem, Angelotti groans, is that it would mean the scans take about six times longer than usual – so more hours to kill in a control room.

    Meanwhile, the Vesuvius Challenge team has been preparing to release more data to its community of coders, and successes have continued to mount up. In May 2025, computer science graduates Marcel Roth and Micha Nowak at the University of Würzburg in Germany adapted medical-imaging software to read the first-ever title from within the scrolls, winning themselves $60,000. Roth says the pair got hooked on the contest, at one point skipping university for nearly three months.
    And the title? Philodemus, On Vices. “We were all very happy to see it was really Philodemus,” says Angelotti, because it confirmed the AI wasn’t hallucinating. It is unlikely to be the last we hear from Philodemus, either, because most of the scrolls read so far seem to come from the philosophy section of Piso’s vast library.
    Back in the Bay of Naples, there could be many more scrolls still to excavate. After all, part of the villa remains unexplored, obstructed by 20 metres of volcano spew and messy local politics. The New Testament puts Paul the Apostle on the scene around AD 50, before his execution about a decade and a half later. Could his movements have been recorded before Vesuvius’s eruption? Perhaps, “if the Herculaneum library had a current events section,” quips Seales. Until recently, of course, there wouldn’t have been much point in looking for such long-lost treasures, since we couldn’t unlock their contents. But now that we can, there’s a good argument for getting out the shovels.



    ‘Pregnancy test’ for skeletons could help reveal ancient mothers

    The skeleton of a woman cradling a baby in her left arm, buried at an Anglo-Saxon cemetery in Scremby, UK (Dr Hugh Willmott, University of Sheffield)
    Scientists are homing in on a pregnancy test for women who lived hundreds or even thousands of years ago.
    For the first time, researchers have detected levels of oestrogen, progesterone and testosterone in the skeletal remains of women from the 1st to the 19th century AD – some of whom were buried with fetuses. The findings show that ancient bones and teeth preserve clear traces of certain sex hormones, which could help identify which individuals in archaeological sites were pregnant or had just given birth at their time of death, says Aimée Barlow at the University of Sheffield in the UK.

    “The physiological and emotional experience of pregnancy and pregnancy loss and childbirth are very profound for women, but so far, they’ve largely remained invisible in the archaeological record,” she says. “This method has the potential to revolutionise the way we study reproductive histories of past populations. I’m thrilled, to be honest.”
    Pregnancy is difficult to see in ancient individuals, especially if the fetus didn’t have a visible skeleton yet. Even fetuses in the second and third trimester can be overlooked, since their bones can resemble the bones of the mother’s hands – which are often placed over the abdomen for burial.
    Modern pregnancy tests measure levels of hormones like hCG in blood or urine. But hCG quickly breaks down, leaving little trace of its presence in the body.

    Progesterone, oestrogen and testosterone, however, can linger in tissues longer. Recent research shows that these steroid hormones can be found in people’s blood, saliva and hair – even in long-buried strands from Egyptian mummies.
    To assess the potential for detecting ancient pregnancies, Barlow and her colleagues sampled rib fragments and one neck bone from two men and seven women buried in four English cemeteries. They also sampled the people’s teeth, along with those of a third man.
    Two of the women had confirmed fetal remains in their abdomens, and two others were buried with newborn babies. The sexes of the other people had been determined by DNA analysis.
    The team ground each sample into a powder and used chemicals and other techniques to isolate any steroid hormones. Laboratory testing then determined how much oestrogen, progesterone and testosterone each of the 74 samples contained.
    Oestrogen only showed up in four samples, with no clear pattern – possibly because it breaks down more quickly than progesterone and testosterone, and might not store well in tissues.
    Progesterone, however, was especially high in the vertebra of a young woman who died between the 11th and 14th centuries while carrying a full-term fetus. The other third-trimester woman, buried in the 18th or 19th century, had elevated progesterone in her rib. Moderate progesterone levels also appeared in the dental plaque of the two women buried with babies in the 5th or 6th century.

    Notably, these four women had no traces of testosterone whatsoever in their bones, nor in any part of their teeth – although one buried with a premature baby had a small amount in her plaque. By contrast, the three women not associated with fetuses or infants, who were buried in an 8th-to-12th century cemetery and a Roman-era grave, had testosterone in their ribs and in all layers of their teeth.
    Testosterone at low levels plays important roles in women’s health, so its presence in those samples isn’t surprising, says Barlow. “But perhaps the absence of testosterone indicates a recent or current pregnancy at the time of death,” she says.
    “This is an exciting and unexpected intersection of archaeology with hormone science,” says Alexander Comninos at Imperial College London. “These techniques could be used to detect pregnancy in skeletal remains more reliably and so give us more accurate insights into ancient pregnancy.”
    Even so, while the results are promising, further research must iron out the details, says Barlow. Men’s bones and inner teeth often showed moderate levels of progesterone, for example, for reasons yet to be understood, she says. “The interpretations are very cautious at the moment.”



    Evolution of intelligence in our ancestors may have come at a cost

    A model of Homo heidelbergensis, which might have been the direct ancestor of Homo sapiens (WHPics/Alamy)
    A timeline of genetic changes in millions of years of human evolution shows that variants linked to higher intelligence appeared most rapidly around 500,000 years ago, and were closely followed by mutations that made us more prone to mental illness.
    The findings suggest a “trade-off” in brain evolution between intelligence and psychiatric issues, says Ilan Libedinsky at the Center for Neurogenomics and Cognitive Research in Amsterdam, the Netherlands.

    “Mutations related to psychiatric disorders apparently involve part of the genome that also involves intelligence. So there’s an overlap there,” says Libedinsky. “[The advances in cognition] may have come at the price of making our brains more vulnerable to mental disorders.”
    Humans split from our closest living relatives – chimpanzees and bonobos – more than 5 million years ago, and our brains have tripled in size since then, with the fastest growth over the past 2 million years.
    While fossils allow scientists to study such changes in brain size and shape, they can’t reveal much about what those brains were capable of doing.

    Recently, however, genome-wide association studies have examined many people’s DNA to determine which mutations are correlated with traits like intelligence, brain size, height and various kinds of illnesses. Meanwhile, other teams have been analysing specific aspects of mutations that hint at their age, providing estimates of when those variants first appeared.
    Libedinsky and his colleagues pulled both methods together for the first time, to create an evolutionary timeline of humans’ brain-related genetics.
    “We don’t have any trace of the cognition of our ancestors with regard to their behaviour and their mental issues – you can’t find those in the palaeontological records,” he says. “We wanted to see if we could build some sort of ‘time machine’ with our genome to figure this out.”
    The team investigated the evolutionary origins of 33,000 genetic variants found in modern humans that have been linked to a wide variety of traits, including brain structure and various measures of cognition and psychiatric conditions, as well as physical and health-related features like eye shape and cancer. Most of these genetic mutations only show weak associations with a trait, says Libedinsky. “The links can be useful starting points, but they’re far from deterministic.”
    They found that most of these genetic variants emerged between about 3 million and 4000 years ago, with an explosion of new ones in the past 60,000 years – around the time Homo sapiens made a major migration out of Africa.
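    Conceptually, the underlying analysis is a join between two kinds of table: variants linked to traits by genome-wide association studies, and estimates of each variant’s age. The sketch below shows that join on invented placeholder values (the variant IDs, traits and ages are not from the study, though they are chosen to echo the averages quoted here), with the mean age of the variants linked to each trait giving the kind of timeline described above.

```python
# A hedged sketch of joining GWAS trait associations with allele-age estimates
# and averaging the ages per trait. All values below are invented placeholders.

import pandas as pd

gwas_hits = pd.DataFrame({
    "variant": ["rs1", "rs2", "rs3", "rs4", "rs5"],
    "trait":   ["fluid intelligence", "fluid intelligence",
                "psychiatric disorder", "psychiatric disorder", "metabolic"],
})

allele_ages = pd.DataFrame({
    "variant": ["rs1", "rs2", "rs3", "rs4", "rs5"],
    "age_years": [520_000, 480_000, 470_000, 480_000, 790_000],  # invented ages
})

# Merge the two tables, then average the estimated ages of the variants per trait.
timeline = (
    gwas_hits.merge(allele_ages, on="variant")
             .groupby("trait")["age_years"]
             .mean()
             .sort_values(ascending=False)
)
print(timeline)  # average estimated age of trait-associated variants, oldest first
```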

    Variants linked to more advanced cognitive abilities evolved relatively recently compared with those for other traits, says Libedinsky. For example, those related to fluid intelligence – essentially logical problem-solving in new situations – appeared about 500,000 years ago on average. That’s about 90,000 years later than variants associated with cancer, and nearly 300,000 years after those related to metabolic functions and disorders. Those intelligence-linked variants were closely followed by variants linked to psychiatric problems, around 475,000 years ago on average.
    That trend repeated itself starting around 300,000 years ago, when many of the variants influencing the shape of the cortex – the brain’s outer layer responsible for higher-order cognition – appeared. In the past 50,000 years, numerous variants tied to language evolved, and these were closely followed by variants linked to alcohol addiction and depression.
    “Mutations related to the very basic structure of the nervous system come a little bit before the mutations for cognition or intelligence, which makes sense, since you have to develop your brain first for higher intelligence to emerge,” says Libedinsky. “And then the mutation for intelligence comes before psychiatric disorders, which also makes sense. First you need to be intelligent and have language before you can have dysfunctions on these capabilities.”

    The dates also line up with evidence suggesting that Homo sapiens acquired some of the variants linked to alcohol consumption and mood disorders from interbreeding events with Neanderthals, he adds.
    Why evolution hasn’t weeded out the variants that predispose people to psychiatric conditions isn’t clear, but it might be because the effects are modest and may confer advantages in some contexts, says Libedinsky.
    “This kind of work is exciting because it allows scientists to revisit longstanding questions in human evolution, testing hypotheses in a concrete way using real-world data gleaned from our genomes,” says Simon Fisher at the Max Planck Institute for Psycholinguistics in Nijmegen, the Netherlands.
    Even so, this kind of study can only examine genetic sites that still vary among living humans – meaning it misses older, now-universal changes that may have been key to our evolution, Fisher adds. Developing tools to probe “fixed” regions could offer deeper insight into what truly makes us human, he says.
