More stories

  •

    European wine grapes have their genetic roots in western Asia

    We used to think that European wine grapes were cultivated locally, independently of grape domestication in western Asia, but grape genetics suggests otherwise

    Humans

    21 December 2021

    By Carissa Wong
    Red grapes ready to be harvested in a vineyard (alika/Shutterstock)
    Grapes used to make common European wines may have originated from grapevines that were first domesticated in the South Caucasus region of western Asia. As these domesticated grapes dispersed westwards during Greek and Roman times, they interbred with local European wild populations, which helped the wine grapes adapt to different European climates.
    The origins of grapes (Vitis vinifera) that are used in Europe and elsewhere to produce wines such as Merlot, Chardonnay and Pinot Noir have long been debated.
    It has been proposed that European wine grapes arose from the cultivation of wild European populations (V. vinifera subspecies sylvestris), independently of the original domestication of grapes in western Asia around 7000 years ago.
    But a genetic analysis carried out by Gabriele Di Gaspero at the Institute of Applied Genomics in Udine, Italy, and his colleagues suggests that European wine grapes actually originated from domesticated grapes (V. vinifera subspecies sativa) that were initially grown for consumption as fresh fruit in western Asia.
    The team sequenced the genomes of 204 wild and cultivated grape varieties – to cover the range of genetic diversity in cultivated grapes – and compared how similar their genetic sequences were to one another.
    This revealed that as western Asian table grapes spread westwards across the Mediterranean and further inland into Europe, they interbred with wild European grape populations that grew nearby.

    “The wild plants grew close to vineyards and interbred – this was unintentional. But the results of the breeding created adaptive traits that were likely selected by humans intentionally,” says Di Gaspero. “By bringing together this genetic evidence and existing historical evidence, the introductions in southern Europe and inland likely occurred in Greek and Roman times, although we don’t know more specific dates.”
    By modelling how the ancestry of the grapes in different regions of Europe related to aspects of the local climate such as temperature and precipitation, the team discovered that European wild grapes probably contributed traits that enabled the ancestral grape vines to adapt to different regions as they moved westwards from Asia.
    The team also found evidence of the effect that domestication had on grape genetics.
    In wild grape varieties, a larger seed makes a larger berry because grape seeds produce a growth hormone called ethylene. But for human consumption, a larger berry-to-seed ratio is desirable. The team found that an enzyme not found in the berries of wild varieties was present in the berries of domesticated varieties. In other plants, the enzyme is known to help berries grow in response to ethylene, which suggests it does the same in grapes.
    Understanding which genes encode favourable traits in grapes can allow us to grow better grape crops, says Di Gaspero.
    Journal reference: Nature Communications, DOI: 10.1038/s41467-021-27487-y


  •

    Higher US welfare benefits seem to protect children's brains

    The size of a child’s hippocampus can be limited by stress, and US state welfare schemes that give families $500 a month or more are linked to a reduction in this association

    Humans

    20 December 2021

    By Jason Arunn Murugesu
    Illustration of the hippocampus in a child’s brain (Science Photo Library / Alamy)
    Higher payments from US welfare schemes can reduce the impact that living in a low-income household has on the size of a crucial region of a child’s brain.
    David Weissman at Harvard University in Massachusetts and his colleagues analysed images of the brains of more than 11,000 children aged 9 and 10 in the US, looking specifically at the size of each child’s hippocampus.
    “The hippocampus is a brain region involved in learning and memory,” says Weissman. Its development is believed to be impaired by excess stress, which can be caused by growing up in poverty, he says.
    “Prior studies show that kids with small hippocampal volumes are more likely to develop internalising problems [such as anxiety and social withdrawal] and develop depression,” he says.
    The children came from 17 states, and while they aren’t wholly representative of the US population they are “pretty close”, according to Weissman. The data set is slightly skewed towards more urban areas because the imaging can only be done in places that have available neuroimaging equipment and related expertise.
    Weissman and his team looked at whether a state had expanded Medicaid, a federally subsidised healthcare scheme, in 2017. That year, states had to choose whether to begin covering a portion of the services that were previously fully covered by the federal government. Just over 7500 of the children involved lived in states that expanded Medicaid.
    Then the researchers analysed the average amount of welfare benefits people in each state received under various anti-poverty schemes. The higher this total, the more generous they considered the state’s benefits system. “It’s a rough estimate, but it works,” says Weissman.

    Combining this with the brain imaging data revealed that children in families that received fewer welfare benefits from their state had a smaller hippocampus than average. This link was stronger in states with a high cost of living.
    The team found there was a 37 per cent reduction in the association between lower family incomes and a smaller hippocampus in states that gave each family receiving welfare payments on average $500 a month or more, compared with those that gave less than $500 a month.
    This association between lower family incomes and a smaller hippocampus was also reduced by 19 per cent in states that had expanded Medicaid compared with those that hadn’t.
    Weissman says the results aren’t surprising, but it is still “shocking” to see how major government policy decisions have an actual effect on the brain.
    “If your goal is to have a fairer society where this doesn’t happen, then you should be pushing for policies that give more cash benefits to poorer families,” he says.
    “I think that this finding is tremendously important,” says Jane Barlow at the University of Oxford. “The research [in this field] now clearly shows social adversity can become biologically embedded during the prenatal period and early years of a child’s life as a result of the way in which they impact the neurological development of the child.”
    Reference: PsyArXiv, DOI: 10.31234/osf.io/8nhej


  •

    People occupied the Faroe Islands 300 years earlier than we thought

    By Chen Ly
    The Faroe Islands (dataichi – Simon Dubreuil/Getty Images)
    People arrived on the Faroe Islands – a North Atlantic archipelago between Iceland, Norway and the British Isles – earlier than we thought, predating the arrival of Norse Vikings by about 300 years.
    The earliest direct evidence of human settlement on the Faroe Islands dates back to the arrival of the Vikings in around AD 800. But charred barley grains and cereal grain pollen dating back to around AD 500 indirectly hint that farming must have existed on the islands before the Vikings arrived.
    Now, William D’Andrea at …

  •

    Languages could go extinct at a rate of one per month this century

    By Christa Lesté-Lasserre
    Researchers Lindell Bromham and Xia Hua analyse data on the Gurindji language (Jamie Kidston/ANU)
    Denser road networks, higher levels of education and even climate change are just a few of the factors that could lead to the loss of more than 20 per cent of the world’s 7000 languages by the end of the century – equivalent to one language vanishing per month.
    Using a new model similar to those used for predicting species loss, a team of biologists, mathematicians and linguists led by Lindell Bromham at the Australian National University in Canberra has determined that, without effective conservation, language loss will increase five-fold by 2100.
    “This is a frightening statistic,” says Bromham, adding that her team’s estimates are “conservative”.
    “Every time a language is lost, we lose so much,” she says. “We lose a rich source of cultural information; we lose a unique and beautiful expression of human creativity.”
    Current language loss estimates vary considerably, with some predicting that up to 90 per cent of languages might no longer be spoken at the start of the next century.
    Bromham, an evolutionary biologist, and her colleagues suspected that by borrowing modelling techniques from studies on biodiversity loss, they might be able to capture a more statistically sound view of language diversity loss.

    They analysed 6511 languages that are still spoken or have ceased to be spoken – known as “sleeping” languages. They compared the languages’ endangerment status – based on which generations continue to learn and speak the language – with 51 variables related to the likes of legal recognition of the language, demographics, education policies, environmental features and socioeconomic indicators.
    They found that having other languages nearby isn’t a risk factor for language loss. In fact, says Bromham, many communities become multilingual when in proximity to other languages.
    On the other hand, their study suggested that being geographically isolated – living in a valley among high mountains on an island, for instance – doesn’t make people more likely to hold on to their language.
    Denser road networks were associated with higher levels of language loss on a global scale, says Bromham. That could be attributed to the fact that roads increase the level of commuting between rural areas and larger towns, leading to a greater influence of commerce and centralised government and the languages associated with them.

    Higher levels of education were also linked to greater loss of local language across the globe, says Bromham.
    “This is a very worrying result,” she says. “But I want to emphasise that we are not saying education is bad or that kids shouldn’t go to school. Rather, we’re saying that we need to make sure bilingualism is supported, so that children get the benefit of education without the cost to their own Indigenous language competency.”
    Marybeth Nevins, a linguist and anthropologist at Middlebury College in Vermont who wasn’t involved in the study, finds it “both troubling and understandable that schooling would predict endangerment”.
    “Schooling establishes a whole new set of practices designed to orient the student to the historically encroaching institutions,” says Nevins.
    While 20th century schools were based on single language learning, modern digital technology allows for multilingualism in government institutions, including schools, she says. “With adequate Indigenous language resources, [schooling] need not lead to endangerment.”
    The researchers also detected risk factors on a regional level, says Bromham. For example, larger pasture areas were associated with more language loss in parts of Africa, while in Europe, increased temperature seasonality was linked to greater endangerment, reflecting “language erosion” in parts of Scandinavia. More studies are needed to understand these connections, however, she adds.
    Holding onto local languages is critical, Nevins says, as it represents a way to maintain the history and culture of Indigenous people who were “forcibly incorporated into the capitalist world system”.
    “Language is a kind of proof of ancestral life, a powerful resource against political erasure, a means of reclamation,” she says. “For all of us, Indigenous languages are indispensable to understanding the nature, diversity and historic spread of human beings on our shared planet.”
    Journal reference: Nature Ecology & Evolution, DOI: 10.1038/s41559-021-01604-y


  •

    Neanderthals may have cleared a European forest with fire or tools

    When Neanderthals lived at a site called Neumark-Nord in Germany, the region had far fewer trees than surrounding areas, suggesting they may have cleared the forest on purpose

    Humans

    15 December 2021

    A lakeside archaeological site at Neumark-Nord in Germany (Wil Roebroeks, Leiden University)
    Neanderthals may have reshaped part of the European landscape 125,000 years ago, clearing trees to create a more open environment in which to live. It is the oldest evidence of a hominin having landscape-level effects.
    The indications come from an archaeological site called Neumark-Nord in Germany. About 130,000 years ago, great ice sheets retreated, making Neumark-Nord liveable until the ice advanced again 115,000 years ago. During that 15,000-year warm spell, Neanderthals moved into the area, perhaps attracted by a series of lakes in the region.
    Neanderthals lived throughout Europe for hundreds of thousands of years, so it seems likely that they had impacts on the environment, says Katharine MacDonald at Leiden University in the Netherlands. “We knew that they were effective hunters, so they were clearly occupying a niche where they could compete with the other carnivores around quite effectively.”
    MacDonald and her colleagues compiled data from the warm period on the different plant species preserved at the site, as well as charcoal deposits left by fires. The team found a decrease in tree cover compared with neighbouring sites where Neanderthals didn’t live. While neighbouring areas were densely forested, Neumark-Nord “would have been a lot more light and open, and probably more varied as well”, says MacDonald.
    Modern humans have altered landscapes in similar ways, but the evidence is largely limited to the past 50,000 years. “It’s the first case where it’s been shown for Neanderthals,” says MacDonald.

    It isn’t clear how this happened. There is a peak in charcoal around when Neanderthals arrived, so “it’s really tempting to imagine that that might have been Neanderthals burning the vegetation”, says MacDonald. But she says the dates can’t be resolved precisely enough, so it could be that a natural wildfire opened up the vegetation and Neanderthals arrived in the aftermath.
    We also know that Neanderthals made advanced stone tools and that they used them to chop wood. “But I don’t know that there’s any direct evidence for actually cutting down a tree,” says MacDonald.
    Compared with other Neanderthal sites, Neumark-Nord seems to have been settled relatively permanently, perhaps even all year round. Neanderthals aren’t known for doing that, says MacDonald. “They are often seen as being quite mobile, and this is quite an unusual site.”
    It may be that the open landscape, coupled with the lakes, attracted a lot of large animals for them to hunt – so they had no need to move, she says.
    Journal reference: Science Advances, DOI: 10.1126/sciadv.abj5567


  •

    In 2021, we made real progress in fighting covid-19 and climate change

    Reinhard Dirscherl/ullstein bild via Getty Images
    “A YEAR of tackling great challenges.” In the title of our review of the year, “tackling” is the operative word. Two great challenges have dominated the past 12 months: the ongoing covid-19 pandemic, and efforts to address climate change, as embodied by the COP26 summit held in Glasgow, UK, in November. Both have seen significant progress – but only the most irrational optimist could claim that what we have achieved so far amounts to solutions.
    Our retrospective leader of 2020 was devoted to the promise that vaccines might bring a swift end to covid-19. At the time, more than 70 …

  •

    How climate change is shaking up the hops that give beer its flavour

    Hop plants are largely what distinguish your dark ales from your refreshing pales, and each has its own “terroir”. With changing weather affecting how and where they grow, what does the future hold for brewing and beer?

    Humans

    15 December 2021

    By Chris Simms
    Wicked weed: freshly harvested hop flowers (Jean/Stockimo/Alamy)
    WATER, malted barley and hops. It is the classic recipe for the world’s favourite intoxicant. According to a law declared in 1516 in the German state of Bavaria, a place that likes to see itself as beer’s spiritual home, those are the only three ingredients it may contain – the yeast that converts the sugars in the barley to alcohol being out of sight and out of mind back then.
    Today’s craft beer revolution takes such strictures less seriously, with new and exotic brews catering for all manner of tastes. But one ingredient remains a constant – indeed the fulcrum – of good beer. Hops give beer the bitterness that counterbalances the sickly sweetness of the fermenting grain and impart subtle flavour tones that distinguish one brew from another, all while acting as a natural preservative.
    That is reason enough to declare the hop one of the world’s most important, if often overlooked, plants. Yet trouble is brewing, with a perfect storm of changing tastes and changing weather contriving to shake up its cultivation. The question frothing on many a lip now is whether an ale and hearty future for the hop can be assured.
    Hops weren’t always so universally beloved. In England, they were once dubbed the “wicked weed”, and traditional ales were brewed without them. It is a myth that Henry VI once tried to ban them, although the city of Norwich did in 1471, as it tried to defend the purity of yeoman English ale in the face of perfidious hopped continental imports. Before …

  •

    2021 in review: Learning the pros and cons of working from home

    The covid-19 pandemic has forced millions to participate in one of the biggest social experiments of our time. Nearly two years in, it’s time to take stock: what happens when workers abandon offices?

    Humans

    15 December 2021

    By Alice Klein
    Working from home has led to rises in productivity for some (Experience Interiors/iStock)
    THE covid-19 pandemic has forced millions of us to participate in one of the biggest social experiments of our time: what would happen if office workers largely abandoned their workplaces and began working from home? More than 18 months in, it is time to take stock.
    One thing seems clear: more people working remotely has brought some benefits for the environment. With less commuter traffic, wildlife has been able to reclaim urban spaces while people have been tapping away at their home keyboards.
    But what about the benefits to people? The major perks of home working include people having more flexibility to mould jobs around their family, exercise and leisure time, being able to wear whatever they like, controlling their own heating and lighting and not having to commute. The lack of commuting may be the biggest bonus, since surveys show that workers typically rate their commute as the worst part of their day, unless they walk or cycle.
    Many people have also been able to get more done while working remotely, possibly due to fewer distractions. A survey by Boston Consulting Group of 1500 managers at large European companies found that more than half had seen productivity levels rise as their employees shifted to remote work during the pandemic.
    “There used to be a lot of resistance to working from home because managers thought employees would just goof off and watch Netflix, but there’s a lot more trust now,” says Sue Williamson at the University of New South Wales in Canberra, Australia.
    However, the experiment hasn’t been all positive. Many people forced to work from home have reported feeling isolated and finding it harder to switch off due to the blurred boundary between work and home life.
    “Surveys show that workers typically rate their commute as the worst part of their day”
    Many managers have also reported declines in innovation, which is probably because “it’s hard to get those serendipitous conversations between people that spark ideas” when everyone is physically separated, says Anne Bardoel at Swinburne University of Technology in Melbourne, Australia.
    Then there is “Zoom fatigue”, the drained feeling that often accompanies virtual meetings, even though they tend to be shorter than in-person ones. This may be because people have a stronger sense of being on show while on screen and feel more pressure to present well, says Allison Gabriel at the University of Arizona.
    As vaccines help to control covid-19, many organisations are hoping to reap the best of both worlds by letting employees work from home on some days and travel to the office on others. The coming months and years will undoubtedly involve trial and error as companies and employees settle on the optimum mix of office and work-from-home days. But one thing seems certain: now that office workers have been given a chance to really think about how they want their work lives to look, there is no turning back.
    “It is this opportunity to reset and rethink how we actually work, and I think that’s a very positive thing,” says Bardoel.

    2021 in review
    This was a year of tackling great challenges, from the covid-19 pandemic to climate change. But 2021 was also rich in scientific discoveries and major advances.
