More stories

  • Mobile phone data helps track pathogen spread and evolution of superbugs

    A new way to map the spread and evolution of pathogens, and their responses to vaccines and antibiotics, will provide key insights to help predict and prevent future outbreaks. The approach combines a pathogen’s genomic data with human travel patterns, taken from anonymised mobile phone data.
    Researchers from the Wellcome Sanger Institute, the University of the Witwatersrand and the National Institute for Communicable Diseases in South Africa, the University of Cambridge, and partners across the Global Pneumococcal Sequencing project [1] integrated genomic data from nearly 7,000 Streptococcus pneumoniae (pneumococcus) samples collected in South Africa with detailed human mobility data [2]. This enabled them to see how these bacteria, which cause pneumonia and meningitis [3], move between regions and evolve over time.
    The findings, published today (3 July) in Nature, suggest initial reductions in antibiotic resistance linked to the 2009 pneumococcal vaccine may be only temporary, as non-targeted strains resistant to antibiotics such as penicillin gained a 68 per cent competitive advantage.
    This is the first time researchers have been able to precisely quantify the fitness — the ability to survive and reproduce — of different pneumococcal strains. The insight could inform vaccine development to target the most harmful strains, and may be applicable to other pathogens.
    Many infectious diseases such as tuberculosis, HIV, and COVID-19 exist in multiple strains or variants circulating simultaneously, making them difficult to study. Pneumococcus, a bacterium that is a leading cause of pneumonia, meningitis, and sepsis worldwide [4], is a prime example with over 100 types and 900 genetic strains globally. Pneumonia alone kills around 740,000 children under the age of five each year [5], making it the single largest infectious cause of death in children.
    Pneumococcal diversity hampers control efforts, as vaccines targeting major strains leave room for others to fill the vacant niches. How these bacteria spread, how vaccines affect their survival, and how they develop resistance to antibiotics all remain poorly understood.
    In this new study, researchers analysed genome sequences from 6,910 pneumococcus samples collected in South Africa between 2000 and 2014 to track the distribution of different strains over time. They combined these data with anonymised records of human travel patterns collected by Meta [2].

    The team developed computational models which revealed pneumococcal strains take around 50 years to fully mix throughout South Africa’s population, largely due to localised human movement patterns.
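    The following is a minimal sketch of the kind of mobility-driven mixing model described above, assuming a small, made-up travel matrix rather than the anonymised Meta mobility data and genomic sampling used in the study; it only illustrates why mostly local movement slows how quickly a strain evens out across regions.

        import numpy as np

        # Hypothetical weekly contact/travel matrix between four regions (rows sum to 1):
        # entry [i, j] is the fraction of region i's transmission contacts made with region j.
        # The study itself used anonymised Meta mobility data across South Africa's provinces.
        M = np.array([
            [0.96, 0.02, 0.01, 0.01],
            [0.02, 0.95, 0.02, 0.01],
            [0.01, 0.02, 0.96, 0.01],
            [0.01, 0.01, 0.02, 0.96],
        ])

        freq = np.array([1.0, 0.0, 0.0, 0.0])  # strain present only in region 0 at the start
        weeks = 0
        while freq.max() - freq.min() > 0.01:  # call it "mixed" once regions differ by <1%
            freq = M @ freq                    # each region averages over where its hosts travel
            weeks += 1

        print(f"Approximate mixing time: {weeks} weeks ({weeks / 52:.1f} years)")

    With strongly local contact (large diagonal entries), the gap between regions closes slowly; the published model, fitted to real travel and genomic data, puts that timescale at roughly 50 years for South Africa.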
    They found that while the introduction of a pneumococcal vaccine against certain types of these bacteria in 2009 reduced the number of cases caused by those types [6], it also gave non-targeted strains a 68 per cent competitive advantage, with an increasing proportion of them becoming resistant to antibiotics such as penicillin. This suggests that the vaccine-linked protection against antibiotic resistance is short-lived.
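    As a rough illustration of how a competitive advantage can be read off strain frequencies, the sketch below uses a simple two-strain logistic model, in which the log-odds of the non-vaccine-type fraction grow linearly with a selection coefficient. The frequencies come from note 6 (85 per cent vaccine-type before rollout, 33.2 per cent by 2014); the five-year window and the two-strain simplification are assumptions made for illustration, not the study's multi-strain fitness framework, so the resulting number is not the paper's 68 per cent estimate.

        import math

        def logit(p):
            return math.log(p / (1.0 - p))

        p_before = 1.0 - 0.85   # non-vaccine-type frequency around vaccine introduction (2009)
        p_after = 1.0 - 0.332   # non-vaccine-type frequency by 2014
        years = 5               # assumed interval between the two measurements

        # Per-year selective advantage of the non-vaccine-type group under logistic competition.
        s = (logit(p_after) - logit(p_before)) / years
        print(f"Crude per-year selective advantage: {s:.2f}")

    The study's own estimate comes from a model fitted jointly across strains, sampling dates and locations; the sketch only shows how observed frequency shifts translate into a fitness number.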
    Dr Sophie Belman, first author of the study, former PhD student at the Wellcome Sanger Institute and now a Schmidt Science Fellow at the Barcelona Supercomputing Centre, Spain, said: “While we found that pneumococcal bacteria generally spread slowly, the use of vaccines and antimicrobials can quickly and significantly change these dynamics. Our models could be applied to other regions and pathogens to better understand and predict pathogen spread, in the context of drug resistance and vaccine effectiveness.”
    Dr Anne von Gottberg, author of the study at National Institute for Communicable Diseases, Johannesburg, South Africa, said: “Despite vaccination efforts, pneumonia remains one of the leading causes of death for children under five in South Africa. With continuous genomic surveillance and adaptable vaccination strategies to counter the remarkable adaptability of these pathogens, we may be able to better target interventions to limit the burden of disease.”
    Professor Stephen Bentley, senior author of the study at the Wellcome Sanger Institute, said: “The pneumococcus’s diversity has obscured our view on how any given strain spreads from one region to the next. This integrated approach using bacterial genome and human travel data finally allows us to cut through that complexity, uncovering hidden migratory paths in high-definition for the first time. This could allow researchers to anticipate where emerging high-risk strains may take hold next, putting us a step ahead of potential outbreaks.”
    Notes
    1. Partners from the Global Pneumococcal Sequencing project can be found here: https://www.pneumogen.net/gps/
    2. The human mobility data used in this study are Meta Data for Good baseline data, released during the 2020 SARS-CoV-2 pandemic. These data rely on personal consent for location sharing, and Data for Good ensures individual privacy by preventing re-identification in aggregated datasets.
    3. For more information on pneumococcal disease, visit: https://www.cdc.gov/pneumococcal/about/index.html
    4. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5666185/
    5. https://www.who.int/news-room/fact-sheets/detail/pneumonia
    6. Before these vaccines, 85 per cent of pneumococcal strains were those targeted by the vaccines. By 2014, this dropped to 33.2 per cent. This change was consistent across all nine provinces in South Africa.

  • Why this year’s climate conditions helped Hurricane Beryl smash records

    Hurricane Beryl, the Atlantic Ocean’s first hurricane in 2024, began roaring across the Caribbean in late June, wreaking devastation on Grenada and other Windward Islands as it grew in power. It’s now swirling on like a buzzsaw toward Jamaica and Mexico’s Yucatán peninsula.

    Beryl is a record-breaking storm, commanding attention in a year already filled with record-breaking climate events (SN: 6/21/24; SN: 4/30/24).

    On June 30, the storm became the earliest Atlantic hurricane on record to achieve Category 4 status. Just a day later, it had intensified further, becoming the earliest Atlantic storm on record to achieve Category 5 status, with sustained winds of about 270 kilometers per hour, according to the U.S. National Hurricane Center in Miami. (As of late July 2, the storm has weakened slightly but remains a powerful Category 4 ahead of making landfall in Jamaica.)
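    The category labels come from the Saffir-Simpson hurricane wind scale, which bins storms by maximum sustained wind speed. A small sketch of that mapping, with thresholds in km/h rounded from the official mph definitions:

        # Saffir-Simpson hurricane wind scale, approximate thresholds in km/h.
        def saffir_simpson_category(wind_kmh: float) -> str:
            if wind_kmh < 119:
                return "below hurricane strength"
            for category, lower_bound in ((5, 252), (4, 209), (3, 178), (2, 154), (1, 119)):
                if wind_kmh >= lower_bound:
                    return f"Category {category}"

        print(saffir_simpson_category(270))  # Beryl's ~270 km/h sustained winds -> Category 5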

  • Precise and less expensive 3D printing of complex, high-resolution structures

    Researchers have developed a new two-photon polymerization technique that uses two lasers to 3D print complex high-resolution structures. The advance could make this 3D printing process less expensive, helping it find wider use in a variety of applications.
    Two-photon polymerization is an advanced additive manufacturing technique that traditionally uses femtosecond lasers to polymerize materials in a precise, 3D manner. Although this process works well for making high-resolution microstructures, it isn’t widely used in manufacturing because femtosecond lasers are expensive and increase the cost of printing parts.
    “We combined a relatively low-cost laser emitting visible light with a femtosecond laser emitting infrared pulses to reduce the power requirement of the femtosecond laser,” said research team leader Xianfan Xu from Purdue University. “In this way, with a given femtosecond laser power, the printing throughput can be increased, leading to a lower cost for printing individual parts.”
    In the Optica Publishing Group journal Optics Express, the researchers show that the two-laser approach reduces the femtosecond laser 3D printing power needed by as much as 50% compared to using a femtosecond laser alone.
    “3D printing with high resolution has many applications, including 3D electronics devices, micro-robots for the biomedical field and 3D structures or scaffolds for tissue engineering,” said Xu. “Our novel 3D printing approach can be readily implemented in many existing femtosecond laser 3D printing systems.”
    Finding the right laser balance
    The new work is part of the research team’s effort to continuously improve the printing speed and reduce the printing cost for two-photon polymerization, which uses the phenomenon of two-photon absorption to precisely cure, or solidify, a photosensitive material.

    “In a conventional two-photon polymerization printing process, the femtosecond laser is first used to initiate a photochemical process that reduces the inhibition species in the material before printing starts,” said Xu. “We used a low-cost laser for this purpose instead.”
    The new approach combines single-photon absorption from a 532 nm nanosecond laser with two-photon absorption from an 800 nm femtosecond laser. For this to work, the researchers had to find the right balance between the printing and inhibition caused by the two lasers. They did this by creating a new mathematical model to help them understand the photochemical processes involved and to compute the combined effect of two-photon and single-photon excitation processes. They also used the model to identify the dominant processes controlling how much the femtosecond laser’s power could be reduced while still achieving desirable printing results.
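    A toy rate-balance sketch of that idea, not the authors' photochemical model: a voxel polymerizes where two-photon radical generation (scaling with the square of the femtosecond intensity) outpaces the inhibiting species, and the visible beam pre-depletes that inhibitor, lowering the femtosecond intensity required. All constants below are invented illustrative values.

        import numpy as np

        K_TPA = 1.0        # two-photon radical generation coefficient (rate ~ K_TPA * i_fs**2)
        K_SPA = 0.6        # single-photon inhibitor depletion coefficient (rate ~ K_SPA * i_vis)
        INHIBITOR_0 = 4.0  # initial concentration of inhibiting species in the resin

        def polymerizes(i_fs: float, i_vis: float) -> bool:
            """A voxel solidifies when two-photon radical generation exceeds the
            inhibitor left after the visible (532 nm) beam has consumed part of it."""
            remaining_inhibitor = max(INHIBITOR_0 - K_SPA * i_vis, 0.0)
            return K_TPA * i_fs**2 > remaining_inhibitor

        def threshold_fs_intensity(i_vis: float) -> float:
            """Smallest femtosecond intensity that still prints, for a given visible intensity."""
            for i_fs in np.linspace(0.0, 5.0, 5001):
                if polymerizes(i_fs, i_vis):
                    return i_fs
            return float("nan")

        no_assist = threshold_fs_intensity(0.0)
        with_assist = threshold_fs_intensity(5.0)
        print(f"Femtosecond threshold without 532 nm assist: {no_assist:.2f}")
        print(f"Femtosecond threshold with 532 nm assist:    {with_assist:.2f}")
        print(f"Reduction: {100 * (1 - with_assist / no_assist):.0f}%")

    Deciding how far the inhibitor can be depleted while keeping resolution is exactly the balance the researchers' mathematical model was built to quantify.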
    Printing detailed structures
    After fine-tuning the new approach, they used it to print various 2D and 3D structures using reduced femtosecond laser power. These included detailed woodpiles measuring just 25 × 25 × 10 μm as well as a micron-scale buckyball, chiral structure and trefoil knot. Experimental results showed that the new method reduced the power required from the femtosecond laser by up to 80 percent for 2D structures and up to around 50 percent for 3D structures.
    “This new printing approach could impact manufacturing technologies, influencing the development of devices across consumer electronics and healthcare sectors both now and in the future,” said Xu. The researchers are now working to further improve the printing speed and reduce the cost of 3D printing.

  • Optoelectronics gain spin control from chiral perovskites and III-V semiconductors

    A research effort led by scientists at the U.S. Department of Energy’s (DOE’s) National Renewable Energy Laboratory (NREL) has made advances that could enable a broader range of currently unimagined optoelectronic devices.
    The researchers’ previous innovation incorporated a perovskite layer that allowed the creation of a new type of polarized light-emitting diode (LED), one that emits spin-controlled photons at room temperature without the use of magnetic fields or ferromagnetic contacts. They have now gone a step further by integrating a III-V semiconductor optoelectronic structure with a chiral halide perovskite semiconductor. That is, they transformed an existing commercialized LED into one that also controls the spin of electrons. The results provide a pathway toward transforming modern optoelectronics, a field that relies on the control of light and encompasses LEDs, solar cells, and telecommunications lasers, among other devices.
    “It’s up to one’s imagination where this might go or where this might end up,” said Matthew Beard, a senior research fellow at NREL and coauthor of the newly published Nature article, “Room temperature spin injection across a chiral-perovskite/III-V interface.”
    Beard also serves as director of the Center for Hybrid Organic Inorganic Semiconductors for Energy (CHOISE), an Energy Frontier Research Center funded by the Office of Science Basic Energy Sciences within DOE. The reported research was funded by CHOISE and relied on a broad range of scientific expertise drawn from NREL, the Colorado School of Mines, University of Utah, University of Colorado Boulder, and the Université de Lorraine in France.
    The goal of CHOISE is to understand control over the interconversion of charge, spin, and light using carefully designed chemical systems. In particular, the work focuses on control over the electron spin, which can be either “up” or “down.” Most current-day optoelectronic devices rely on the interconversion between charge and light. However, spin is another property of electrons, and control over the spin could enable a wide range of new effects and functionality. The researchers published a paper in 2021 in which they reported how, by using two different perovskite layers, they were able to control the spin by creating a filter that blocks electrons “spinning” in the wrong direction.
    They hypothesized at the time that advancements could be made in optoelectronics if they could successfully incorporate the two semiconductors, and then went on to do just that. The breakthroughs made, which include eliminating the need for subzero Celsius temperatures, can be used to increase data processing speeds and decrease the amount of power needed.
    “Most current-day technologies are all based on controlling charge,” Beard said. “Most people just forget about the electron spin, but spin is very important, and it’s also another parameter that one can control and utilize.”
    Manipulating the spin of electrons in a semiconductor has previously required the use of ferromagnetic contacts under an applied magnetic field. Using chiral perovskites, the researchers were able to transform an LED into one that emits polarized light at room temperature and without a magnetic field. Chirality refers to a structure that cannot be superimposed on its mirror image, just as a left hand cannot be superimposed on a right hand. For example, a “left-handed” oriented chiral system may allow transport of electrons with “up” spins but block electrons with “down” spins, and vice versa. The spin of the electron is then converted to the “spin,” or polarization, of the emitted light. The degree of polarization, which measures the intensity of light that is polarized in one direction, reached about 2.6% in the previous research. The addition of the III-V semiconductor — which is made of materials in the third and fifth columns of the periodic table — boosted the polarization to about 15%. The degree of polarization serves as a direct measure of spin accumulation in the LED.
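    The degree of circular polarization quoted here is conventionally defined from the intensities of the two circularly polarized components of the emitted light (the standard textbook definition, not spelled out in the article):

        P_{\mathrm{circ}} = \frac{I_{\sigma^{+}} - I_{\sigma^{-}}}{I_{\sigma^{+}} + I_{\sigma^{-}}}

    A value of about 15 per cent therefore corresponds to one circular component being roughly 1.35 times as intense as the other.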
    “This work is particularly exciting to me, as it combines spin functionality with a traditional LED platform,” said the first author of the work, Matthew Hautzinger. “You can buy an LED analogous to what we used for 14 cents, but with the chiral perovskite incorporated, we’ve transformed an already robust (and well understood) technology into a futuristic spin-control device.”

  • Study explores what motivates people to watch footage of disasters and extreme weather

    Extreme weather events such as hurricanes and storms have increased in both frequency and severity in recent years.
    With that has come heightened public interest, resulting in often dramatic footage being live-streamed on platforms such as YouTube, TikTok and Discord.
    Now, a new study conducted at the University of Plymouth has for the first time analysed what might be motivating people to watch these streams — in some instances for up to 12 hours at a time.
    The research centred around the live-streaming of three events — Hurricane Irma in 2017, Hurricane Ian in 2022, and Storms Dudley, Eunice and Franklin in 2022.
    Through a detailed analysis of viewers’ comments, it was found that people in affected areas were using streams to discuss official government risk advice they had received — for example, about whether to evacuate.
    Others were drawn to the streams because they had a previous connection to the affected region. For these people, watching live footage — which included taking time to share messages of ‘hope’ for the hurricane or storm to pass without destruction — was a way of showing support to places and people impacted by the event.
    The research was published in the journal Environmental Hazards and conducted by Dr Simon Dickinson, Lecturer in Geohazards and Risk in the University’s School of Geography, Earth and Environmental Sciences.

    He said: “When dramatic things happen — whether that relates to extreme weather or events like tornados or volcanoes erupting — people flock to watch. You might assume that this is just a form of online ‘rubber-necking’, and that people are naturally drawn to spectacular sights. However, this study has shown that the drivers to watch extreme weather footage are more complex. Live-streams provide the opportunity for people in, close to, and far away from the event to interact in real time. The footage becomes a marker that people use to sense-check their understandings of how significant the event is, how hazards work, and as an online gathering point to share experiences of similar events. It is a fascinating insight into human behaviour that has previously been unexplored.”
    The research focused on nine live-streams of the 2017 and 2022 hurricanes and storms, which broadcast a total of 65 hours of video footage watched by more than 1.8 million people.
    During that time, over 14,300 comments were left by 5,000 unique accounts, a reflection of the fact that footage focused on unfolding events of national or global importance generates higher-than-normal audience engagement.
    Many of the streams were already existing webcam channels that were repurposed during the hurricane or storm, such as webcams that streamed beach or port conditions. In some instances, affected people streamed live footage from their own home security or doorbell cameras.
    The study demonstrates that people are keen to learn more about the science behind what is happening, highlighting the need for further work that examines how people are using new technologies to make sense of hazard risk.
    Dr Dickinson added: “Although scientists are getting better at communicating risk, people are far more likely to discuss hazards in informal and relatively unmoderated settings. Moments of extreme weather are important because they focus people’s attention and generate discussion about hazards, how they work, and how they will increasingly affect us in future. New digital practices — such as live-streaming — are thus important for us to understand because they’re not just spaces of disaster voyeurism. Rather, they’re spaces of learning, community and emotional support in a world that can feel increasingly volatile.”

  • Exploring the chemical space of the exposome: How far have we gone?

    The open-access Journal of the American Chemical Society (JACS Au) has just published an invited perspective by Dr Saer Samanipour and his team on the daunting challenge of mapping all the chemicals around us. Samanipour, an Assistant Professor at the Van ’t Hoff Institute for Molecular Sciences of the University of Amsterdam (UvA), takes inventory of the available science and concludes that truly proactive chemical management is currently not feasible. To really get a grip on the vast and expanding chemical universe, Samanipour advocates the use of machine learning and AI, complementing existing strategies for detecting and identifying all molecules we are exposed to.
    In science lingo, the aggregate of all the molecules we are exposed to is called the ‘exposome chemical space’, and it is central to Samanipour’s scientific endeavours. It is his mission to explore this vast molecular space and map it to its most ‘remote’ corners. He is driven by curiosity, but even more so by necessity. Direct and indirect exposure to a myriad of chemicals, mostly unknown, poses a significant threat to human health. For instance, estimates are that 16% of global premature deaths are linked to pollution. The environment suffers as well, which can be seen, for example, in the loss of biodiversity. The case can be made, according to Samanipour, that humankind has surpassed the safe operating space for introducing human-made chemicals into the system of planet Earth.
    Current approach is inherently passive
    “It is rather unsatisfactory that we know so little about this,” he says. “We know little about the chemicals already in use, let alone that we can keep up with new chemicals that are currently being manufactured at an unprecedented rate.” In a previous study, he estimated that less than 2% of all chemicals that we are exposed to have been identified.
    “The way society approaches this issue is inherently passive and at best reactive. Only after we observe some sort of effects of exposure to chemicals do we feel the urge to analyse them. We attempt to determine their presence, their effect on the environment and human health, and we try to determine the mechanisms by which they cause any harm. This has led to many problems, the latest being the crisis with PFAS chemicals. But we have also seen major issues with flame retardants, PCBs, CFCs and so on.”
    Moreover, regulatory measures are predominantly aimed at chemicals with a very specific molecular structure that are produced in large quantities. “There are countless other chemicals out there that we don’t know much about. And these are not only man-made; nature also produces chemicals that can harm us. Through purely natural synthetic routes, or through the transformation of human-made chemicals.” In particular, the latter category has been systematically overlooked, according to Samanipour. “Conventional methods have catalogued only a fraction of the exposome, overlooking transformation products and often yielding uncertain results.”
    We need a data-driven approach
    The paper in JACS Au thoroughly reviews the latest efforts in mapping the exposome chemical space and discusses their results. A main bottleneck is that conventional chemical analysis is biased towards known or proposed structures, since this is key to interpreting data obtained with analytical methods such as chromatography and mass spectrometry (GC/LC-HRMS). Thus the more ‘unexpected’ chemicals are overlooked. This bias is avoided in so-called non-targeted analysis (NTA), but even then results are limited. Over the past five years, 1,600 chemicals have been identified, while every year around 700 new chemicals are introduced into the US market alone. Samanipour: “When you take the potential transformation products of these novel chemicals into account, you have to conclude that the speed of NTA studies is far too slow to be able to catch up. At this rate, our chemical exposome will continue to remain unknown.”

    The paper lists these and many more bottlenecks in current analytical science and suggests ways to improve results. In particular, the use of machine learning and artificial intelligence will really push the field ahead, Samanipour argues. “We need a data-driven approach along several lines. Firstly, we should intensify the datamining efforts to distil information from existing chemical databases. Already recorded relations between structure, exposure and effect of identified chemicals will lead us to new insights. They could for instance help predict the health effects of related chemicals that are yet unidentified. Secondly, we have to perform retrospective analysis on already available analytical data obtained with established methods, expanding the identified chemical space. We will for sure find molecules there that have been overlooked until now. And thirdly, we can use AI to work on understanding the structure and scope of the exposome chemical space.”
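    The following is a minimal sketch of the first, data-mining line of attack, assuming RDKit and scikit-learn are available: recorded structure-effect pairs (here a tiny, invented training set) are turned into molecular fingerprints and used to flag related, as-yet-untested chemicals. It illustrates the idea only and is not a tool from the perspective paper.

        import numpy as np
        from rdkit import Chem, DataStructs
        from rdkit.Chem import AllChem
        from sklearn.ensemble import RandomForestClassifier

        # Invented SMILES strings with a hypothetical binary "effect observed" label,
        # standing in for relations mined from existing chemical databases.
        train = [
            ("CCO", 0), ("CCCCO", 0), ("c1ccccc1", 1), ("c1ccccc1Cl", 1),
            ("CC(=O)O", 0), ("c1ccc(cc1)C(=O)O", 1), ("CCN", 0), ("Clc1ccc(Cl)cc1", 1),
        ]

        def fingerprint(smiles: str) -> np.ndarray:
            mol = Chem.MolFromSmiles(smiles)
            fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=1024)
            arr = np.zeros((1024,))
            DataStructs.ConvertToNumpyArray(fp, arr)  # standard RDKit-to-numpy conversion
            return arr

        X = np.array([fingerprint(s) for s, _ in train])
        y = [label for _, label in train]
        model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

        # Score an "unexpected" candidate, e.g. a hypothetical transformation product.
        candidate = "Clc1ccccc1C(=O)O"
        prob = model.predict_proba(fingerprint(candidate).reshape(1, -1))[0][1]
        print(f"Predicted probability of a similar effect for {candidate}: {prob:.2f}")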
    Work hard to tackle this
    Of course, all this is a very complex, daunting matter, Samanipour realises. But as a sort of astronaut in molecular space — just like the explorers of the physical universe — he won’t let that complexity put him off. “We have to work hard to tackle this. I have no illusion that during my scientific career we will be able to fully chart the exposome chemical space. But it is imperative that we face its complexity, discuss it and take the first steps towards getting to grips with it.”
    Samanipour cooperated with colleagues at the UvA’s Institute for Biodiversity and Ecosystem Dynamics, the School of Public Health at Imperial College London (United Kingdom) and the Queensland Alliance for Environmental Health Sciences at the University of Queensland (Australia). The work was supported by the TKI ChemistryNL and the UvA Data Science Center, with additional local funding for the partners in the UK and Australia.

  • Crucial gaps in climate risk assessment methods

    A study by Stefano Battiston of the Department of Finance at the University of Zurich and his co-authors has identified critical shortcomings in the way climate-related risks to corporate assets are currently assessed. Many current estimates of physical climate risk rely on simplified and proxy data that do not accurately represent a company’s true risk exposure. This can lead to significant underestimates of climate-related losses, with serious implications for business investment planning, asset valuation and climate adaptation efforts.
    Potential losses up to 70% higher than previously estimated
    The research team developed a new methodology that uses detailed information about the location and characteristics of a company’s physical assets, such as factories, equipment and natural resources. This approach provides a more accurate picture of climate risks than methods that use proxy data, which often assume that all of a company’s assets are located at its headquarters. “When we compared our results with those using proxy data, we found that the potential losses from climate risks could be up to 70% higher than previously thought,” says Stefano Battiston. “This underscores the critical need for more granular data in risk assessments.”
    Preparing for the worst: The role of extreme events
    The authors also point to the importance of considering “tail risk” in climate assessments. Tail risk refers to the possibility of extreme events that, while rare, can have catastrophic impacts. “Many assessments focus on average impacts. Our research shows that the potential losses from extreme events can be up to 98% higher than these averages suggest,” says Stefano Battiston. “Failure to account for these possibilities can leave businesses and investors dangerously unprepared.”
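    A small sketch of the gap between average impacts and tail risk, using a simulated heavy-tailed loss distribution with made-up parameters rather than the study's asset-level data:

        import numpy as np

        rng = np.random.default_rng(42)

        # Simulated annual climate-related losses for one firm (arbitrary monetary units);
        # a lognormal gives the heavy right tail typical of extreme-event damage.
        losses = rng.lognormal(mean=0.0, sigma=1.2, size=100_000)

        average_loss = losses.mean()
        var_99 = np.quantile(losses, 0.99)                    # 99th-percentile annual loss
        expected_shortfall = losses[losses >= var_99].mean()  # mean loss in the worst 1% of years

        print(f"Average annual loss:          {average_loss:6.2f}")
        print(f"99th-percentile loss:         {var_99:6.2f}")
        print(f"Expected shortfall (1% tail): {expected_shortfall:6.2f}")

    Planning around the average alone would understate the resources needed to survive the worst years, which is precisely the kind of gap the authors warn about.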
    More funding for climate adaptation
    The study’s findings have significant implications for climate policy, business strategy, and investment decisions. The researchers emphasize that more accurate risk assessments are crucial for developing effective climate adaptation strategies and determining appropriate levels of climate-related insurance and funding. “Our work shows that we may be seriously underestimating the financial resources needed for climate adaptation,” concludes Stefano Battiston.

  • Moving beyond the 80-year-old solar cell equation

    Physicists from Swansea University and Åbo Akademi University have made a significant breakthrough in solar cell technology by developing a new analytical model that improves the understanding and efficiency of thin-film photovoltaic (PV) devices.
    For nearly eight decades, the so-called Shockley diode equation has explained how current flows through solar cells, describing the electrical current that powers your home or charges your battery bank. However, the new study challenges this traditional understanding for a specific class of next-generation solar cells: thin-film solar cells.
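    For reference, the ideal-diode picture that the new work generalises writes the current density of an illuminated solar cell in the textbook form (sign conventions vary by author):

        J(V) = J_0 \left[ \exp\!\left( \frac{qV}{n k_B T} \right) - 1 \right] - J_{\mathrm{ph}}

    where J_0 is the dark saturation current density, n the ideality factor, k_B T the thermal energy, and J_ph the photogenerated current density.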
    These thin-film solar cells, made of flexible, low-cost materials, have had limited efficiency due to factors that the existing analytical models couldn’t fully explain.
    The new study sheds light on how these solar cells achieve optimal efficiency. It reveals a critical balance between collecting the electricity generated by light and minimising losses due to recombination, where electrical charges cancel each other out.
    “Our findings provide key insights into the mechanisms driving and limiting charge collection, and ultimately the power-conversion efficiency, in low-mobility PV devices,” said the lead author, Dr Oskar Sandberg of Åbo Akademi University, Finland.
    New Model Captures the Missing Piece
    Previous analytical models for these solar cells had a blind spot: “injected carriers” — charges entering the device from the contacts. These carriers significantly impact recombination and limit efficiency.

    “The traditional models just weren’t capturing the whole picture, especially for these thin-film cells with low-mobility semiconductors,” explained the principal investigator, Associate Professor Ardalan Armin of Swansea University. “Our new study addresses this gap by introducing a new diode equation specifically tailored to account for these crucial injected carriers and their recombination with those photogenerated.”
    “The recombination between injected charges and photogenerated ones is not a huge problem in traditional solar cells such as silicon PV which is hundreds of times thicker than next generation thin film PV such as organic solar cells,” Dr Sandberg added.
    Associate Professor Armin said: “One of the brightest theoretical physicists of all times, Wolfgang Pauli once said ‘God made the bulk; the surface was the work of the devil’. As thin film solar cells have much bigger interfacial regions per bulk than traditional silicon; no wonder why they get affected more drastically by “the work of the devil” — that is recombination of precious photogenerated charges with injected ones near the interface!”
    Impact on Future Solar Cell Development
    This new model offers a new framework for designing more efficient thin-film solar cells and photodetectors, optimising existing devices, and analysing material properties. It can also aid in training machines used for device optimisation, marking a significant step forward in the development of next-generation thin-film solar cells.