More stories

  • Looking to move to a galaxy far, far away? Innovative system evaluates habitability of distant planets

    The climate crisis presents a huge challenge to all people on Earth. It has led many scientists to look for exoplanets, planets outside our solar system that humans could potentially settle.
    The James Webb Space Telescope was developed as part of this search to provide detailed observational data about Earth-like exoplanets in the coming years. A new project, led by Dr. Assaf Hochman at the Fredy & Nadine Herrmann Institute of Earth Sciences at the Hebrew University of Jerusalem (HU), in collaboration with Dr. Paolo De Luca at the Barcelona Supercomputing Center and Dr. Thaddeus D. Komacek at the University of Maryland, has successfully developed a framework to study the atmospheres of distant planets and locate those planets fit for human habitation, without having to visit them physically. Their joint research study was published in the Astrophysical Journal.
    Classifying climate conditions and measuring climate sensitivity are central elements in assessing the viability of exoplanets as candidates for human habitation. In the current study, the research team examined TRAPPIST-1e, a planet located some 40 light-years from Earth and scheduled to be observed by the James Webb Space Telescope in the coming year. The researchers looked at the sensitivity of the planet’s climate to increases in greenhouse gases and compared it with conditions on Earth. Using a computer simulation of the climate on TRAPPIST-1e, they were able to assess the impact of changes in greenhouse gas concentration.
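    The study itself relies on a full three-dimensional climate simulation, but the quantity being probed, the temperature response to added greenhouse gas, can be illustrated with a far simpler zero-dimensional sketch. The snippet below uses the standard logarithmic CO2 forcing approximation (Myhre et al., 1998); the 5.35 W/m² coefficient and the sensitivity parameter are Earth-tuned illustrative values, not results for TRAPPIST-1e.
    ```python
    import numpy as np

    # Zero-dimensional sketch of climate sensitivity (not the study's 3-D model).
    # Radiative forcing from a CO2 change (Myhre et al., 1998 approximation):
    #   dF = 5.35 * ln(C / C0)   [W m^-2]

    def co2_forcing(c_ppm, c0_ppm=280.0):
        """Radiative forcing (W m^-2) relative to a baseline CO2 concentration."""
        return 5.35 * np.log(c_ppm / c0_ppm)

    def equilibrium_warming(c_ppm, c0_ppm=280.0, sensitivity=0.8):
        """Equilibrium warming (K); sensitivity (K per W m^-2) is an assumed, Earth-like value."""
        return sensitivity * co2_forcing(c_ppm, c0_ppm)

    for ppm in (280, 560, 1120):
        print(f"{ppm:5d} ppm CO2 -> ~{equilibrium_warming(ppm):.1f} K above baseline")
    ```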
    The study focused on the effect of an increase in carbon dioxide on extreme weather conditions, and on the rate of changes in weather on the planet. “These two variables are crucial for the existence of life on other planets, and they are now being studied in depth for the first time in history,” explained Hochman.
    According to the research team, studying the climate variability of Earth-like exoplanets provides a better understanding of the climate changes we are currently experiencing on Earth. Additionally, this kind of research offers a new understanding of how planet Earth’s atmosphere might change in the future.
    Hochman and his research partners found that TRAPPIST-1e has a significantly more sensitive atmosphere than Earth. They estimate that an increase in greenhouse gases there could lead to more extreme climate changes than we would experience here on Earth, because one side of TRAPPIST-1e constantly faces its own sun, in the same way that our moon always has one side facing the Earth.
    As Hochman concluded, “the research framework we developed, along with observational data from the Webb Space Telescope, will enable scientists to efficiently assess the atmospheres of many other planets without having to send a space crew to visit them physically. This will help us make informed decisions in the future about which planets are good candidates for human settlement and perhaps even to find life on those planets.”
    Story Source:
    Materials provided by The Hebrew University of Jerusalem. Note: Content may be edited for style and length.

  • High entropy alloys: Structural disorder and magnetic properties

    High entropy alloys, or HEAs, consist of five or more different metallic elements and are an extremely interesting class of materials with a great diversity of potential applications. Since their macroscopic properties depend strongly on interatomic interactions, it is particularly interesting to probe the local structure and structural disorder around each individual element with element-specific techniques. Now, a team has examined a so-called Cantor alloy, a model system for studying high-entropy effects on the local and macroscopic scales.
    A toolbox at BESSY II
    To investigate the local environment of the individual components, the team used multi-edge extended X-ray absorption fine structure (EXAFS) spectroscopy at BESSY II and then analysed the collected data with the reverse Monte Carlo method. The magnetic properties of each element of the alloy were additionally probed using the X-ray magnetic circular dichroism (XMCD) technique. With conventional magnetometry, the scientists confirmed the presence of magnetic phase transitions and found signatures of a complex magnetic ordering with a coexistence of different magnetic phases.
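    In outline, a reverse Monte Carlo refinement randomly displaces atoms in a simulation box and keeps a move only when it brings the computed signal closer to the measured one. The toy sketch below applies that accept/reject logic to a one-dimensional pair-distance histogram rather than a real EXAFS spectrum; the “experimental” data, box size and error scale are all invented for illustration.
    ```python
    import numpy as np

    # Toy reverse Monte Carlo: adjust atom positions until their pair-distance
    # histogram matches a "measured" target. Real RMC-EXAFS compares a computed
    # chi(k) signal with experiment; this 1-D toy keeps only the core logic.
    rng = np.random.default_rng(0)
    n_atoms, box = 50, 100.0
    positions = rng.uniform(0, box, n_atoms)
    bins = np.linspace(0, box, 51)

    def pair_histogram(pos):
        d = np.abs(pos[:, None] - pos[None, :])[np.triu_indices(len(pos), k=1)]
        return np.histogram(d, bins=bins, density=True)[0]

    target = pair_histogram(rng.uniform(0, box, n_atoms))  # invented "experiment"
    sigma2 = 1e-4                                          # assumed error scale

    def misfit(pos):
        return np.sum((pair_histogram(pos) - target) ** 2) / sigma2

    cost = misfit(positions)
    for _ in range(5000):
        trial = positions.copy()
        i = rng.integers(n_atoms)
        trial[i] = (trial[i] + rng.normal(0, 1.0)) % box   # move one atom at random
        trial_cost = misfit(trial)
        # Metropolis rule: always accept improvements, sometimes accept worse
        # moves so the refinement can escape local minima.
        if trial_cost < cost or rng.random() < np.exp(-(trial_cost - cost) / 2):
            positions, cost = trial, trial_cost

    print(f"final misfit: {cost:.1f}")
    ```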
    Common trends in bulk and nanofilm samples
    The results from the nanocrystalline film of this alloy show several trends in common with a bulk sample, e.g., the largest lattice relaxations around chromium and the still intriguing magnetic behaviour of manganese, both consistent with the macroscopic magnetic behaviour of the film.
    “High-entropy alloys are an extremely diverse and exciting class of materials,” says Dr. Alevtina Smekhova, physicist at HZB and first author of the paper. “By probing the behaviour of individual components at the atomic scale, we gain valuable clues for the further development of new complex systems with the desired multifunctionality,” she says.

  • Advances in water-splitting catalysts

    Creating a hydrogen economy is no small task, but Rice University engineers have discovered a method that could make oxygen evolution catalysis in acids, one of the most challenging topics in water electrolysis for producing clean hydrogen fuels, more economical and practical.
    The lab of chemical and biomolecular engineer Haotian Wang at Rice’s George R. Brown School of Engineering has replaced rare and expensive iridium with ruthenium, a far more abundant precious metal, as the positive-electrode catalyst in a reactor that splits water into hydrogen and oxygen.
    The lab’s successful addition of nickel to ruthenium dioxide (RuO2) resulted in a robust anode catalyst that produced hydrogen from water electrolysis for thousands of hours under ambient conditions.
    “There’s huge industry interest in clean hydrogen,” Wang said. “It’s an important energy carrier and also important for chemical fabrication, but its current production contributes a significant portion of carbon emissions in the chemical manufacturing sector globally. We want to produce it in a more sustainable way, and water-splitting using clean electricity is widely recognized as the most promising option.”
    Iridium costs roughly eight times more than ruthenium, he said, and it could account for 20% to 40% of the expense in commercial device manufacturing, especially in future large-scale deployments.
    The process developed by Wang, Rice postdoctoral associate Zhen-Yu Wu and graduate student Feng-Yang Chen, and colleagues at the University of Pittsburgh and the University of Virginia is detailed in Nature Materials.
    Water splitting involves the oxygen and hydrogen evolution reactions by which polarized catalysts rearrange water molecules to release oxygen and hydrogen. “Hydrogen is produced by the cathode, which is a negative electrode,” Wu said. “At the same time, it has to balance the charge by oxidizing water to generate oxygen on the anode side.”
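    For reference, the two half-reactions described in words above, written for acidic electrolysis (standard electrochemistry, not notation taken from the paper):
    ```latex
    % Water splitting in acid: hydrogen evolution at the cathode,
    % oxygen evolution at the anode.
    \begin{align*}
    \text{cathode (HER):} \quad & 4\,\mathrm{H^+} + 4\,e^- \longrightarrow 2\,\mathrm{H_2} \\
    \text{anode (OER):}   \quad & 2\,\mathrm{H_2O} \longrightarrow \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^- \\
    \text{overall:}       \quad & 2\,\mathrm{H_2O} \longrightarrow 2\,\mathrm{H_2} + \mathrm{O_2}
    \end{align*}
    ```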

  • Smartphone data can help create global vegetation maps

    Gaps in our knowledge of the global distribution of plant traits could be filled with data from species identification apps. Researchers from Leipzig University, the German Centre for Integrative Biodiversity Research (iDiv) and other institutions have demonstrated this using data from the popular iNaturalist app. Supplemented with data on plant traits, iNaturalist observations yield considerably more precise maps than previous approaches based on extrapolation from limited databases. Among other things, the new maps provide an improved basis for understanding plant-environment interactions and for Earth system modelling. The study has been published in the journal Nature Ecology & Evolution.
    Vegetation and climate are mutually dependent. Plant growth depends on climate, but climate is in turn strongly influenced by plants, for example by forests, which release large amounts of water through evaporation. Accurate predictions of how the living world may develop therefore require extensive knowledge of the characteristics of the vegetation at different locations, for example leaf area, tissue properties and plant height. However, such data usually have to be recorded manually by professional scientists in a painstaking, time-consuming process. Consequently, the available worldwide plant trait data are very sparse and cover only certain regions.
    The TRY database, managed by iDiv and the Max Planck Institute for Biogeochemistry in Jena, currently provides such trait data for almost 280,000 plant species, making it one of the most comprehensive plant trait databases in the world. Up to now, global maps of plant traits have been created by extrapolating (estimating beyond the original observation range) from this geographically limited database. However, the resulting maps are not particularly reliable.
    In order to fill large data gaps, the Leipzig researchers have now taken a different approach. Instead of extrapolating existing trait data geographically from the TRY database, they have linked it to the vast dataset from the citizen science project iNaturalist.
    With iNaturalist, users of the associated smartphone app share their observations of nature, providing species names, photos and geolocation. In this way, more than 19 million data points have been recorded, worldwide, for terrestrial plants alone. The data also feeds the world’s largest biodiversity database, the Global Biodiversity Information Facility (GBIF). This is accessible to the public and also serves as an important database for biodiversity research.
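    In outline, the mapping approach attaches species-level trait values to each georeferenced observation and then aggregates onto a spatial grid. A minimal sketch of that join-and-grid step follows; the column names, values and two-degree grid are illustrative stand-ins, not the actual iNaturalist or TRY schemas.
    ```python
    import pandas as pd

    # Illustrative observation records (species name + geolocation), standing in
    # for iNaturalist data; the schema is made up for this sketch.
    observations = pd.DataFrame({
        "species": ["Quercus robur", "Fagus sylvatica", "Quercus robur"],
        "lat": [51.3, 50.9, 48.1],
        "lon": [12.4, 11.7, 11.6],
    })
    # Illustrative species-mean trait values, standing in for a TRY-like table.
    traits = pd.DataFrame({
        "species": ["Quercus robur", "Fagus sylvatica"],
        "plant_height_m": [25.0, 30.0],
    })

    # Attach a trait value to every observation via its species name.
    obs = observations.merge(traits, on="species", how="inner")

    # Aggregate onto a coarse grid: mean trait of everything observed per cell.
    cell = 2.0  # grid resolution in degrees (illustrative)
    obs["lat_bin"] = (obs["lat"] // cell) * cell
    obs["lon_bin"] = (obs["lon"] // cell) * cell
    trait_map = obs.groupby(["lat_bin", "lon_bin"])["plant_height_m"].mean()
    print(trait_map)
    ```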
    To test the accuracy of the maps based on the combination of iNaturalist observations and TRY plant traits, they were compared with trait maps based on sPlotOpen, the open-access subset of the iDiv sPlot platform, the world’s largest archive of plant community data. It contains nearly two million datasets with complete lists of the plant species occurring in the locations (plots) studied by professional researchers, and it is likewise enriched with plant trait data from the TRY database.
    The conclusion: The new iNaturalist-based map corresponded to the sPlot data map significantly more closely than previous map products based on extrapolation. “That the new maps, based on the citizen science data, seem to be even more precise than the extrapolations was both surprising and impressive,” says first author Sophie Wolf, a doctoral researcher at Leipzig University. “Particularly because iNaturalist and our reference sPlotOpen are very different in structure.”
    “Our study convincingly demonstrates the potential of volunteer-collected data for research,” says last author Dr Teja Kattenborn from Leipzig University and iDiv. “It is encouraging to see growing use of the synergies between data contributed by thousands of citizens and by professional scientists.”
    “This work is the result of an initiative of the National Research Data Infrastructure for Biodiversity Research (NFDI4Biodiversity), with which we are pushing for a change in culture towards the open provision of data,” says co-author Prof Miguel Mahecha, head of the working group Modelling Approaches in Remote Sensing at Leipzig University and iDiv. “The free availability of data is an absolute prerequisite for a better understanding of our planet.”

  • Heat waves in U.S. rivers are on the rise. Here’s why that’s a problem

    U.S. rivers are getting into hot water. The frequency of river and stream heat waves is on the rise, a new analysis shows.

    Like marine heat waves, riverine heat waves occur when water temperatures creep above their typical range for five or more days (SN: 2/1/22). Using 26 years of United States Geological Survey data, researchers compiled daily temperatures for 70 sites in rivers and streams across the United States, and then calculated how many days each site experienced a heat wave per year. From 1996 to 2021, the annual average number of heat wave days per river climbed from 11 to 25, the team reports October 3 in Limnology and Oceanography Letters.
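
    That definition, water temperature above its typical range for five or more consecutive days, translates directly into a run-length check against a day-of-year threshold. Below is a minimal sketch with synthetic data standing in for USGS gauge records; the 90th-percentile threshold is an assumption borrowed from common marine heat wave practice, not necessarily the authors’ exact criterion.

    ```python
    import numpy as np
    import pandas as pd

    # Synthetic daily water temperatures standing in for a USGS gauge record.
    rng = np.random.default_rng(1)
    days = pd.date_range("1996-01-01", "2021-12-31", freq="D")
    doy = days.dayofyear.to_numpy()
    temps = 15 + 8 * np.sin(2 * np.pi * (doy - 120) / 365) + rng.normal(0, 1.5, len(days))
    series = pd.Series(temps, index=days)

    # "Typical range" as a day-of-year climatology; the 90th percentile is an
    # assumed threshold for this sketch.
    threshold = series.groupby(series.index.dayofyear).quantile(0.90)
    hot = series > threshold.reindex(series.index.dayofyear).to_numpy()

    # A heat wave = a run of five or more consecutive hot days.
    run_id = (hot != hot.shift()).cumsum()
    runs = hot.groupby(run_id).agg(["all", "size"])
    waves = runs[runs["all"] & (runs["size"] >= 5)]
    print(f"{len(waves)} heat waves, {int(waves['size'].sum())} heat wave days")
    ```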

    The study is the first assessment of heat waves in rivers across the country, says Spencer Tassone, an ecosystem ecologist at the University of Virginia in Charlottesville. He and his colleagues tallied nearly 4,000 heat wave events — jumping from 82 in 1996 to 198 in 2021 — and amounting to over 35,000 heat wave days. The researchers found that the frequency of extreme heat increased at sites above reservoirs and in free-flowing conditions but not below reservoirs — possibly because dams release cooler water downstream.

    Most of the heat waves that rose furthest above typical temperature ranges occurred outside of the summer months, between December and April, pointing to warmer wintertime conditions, Tassone says.

    Human-caused global warming plays a role in riverine heat waves, which partially track air temperatures, but other factors are probably also driving the trend. For example, less precipitation and lower water volume in rivers mean waterways warm more easily, the study says.

    “These very short, extreme changes in water temperature can quickly push organisms past their thermal tolerance,” Tassone says. Compared with a gradual increase in temperature, sudden heat waves can have a greater impact on river-dwelling plants and animals, he says. Fish like salmon and trout are particularly sensitive to heat waves because the animals rely on cold water to get enough oxygen, regulate their body temperature and spawn successfully.

    There are chemical consequences to the heat as well, says hydrologist Sujay Kaushal of the University of Maryland in College Park, who was not involved with the study. Higher temperatures can speed up chemical reactions that contaminate water, in some cases contributing to toxic algal blooms (SN: 2/7/18).

    The research can be used as a springboard to help mitigate heat waves in the future, Kaushal says, such as by increasing shade cover from trees or managing stormwater. In some rivers, beaver dams show promise for reducing water temperatures (SN: 8/9/22). “You can actually do something about this.”

  • Virtual reality experiences to aid substance use disorder recovery

    Indiana University researchers are combining psychological principles with innovative virtual reality technology to create a new immersive therapy for people with substance use disorders. They’ve recently received over $4.9 million from the National Institutes of Health and launched an IU-affiliated startup company to test and further develop the technology.
    Led by Brandon Oberlin, an assistant professor of psychiatry at the IU School of Medicine, IU researchers have built a virtual environment that uses “future-self avatars” to help people recover from substance use disorders. These avatars are life-sized, fully animated and nearly photorealistic. People can converse with their avatars, which speak in the user’s own voice and draw on personal details to depict alternate futures.
    “VR technology is clinically effective and increasingly common for treating a variety of mental health conditions, such as phobias, post-traumatic stress disorder and post-operative pain, but has yet to find wide use in substance use disorders intervention or recovery,” Oberlin said. “Capitalizing on VR’s ability to deliver an immersive experience showing otherwise-impossible scenarios, we created a way for people to interact with different versions of their future selves in the context of substance use and recovery.”
    After four years of development and testing in collaboration with Indianapolis-based treatment centers, Oberlin and his colleagues’ pilot study was published Sept. 15 in Discover Mental Health. Their findings suggest that virtual reality simulation of imagined realities can aid substance use disorder recovery by lowering the risk of relapse and strengthening participants’ connectedness to their future selves.
    “This experience enables people in recovery to have a personalized virtual experience in alternate futures resulting from the choices they made,” Oberlin said. “We believe this could be a revolutionary intervention for early substance use disorders recovery, with perhaps even further-reaching mental health applications.”
    The technology is particularly well-suited for people in early recovery, a crucial time when the risk of relapse is high, because the immersive experiences can help them choose long-term rewards over immediate gratification by deepening connections to their future selves, he said.

  • New data registry collects evidence in cardiogenic shock patients

    Cardiogenic shock, a life-threatening condition in which a person’s heart can’t pump enough blood to meet the needs of the body, is most often caused by a serious heart attack or advanced heart failure. Historically, data related to cardiogenic shock have been limited, inconsistent and challenging to interpret. As a result, treatment recommendations around best practices vary.
    To address this need, the American Heart Association, the leading voluntary organization devoted to longer, healthier lives for all, created the Cardiogenic Shock Registry powered by Get With The Guidelines®. The new registry will help researchers, clinicians and regulators better understand the clinical presentation of different shock types, treatment patterns and outcomes. The registry will provide a foundation for improving the quality and consistency of care for patients with cardiogenic shock symptoms in U.S. hospitals.
    “To understand how to improve care for cardiogenic shock patients, we first need a clearer view of the landscape of existing treatment practices for cardiogenic shock in U.S.-based acute care settings,” said Mitchell Krucoff, M.D., FAHA, volunteer expert for the American Heart Association and professor of medicine at Duke University, Durham, N.C. “No organization is better positioned to advance this critical public health question than the American Heart Association, with already established networks of sites entering data on heart failure, acute cardiac syndromes, cardiac arrest and COVID — all of which involve patients at risk of progressing to cardiogenic shock.”
    The Cardiogenic Shock Registry builds on more than 20 years of quality improvement and registry experience rooted in the Association’s Get With The Guidelines® platform. Data from this no-cost registry will help inform the larger medical community on how best to treat cardiogenic shock.
    The steering committee of the American Heart Association Cardiogenic Shock Registry provides guidance and expertise for establishing the registry and managing the data. The steering committee includes leading academic surgeons and cardiologists, representatives from founding funders, as well as representatives of the U.S. Food & Drug Administration and the U.S. Centers for Medicare & Medicaid Services.
    The American Heart Association’s Cardiogenic Shock Registry is made possible through the generous financial support of founding supporters Abbott and Getinge.
    “The new Cardiogenic Shock Registry will leverage the unparalleled reach of the American Heart Association in a unique collaboration between academic clinicians and researchers, federal agencies and funding supporters’ experts to provide high-quality evidence and promote best practices for the treatment of patients with cardiogenic shock,” said David Morrow, M.D., M.P.H., FAHA, volunteer expert for the American Heart Association and professor of medicine, Harvard Medical School, Boston.
    Story Source:
    Materials provided by American Heart Association. Note: Content may be edited for style and length.

  • Number-crunching mathematical models may give policy makers major headache

    Mathematical models that predict policy-driving scenarios — such as how a new pandemic might spread or the future amount of irrigation water needed worldwide — may be too complex and may be delivering ‘wrong’ answers, a new study reveals.
    Experts are using increasingly detailed models to better predict phenomena or gain more accurate insights in a range of key areas, such as environmental/climate sciences, hydrology and epidemiology.
    But the pursuit of complex models as tools to produce more accurate projections and predictions may not deliver because more complicated models tend to produce more uncertain estimates.
    Researchers from the Universities of Birmingham, Princeton, Reading, Barcelona and Bergen published their findings today in Science Advances. They reveal that expanding models without checking how extra detail adds uncertainty limits the models’ usefulness as tools to inform policy decisions in the real world.
    Arnald Puy, Associate Professor in Social and Environmental Uncertainties at the University of Birmingham, commented: “As science keeps on unfolding secrets, models keep getting bigger — integrating new discoveries to better reflect the world around us. We assume that more detailed models produce better predictions because they better match reality.
    “And yet pursuing ever-complex models may not deliver the results we seek, because adding new parameters brings new uncertainties into the model. These new uncertainties pile on top of the uncertainties already there at every model upgrade stage, making the model’s output fuzzier at every step of the way.”
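    The mechanism Puy describes can be reproduced in a toy Monte Carlo experiment: every extra uncertain parameter widens the spread of the output. The model below is entirely made up, a product of multiplicative factors each known only to within ten percent, purely to show the effect.
    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_samples = 100_000

    # Made-up model: output = product of k uncertain factors, each ~N(1, 0.1).
    # More parameters -> wider output distribution, with nothing else changed.
    for k in (1, 3, 5, 10):
        factors = rng.normal(1.0, 0.10, size=(n_samples, k))
        output = factors.prod(axis=1)
        print(f"{k:2d} uncertain parameters -> output std = {output.std():.3f}")
    ```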
    This tendency toward less accurate results affects any model that lacks training or validation data against which to check its output — including global models of climate change, hydrology, food production and epidemiology, as well as models projecting estimates into the future, regardless of the scientific field.
    Researchers recommend that the drive to produce increasingly detailed mathematical models as a means to get sharper estimates should be reassessed.
    “We suggest that modelers should calculate the model’s effective dimensions (the number of influential parameters and their highest-order interaction) before making the model more complex. This makes it possible to check how added model complexity affects the uncertainty in the output. Such information is especially valuable for models aiming to play a role in policy making,” added Dr. Puy. “Both modelers and policy makers benefit from understanding any uncertainty generated when a model is upgraded with novel mechanisms.
    “Modelers tend not to submit their models to uncertainty and sensitivity analysis but keep on adding detail. Not many scholars are interested in running such an analysis on their model if it risks showing that the emperor has no clothes and that its allegedly sharp estimates are just a mirage.”
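    The sensitivity analysis Puy refers to is typically variance-based. Below is a minimal sketch of first-order Sobol indices via the pick-freeze estimator, for a made-up four-parameter model; an index near zero flags a parameter whose detail the output barely uses, which is how effective dimensions can be counted in practice.
    ```python
    import numpy as np

    # First-order Sobol indices by the pick-freeze (Saltelli) estimator for a
    # made-up model: two influential inputs, two nearly irrelevant ones.
    rng = np.random.default_rng(7)
    d, n = 4, 100_000

    def model(x):
        return x[:, 0] + 2.0 * x[:, 1] + 0.05 * x[:, 2] * x[:, 3]

    A = rng.uniform(0, 1, (n, d))
    B = rng.uniform(0, 1, (n, d))
    fA, fB = model(A), model(B)
    var_y = np.var(np.concatenate([fA, fB]))

    for i in range(d):
        AB = A.copy()
        AB[:, i] = B[:, i]  # replace only the i-th column ("pick-freeze")
        s_i = np.mean(fB * (model(AB) - fA)) / var_y
        print(f"S_{i + 1} = {s_i:.3f}")
    ```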
    Excess complexity also prevents scholars and the public alike from scrutinizing the appropriateness of the models’ assumptions, which are often highly questionable. Puy and his team note, for example, that global hydrological models assume that irrigation optimises crop production and water use — a premise at odds with the practices of traditional irrigators.
    Story Source:
    Materials provided by University of Birmingham. Note: Content may be edited for style and length.