More stories

  • ‘Zombie’ forest fires may become more common with climate change

    Winter usually kills most forest fires. But in the boreal woods that encircle the far North, some fires, like zombies, just don’t die. 

    The first broad scientific look at overwintering “zombie fires” reveals these rare occurrences can flare up the year after warmer-than-normal summers and account for up to 38 percent of the total burn area in some regions, researchers report online May 19 in Nature. As climate change accelerates in boreal forests, the frequency of zombie fires could rise and exacerbate warming by releasing more greenhouse gases from the region’s soils, which may house twice as much carbon as Earth’s atmosphere (SN: 4/11/19).

    Zombie fires hibernate underground. Blanketed by snow, they smolder through the cold, surviving on the carbon-rich fuel of peat and boreal soil and moving very slowly — just 100 to 500 meters over the winter. Come spring, the fires reemerge near the forest they previously charred, burning fresh fuel well before the traditional fire season starts. Until now, these zombie fires have remained relatively mysterious to science, known mostly from firefighter anecdotes.

    Strange coincidences on satellite images, however, got the attention of earth systems scientist Rebecca Scholten and her colleagues. “My adviser noticed that some years, new fires were starting very close to the previous year’s fire,” says Scholten, of Vrije Universiteit Amsterdam. This is unusual, she says, since boreal fires are usually sparked by random lightning or human activity. Local fire managers confirmed that these were the same fires, prompting the researchers to wonder just how often fires overwinter.

    To find evidence of underground fires, the researchers combined firefighter reports with satellite images of Alaska and northern Canada captured from 2002 to 2018. They looked for blazes that started close to the scars left the previous year and that began before midsummer, when lightning-sparked fires usually occur.
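
    The spatial and temporal criteria lend themselves to a simple illustration. The sketch below is a minimal, hypothetical version of such a filter, not the study’s actual method; the one-kilometer distance threshold, the July 1 cutoff and the data structures are assumptions made for the example.

    ```python
    from dataclasses import dataclass
    from datetime import date
    from math import hypot

    @dataclass
    class Fire:
        x_km: float   # location in projected map coordinates (hypothetical)
        y_km: float
        start: date   # ignition date seen in the satellite record

    def is_overwinter_candidate(new_fire, prior_year_scars,
                                max_dist_km=1.0, midsummer=(7, 1)):
        """Flag a fire as a possible holdover ('zombie') fire: it starts close
        to a scar from the previous season and before midsummer, when
        lightning-ignited fires normally begin."""
        cutoff = date(new_fire.start.year, *midsummer)
        near_old_scar = any(
            hypot(new_fire.x_km - s.x_km, new_fire.y_km - s.y_km) <= max_dist_km
            for s in prior_year_scars
        )
        return near_old_scar and new_fire.start < cutoff

    # A May ignition within about a kilometer of the previous year's burn scar
    scar = Fire(102.0, 348.5, date(2007, 7, 20))
    new = Fire(102.4, 348.9, date(2008, 5, 14))
    print(is_overwinter_candidate(new, [scar]))   # True
    ```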

    The team found that zombie fires are rare, accounting for 0.8 percent of the total area burned by forest fires in these regions over those 16 years, but with large year-to-year variability. In 2008, one zombie fire burned approximately 13,700 hectares in Alaska, about 38 percent of the area burned in the state that year. Zombie fires were more likely to occur, and to burn larger swaths of land, after warmer summers that allowed fires to reach deeper into the soil, the researchers found.

    Boreal forests are warming faster than the global average, and “we’re seeing more hot summers and more large fires and intense burning,” Scholten says. That might set the stage for zombie fires to play a bigger role.

    “This is a really welcome advance which could help fire management,” says Jessica McCarty, a geographer at Miami University in Oxford, Ohio, who wasn’t involved in the study. Understanding when zombie fires are more likely to occur could help firefighters identify these areas early, she says, protecting fragile landscapes that house a lot of climate warming gases.

    “Some of these soils are thousands of years old,” McCarty says. While “areas we thought were fire resistant are now fire prone” due to climate change, she says, better fire management can make a difference. “We’re not helpless.”

  • Rising energy demand for cooling

    Due to climate change, the average global temperature will rise in the coming decades. This is also expected to significantly increase the number of so-called cooling degree days, which measure the hours in which the ambient temperature exceeds the threshold above which a building must be cooled to keep the indoor temperature at a comfortable level. Rising values are likely to drive the installation of more AC systems in households, adding to an energy demand for cooling buildings that is already expected to grow with climate change and population growth.
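    As a rough illustration of that metric under the definition above, the short sketch below counts the hours in a day when the outdoor temperature exceeds a comfort threshold; the 22 °C threshold and the sample temperatures are arbitrary assumptions, not values from the Empa study.

    ```python
    def hours_above(hourly_temps_c, threshold_c=22.0):
        """Count the hours in which the outdoor temperature exceeds the comfort
        threshold; weighting each hour by the excess temperature instead gives
        the closely related degree-hour variants of the metric."""
        return sum(1 for t in hourly_temps_c if t > threshold_c)

    # One hypothetical summer day of hourly outdoor temperatures (degrees C)
    day = [18, 17, 17, 16, 16, 17, 19, 21, 23, 25, 27, 29,
           30, 31, 31, 30, 29, 27, 25, 24, 23, 21, 20, 19]
    print(hours_above(day))   # 13 hours above the 22 degree threshold
    ```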
    Nip-and-tuck race between heating and cooling
    To get a better understanding of how large this increase will be in Switzerland, Empa researchers analyzed the heating and cooling requirements of the NEST research and innovation building. “By including ambient temperatures, we were able to make a projection of the future thermal energy demand of buildings based on the climate scenarios for Switzerland. In addition to climate change, we also took population growth and the increasing use of AC devices into account,” explains Robin Mutschler, postdoc at Empa’s Urban Energy Systems lab.
    The results forecast a significant increase in the demand for cooling energy: In an extreme scenario in which the whole of Switzerland relied on air conditioning, almost as much energy would be needed for cooling as for heating by the middle of the century. In figures, this corresponds to about 20 terawatt hours (TWh) per year for heating and 17.5 TWh for cooling. The required cooling energy was calculated independently of the technology used to supply it. If it is provided by a heat pump running in reverse, with a coefficient of performance (COP) of 3 for cooling, the electricity demand for 17.5 TWh of cooling energy is about 5.8 TWh.
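    The conversion from cooling energy to electricity in that extreme scenario is straightforward arithmetic, reproduced here with the figures quoted above and the assumed COP of 3.

    ```python
    cooling_demand_twh = 17.5   # projected annual cooling demand, extreme scenario
    cop_cooling = 3.0           # assumed coefficient of performance in cooling mode

    electricity_twh = cooling_demand_twh / cop_cooling
    print(f"{electricity_twh:.1f} TWh of electricity per year")   # 5.8 TWh
    ```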
    The heating demand of the residential units of NEST is comparable to that of a modern apartment building. These figures are therefore representative if it is assumed that the average Swiss building is comparable to the NEST building; when this will be the case depends on the renovation rate. However, even in a more moderate scenario, the cooling demand in Switzerland will increase significantly. The researchers assume an additional energy demand of five TWh per year in this scenario.
    Strong impact on the Swiss energy system
    The energy demand of Swiss buildings today accounts for around 40 percent of the total energy demand, most of it for heating. That is likely to remain the case until at least the middle of the 21st century, while the energy demand for cooling buildings is expected to increase significantly. If thermal energy is provided by heat pumps that can also cool, this potentially has a strong impact on the overall energy system, and especially on electricity as an energy carrier.
    Only a small share of Swiss households is thought to currently own an AC unit or system. However, the number of houses with heat pumps is growing. The Empa researchers estimate that the number of households with cooling systems could rise to over 50 percent due to the increase in cooling degree days. An additional five TWh of energy demand for cooling would be equivalent to about two percent of today’s electricity demand if cooling is provided by heat pumps. In the more extreme scenario, the electricity demand for cooling could even approach ten percent of today’s total demand. This demand will not be evenly distributed throughout the year, but will correlate with hot periods, which can lead to demand peaks on hot days. On a positive note, cooling demand is relatively well matched by electricity production from photovoltaic systems. The impact of cooling residential buildings will be significantly higher than that of cooling office buildings, as residential buildings account for about two-thirds of the building area.
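    To see roughly where the ten percent figure for the extreme scenario comes from, the back-of-the-envelope calculation below combines the 17.5 TWh cooling demand and the COP of 3 from above with an assumed present-day Swiss electricity demand of about 58 TWh per year; that last figure is an assumption made here, not a number from the article.

    ```python
    cooling_demand_twh = 17.5       # extreme-scenario cooling demand (thermal)
    cop_cooling = 3.0               # heat pump COP for cooling, as above
    electricity_demand_twh = 58.0   # assumed annual Swiss electricity demand

    share = (cooling_demand_twh / cop_cooling) / electricity_demand_twh
    print(f"~{share:.0%} of today's electricity demand")   # ~10%
    ```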
    Based on these findings, it is evident to the researchers that these developments must be taken into account when constructing new buildings and that the possibilities of passive cooling must be fully exploited. “Building architecture should no longer focus only on optimizing heat losses, especially in winter, but also on reducing heat gains in summer,” says Mutschler. This could be achieved, for example, through urban planning measures for climate adaptation at district level, the implementation of programs for heat reduction, or the reduction of glazing in buildings. “Moreover, it is crucial that policymakers also address this development and investigate ways to best meet the increasing cooling energy demand while minimizing the impact on the future decarbonized energy system,” Mutschler adds. One possible contribution to cooling buildings could come from district cooling systems, which have already been successfully implemented in Switzerland — for example in Geneva. Others are emerging, for instance in Zug.

  • AI predicts lung cancer risk

    An artificial intelligence (AI) program accurately predicts the risk that lung nodules detected on screening CT will become cancerous, according to a study published in the journal Radiology.
    Lung cancer is the leading cause of cancer death worldwide, with an estimated 1.8 million deaths in 2020, according to the World Health Organization. Low-dose chest CT is used to screen people at a high risk of lung cancer, such as longtime smokers. It has been shown to significantly reduce lung cancer mortality, primarily by helping to detect cancers at an early stage when they are easier to treat successfully.
    While lung cancer typically shows up as pulmonary nodules on CT images, most nodules are benign and do not require further clinical workup. Accurately distinguishing between benign and malignant nodules is therefore crucial to catch cancers early.
    For the new study, researchers developed an algorithm for lung nodule assessment using deep learning, an AI application capable of finding certain patterns in imaging data. The researchers trained the algorithm on CT images of more than 16,000 nodules, including 1,249 malignancies, from the National Lung Screening Trial. They validated the algorithm on three large sets of imaging data of nodules from the Danish Lung Cancer Screening Trial.
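    The article does not describe the network itself, but the general approach, a neural network that maps a CT patch around a nodule to a malignancy probability, can be sketched in a few lines. The following is a minimal, hypothetical illustration in PyTorch, not the authors’ model; the patch size, layers and channel counts are arbitrary, and the network is untrained.

    ```python
    import torch
    import torch.nn as nn

    class NoduleRiskNet(nn.Module):
        """Toy 3D CNN mapping a CT patch around a nodule to a malignancy
        probability. The architecture is illustrative only."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool3d(2),
                nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool3d(1),
            )
            self.classifier = nn.Linear(32, 1)

        def forward(self, x):
            h = self.features(x).flatten(1)
            return torch.sigmoid(self.classifier(h))   # risk estimate in [0, 1]

    model = NoduleRiskNet()
    patch = torch.randn(1, 1, 32, 32, 32)   # one 32-voxel cube around a nodule
    print(model(patch).item())               # arbitrary here, since the net is untrained
    ```

    In practice such a network would typically be trained with a binary classification loss on nodules labeled benign or malignant; the study used more than 16,000 nodules from the National Lung Screening Trial for that purpose.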
    The deep learning algorithm delivered excellent results, outperforming the established Pan-Canadian Early Detection of Lung Cancer model for lung nodule malignancy risk estimation. It performed comparably to 11 clinicians, including four thoracic radiologists, five radiology residents and two pulmonologists.
    “The algorithm may aid radiologists in accurately estimating the malignancy risk of pulmonary nodules,” said the study’s first author, Kiran Vaidhya Venkadesh, a Ph.D. candidate with the Diagnostic Image Analysis Group at Radboud University Medical Center in Nijmegen, the Netherlands. “This may help in optimizing follow-up recommendations for lung cancer screening participants.”
    The algorithm potentially brings several additional benefits to the clinic, the researchers said.
    “As it does not require manual interpretation of nodule imaging characteristics, the proposed algorithm may reduce the substantial interobserver variability in CT interpretation,” said senior author Colin Jacobs, Ph.D., assistant professor in the Department of Medical Imaging at Radboud University Medical Center in Nijmegen. “This may lead to fewer unnecessary diagnostic interventions, lower radiologists’ workload and reduce costs of lung cancer screening.”
    The researchers plan to continue improving the algorithm by incorporating clinical parameters like age, sex and smoking history.
    They are also working on a deep learning algorithm that takes multiple CT examinations as input. The current algorithm is highly suitable for analyzing nodules at the initial, or baseline, screening, but for nodules detected at subsequent screenings, growth and appearance in comparison to the previous CT are important.
    Dr. Jacobs and colleagues have developed other algorithms to reliably extract imaging features from the chest CT related to chronic obstructive pulmonary diseases and cardiovascular diseases. They will be investigating how to effectively integrate these imaging features into the current algorithm.

  • Spintronics: Improving electronics with finer spin control

    Spintronics is an emerging technology for manufacturing electronic devices that take advantage of electron spin and its associated magnetic properties, instead of the electrical charge of an electron, to carry information. Antiferromagnetic materials are attracting attention in spintronics because they promise spin operations with higher stability. Unlike ferromagnetic materials, in which atomic spins align in the same direction, as in a typical refrigerator magnet, the magnetic atoms inside antiferromagnets have antiparallel spin alignments that cancel out the net magnetization.
    Scientists have worked on controlling the alignment of magnetic atoms within antiferromagnetic materials to create magnetic switches. Conventionally, this has been done using a ‘field-cooling’ procedure, which heats and then cools a magnetic system containing an antiferromagnet while applying an external magnetic field. However, this process is poorly suited to many micro- or nanostructured spintronic devices, because its spatial resolution is not high enough for such small-scale structures.
    “We discovered that we can control the antiferromagnetic state by simultaneously applying mechanical vibration and a magnetic field,” says Jung-Il Hong of DGIST’s Spin Nanotech Laboratory. “The process can replace the conventional heating and cooling approach, which is both inconvenient and harmful to the magnetic material. We hope our new procedure will facilitate the integration of antiferromagnetic materials into spintronics-based micro- and nano-devices.”
    Hong and his colleagues combined two layers: a cobalt-iron-boron ferromagnetic film on top of an iridium manganese antiferromagnetic film. The layers were grown on piezoelectric ceramic substrates. Combined application of mechanical vibration and a magnetic field allowed the scientists to control the alignments of magnetic spins repeatedly along any direction desired.
    The team aims to continue the search and development of new magnetic phases beyond conventionally classified magnetic materials. “Historically, new material discovery has led to the development of new technologies,” says Hong. “We want our research work to be a seed for new technologies.”
    Story Source:
    Materials provided by DGIST (Daegu Gyeongbuk Institute of Science and Technology).

  • Engineers harvest WiFi signals to power small electronics

    With the rise of the digital age, the number of WiFi sources transmitting information wirelessly between devices has grown exponentially. This has made the 2.4 GHz radio frequency band that WiFi uses ubiquitous, with excess signals available to be tapped for alternative uses.
    To harness this under-utilised source of energy, a research team from the National University of Singapore (NUS) and Japan’s Tohoku University (TU) has developed a technology that uses tiny smart devices known as spin-torque oscillators (STOs) to harvest and convert wireless radio frequencies into energy to power small electronics. In their study, the researchers successfully harvested energy from WiFi-band signals to power a light-emitting diode (LED) wirelessly, without using any battery.
    “We are surrounded by WiFi signals, but when we are not using them to access the Internet, they are inactive, and this is a huge waste. Our latest result is a step towards turning readily-available 2.4GHz radio waves into a green source of energy, hence reducing the need for batteries to power electronics that we use regularly. In this way, small electric gadgets and sensors can be powered wirelessly by using radio frequency waves as part of the Internet of Things. With the advent of smart homes and cities, our work could give rise to energy-efficient applications in communication, computing, and neuromorphic systems,” said Professor Yang Hyunsoo from the NUS Department of Electrical and Computer Engineering, who spearheaded the project.
    The research was carried out in collaboration with the research team of Professor Guo Yong Xin, who is also from the NUS Department of Electrical and Computer Engineering, as well as Professor Shunsuke Fukami and his team from TU. The results were published in Nature Communications on 18 May 2021.
    Converting WiFi signals into usable energy
    Spin-torque oscillators are a class of emerging devices that generate microwaves and have applications in wireless communication systems. However, their adoption has been hindered by low output power and broad linewidth.

  • When one becomes two: Separating DNA for more accurate nanopore analysis

    A new software tool developed by Earlham Institute researchers will help bioinformaticians improve the quality and accuracy of their biological data, and avoid mis-assemblies. The fast, lightweight, user-friendly tool visualises genome assemblies and gene alignments from the latest next generation sequencing technologies.
    Called Alvis, the new visualisation tool examines mappings between DNA sequence data and reference genome databases. This allows bioinformaticians to more easily analyse their data generated from common genomics tasks and formats by producing efficient, ready-made vector images.
    First author and post-doctoral scientist at the Earlham Institute Dr Samuel Martin in the Leggett Group, said: “Typically, alignment tools output plain text files containing lists of alignment data. This is great for computer parsing and for being incorporated into a pipeline, but it can be difficult to interpret by humans.
    “Visualisation of alignment data can help us to understand the problem at hand. As a new technology, several new alignment formats have been implemented by new tools that are specific to nanopore sequencing technology.
    “We found that existing visualisation tools were not able to interpret these formats; Alvis can be used with all common alignment formats, and is easily extensible for future ones.”
    A key feature of the new command line tool is its unique ability to automatically highlight chimeric sequences — weak links in the DNA chain. This is where two sequences — from different parts of a genome or different species — are linked together by mistake to make one, affecting the data’s accuracy.
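    The underlying idea can be shown in miniature: a read whose alignments place substantial blocks of it on two different reference targets is a chimera candidate. The sketch below is a simplified, hypothetical version of that check, not Alvis’s actual heuristics; the block-size threshold and the data layout are assumptions.

    ```python
    def looks_chimeric(alignments, min_block_fraction=0.3):
        """Flag a read whose alignments place substantial blocks of it on two
        or more different reference targets.

        `alignments` holds (query_start, query_end, target_name) tuples for one
        read, with query coordinates given as fractions of the read length."""
        targets = {target for q_start, q_end, target in alignments
                   if q_end - q_start >= min_block_fraction}
        return len(targets) >= 2

    # The first half of the read maps to chromosome 1, the second half to a plasmid
    read_alignments = [(0.0, 0.48, "chr1"), (0.52, 1.0, "plasmid_A")]
    print(looks_chimeric(read_alignments))   # True
    ```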

  • New material could create 'neurons' and 'synapses' for new computers

    Classic computers use binary values (0/1) to perform calculations. By contrast, our brain cells can use more values to operate, making them more energy-efficient than computers. This is why scientists are interested in neuromorphic (brain-like) computing. Physicists from the University of Groningen (the Netherlands) have used a complex oxide to create elements comparable to the neurons and synapses in the brain using spins, a magnetic property of electrons. Their results were published on 18 May in the journal Frontiers in Nanotechnology.
    Although computers can do straightforward calculations much faster than humans, our brains outperform silicon machines in tasks like object recognition. Furthermore, our brain uses less energy than computers. Part of this can be explained by the way our brain operates: whereas a computer uses a binary system (with values 0 or 1), brain cells can provide more analogue signals with a range of values.
    Thin films
    The operation of our brains can be simulated in computers, but the basic architecture still relies on a binary system. That is why scientists look for ways to expand this, creating hardware that is more brain-like but will also interface with normal computers. ‘One idea is to create magnetic bits that can have intermediate states’, says Tamalika Banerjee, Professor of Spintronics of Functional Materials at the Zernike Institute for Advanced Materials, University of Groningen. She works on spintronics, which uses a magnetic property of electrons called ‘spin’ to transport, manipulate and store information.
    In this study, her PhD student Anouk Goossens, first author of the paper, created thin films of a ferromagnetic metal (strontium-ruthenate oxide, SRO) grown on a substrate of strontium titanate oxide. The resulting thin film contained magnetic domains that were perpendicular to the plane of the film. ‘These can be switched more efficiently than in-plane magnetic domains’, explains Goossens. By adapting the growth conditions, it is possible to control the crystal orientation in the SRO. Previously, out-of-plane magnetic domains have been made using other techniques, but these typically require complex layer structures.
    Magnetic anisotropy
    The magnetic domains can be switched using a current through a platinum electrode on top of the SRO. Goossens: ‘When the magnetic domains are oriented perfectly perpendicular to the film, this switching is deterministic: the entire domain will switch.’ However, when the magnetic domains are slightly tilted, the response is probabilistic: not all the domains are the same, and intermediate values occur when only part of the crystals in the domain have switched.
    By choosing variants of the substrate on which the SRO is grown, the scientists can control its magnetic anisotropy. This allows them to produce two different spintronic devices. ‘This magnetic anisotropy is exactly what we wanted’, says Goossens. ‘Probabilistic switching compares to how neurons function, while the deterministic switching is more like a synapse.’
    The scientists expect that in the future, brain-like computer hardware can be created by combining these different domains in a spintronic device that can be connected to standard silicon-based circuits. Furthermore, probabilistic switching would also allow for stochastic computing, a promising technology which represents continuous values by streams of random bits. Banerjee: ‘We have found a way to control intermediate states, not just for memory but also for computing.’
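    Stochastic computing, mentioned above, encodes a value between 0 and 1 as the fraction of 1s in a random bitstream, so a logical AND of two independent streams approximates multiplication. A minimal sketch of the idea:

    ```python
    import random

    def to_bitstream(p, n=10_000, rng=random.Random(42)):
        """Encode a probability p as a stream of n random bits with P(bit = 1) = p."""
        return [1 if rng.random() < p else 0 for _ in range(n)]

    a = to_bitstream(0.8)
    b = to_bitstream(0.4)
    product = [x & y for x, y in zip(a, b)]   # AND of independent streams
    print(sum(product) / len(product))        # about 0.32, i.e. 0.8 * 0.4
    ```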
    Story Source:
    Materials provided by University of Groningen.

  • Mathematical model predicts effect of bacterial mutations on antibiotic success

    Scientists have developed a mathematical model that predicts how the number and effects of bacterial mutations leading to drug resistance will influence the success of antibiotic treatments.
    Their model, described today in the journal eLife, provides new insights on the emergence of drug resistance in clinical settings and hints at how to design novel treatment strategies that help avoid this resistance occurring.
    Antibiotic resistance is a significant public health challenge, caused by changes in bacterial cells that allow them to survive drugs that are designed to kill them. Resistance often occurs through new mutations in bacteria that arise during the treatment of an infection. Understanding how this resistance emerges and spreads through bacterial populations is important to preventing treatment failure.
    “Mathematical models are a crucial tool for exploring the outcome of drug treatment and assessing the risk of the evolution of antibiotic resistance,” explains first author Claudia Igler, Postdoctoral Researcher at ETH Zurich, Switzerland. “These models usually consider a single mutation, which leads to full drug resistance, but multiple mutations that increase antibiotic resistance in bacteria can occur. So there are some mutations that lead to a high level of resistance individually, and some that provide a small level of resistance individually but can accumulate to provide high-level resistance.”
    For their study, Igler and her team gathered experimental evidence that drug resistance evolution follows these two patterns: a single mutation and multiple mutations. They then used this information to create an informed modelling framework which predicts the evolution of ‘single-step’ resistance versus ‘multi-step’ resistance in bacterial cells in response to drug type, pharmacokinetics (how the drug decays in the body), and treatment strategies. They investigated how the risk of treatment failure changes when taking into account multiple mutational steps, instead of a single one, and how many different bacterial lineages (bacteria with different mutations) would emerge during the treatment period.
    Using their model, the team found that the evolution of drug resistance is limited substantially if more than two mutations are required by the bacteria. Additionally, the extent of this limitation, and therefore the probability of treatment failure, depends strongly on the combination of the drug type and the route of administration, such as orally or via IV infusion.
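    The qualitative effect, that requiring more mutational steps sharply restricts the emergence of resistance, can be conveyed with a much-simplified toy simulation. The sketch below is illustrative only and is not the model from the paper; the population size, mutation rate, kill probabilities and number of generations are arbitrary assumptions.

    ```python
    import numpy as np

    def p_resistance(required_mutations, generations=40, n0=1_000_000,
                     mutation_rate=1e-6, max_kill=0.9, trials=200, seed=0):
        """Toy Monte Carlo of multi-step resistance evolution.

        Cells carrying k of K required mutations are killed each generation
        with probability max_kill * (1 - k / K), so each extra mutation gives
        partial protection; survivors double, and every daughter cell may gain
        one more mutation. Returns the fraction of trials in which fully
        resistant cells are present at the end of treatment."""
        K = required_mutations
        rng = np.random.default_rng(seed)
        kill = max_kill * (1 - np.arange(K + 1) / K)
        emerged = 0
        for _ in range(trials):
            pop = np.zeros(K + 1, dtype=np.int64)
            pop[0] = n0
            for _ in range(generations):
                pop = rng.binomial(pop, 1 - kill)        # drug kills susceptible cells
                pop *= 2                                  # survivors divide
                new_mut = rng.binomial(pop[:-1], mutation_rate)
                pop[:-1] -= new_mut                       # daughters gaining one mutation
                pop[1:] += new_mut                        # ...move up one resistance class
            if pop[-1] > 0:
                emerged += 1
        return emerged / trials

    # The chance of full resistance falls steeply as more mutations are required
    for k in (1, 2, 3):
        print(f"{k} mutation(s) required: P(resistance) ~ {p_resistance(k):.2f}")
    ```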
    “Our work provides a crucial step in understanding the emergence of antibiotic resistance in clinically relevant treatment settings,” says senior author Roland Regoes, Group Leader at ETH Zurich. “Together, our findings highlight the importance of measuring the level of antibiotic resistance granted by single mutations to help inform effective antimicrobial treatment strategies.”
    Story Source:
    Materials provided by eLife.