More stories

  •

    Scientists develop 'optimal strategies' computer model that could significantly reduce future COVID-19 infections and deaths

    A team of scientists from Nanyang Technological University, Singapore (NTU Singapore) has developed a predictive computer model that, when tested on real pandemic data, proposed strategies that would have reduced the rate of both COVID-19 infections and deaths by an average of 72 per cent, based on a sample from four countries.
    The model, called NSGA-II, could be used to alert local governments in advance to possible surges in COVID-19 infections and deaths, giving them time to put relevant countermeasures in place more rapidly.
    By testing NSGA-II in four Asian countries using data available from 1 January 2020 to 31 December 2020, the team demonstrated that it could have helped reduce the number of COVID-19 infections and deaths by up to 76 per cent in Japan, 65 per cent in South Korea, 59 per cent in Pakistan, and 89 per cent in Nepal.
    The computer model achieved this result by recommending timely, country-specific advice on the optimal application and duration of COVID-19 interventions, such as home quarantine, social distancing, and personal protective measures, that would help blunt the pandemic's negative impact.
    The team also showed that NSGA-II could predict daily increases in confirmed COVID-19 cases and deaths with high accuracy, at a 95 per cent confidence level, when compared against the actual figures recorded in the four countries over the past year.
    Harnessing the power of machine learning, the research team developed NSGA-II by feeding it large amounts of worldwide COVID-19 infection and mortality data covering the whole of 2020, helping it learn the dynamics of the pandemic. The research was reported in the peer-reviewed scientific journal Sustainable Cities and Society in August. More
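    NSGA-II (the Non-dominated Sorting Genetic Algorithm II) is a standard multi-objective optimizer, which suits the problem described here: driving down infections and deaths while also keeping the burden of interventions low. As a rough illustration of that trade-off only, and not the NTU team's actual model, the sketch below runs NSGA-II from the pymoo library on an invented toy transmission model with invented cost weights.

    ```python
    # A minimal sketch of multi-objective intervention planning with NSGA-II,
    # using the pymoo library. The epidemic dynamics and cost weights are
    # invented for illustration; they are not the NTU model.
    import numpy as np
    from pymoo.algorithms.moo.nsga2 import NSGA2
    from pymoo.core.problem import ElementwiseProblem
    from pymoo.optimize import minimize

    class InterventionProblem(ElementwiseProblem):
        def __init__(self):
            # Three decision variables in [0, 1]: strength of home quarantine,
            # social distancing, and personal protective measures.
            super().__init__(n_var=3, n_obj=2, xl=0.0, xu=1.0)

        def _evaluate(self, x, out, *args, **kwargs):
            # Toy dynamics: each measure multiplicatively reduces transmission.
            r_eff = 2.5 * (1 - 0.5 * x[0]) * (1 - 0.4 * x[1]) * (1 - 0.3 * x[2])
            infections = np.exp(r_eff)     # crude proxy for outbreak size
            cost = float(x.sum())          # crude proxy for socio-economic burden
            out["F"] = [infections, cost]  # both objectives are minimized

    res = minimize(InterventionProblem(), NSGA2(pop_size=50),
                   ("n_gen", 100), seed=1, verbose=False)
    print(res.F[:5])  # sample of the Pareto front: infections vs. cost
    ```

    Each point on the resulting Pareto front is an intervention mix that cannot improve on one objective without worsening the other, which is the sense in which such recommendations are "optimal."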

  •

    Australian fires in 2019–2020 had even more global reach than previously thought

    The devastating wildfires that raged across southeastern Australia in late 2019 and early 2020 packed a powerful punch that extended far beyond the country, two new studies find.

    The blazes injected at least twice as much carbon dioxide into the atmosphere as was previously thought, one team’s satellite-derived estimates revealed. The fires also sent up vast clouds of smoke and ash that wafted far to the east over the Southern Ocean, fertilizing the waters with nutrients and triggering widespread blooms of microscopic marine algae called phytoplankton, another team found. Both studies were published online September 15 in Nature.

    Meteorologist Ivar van der Velde of the SRON Netherlands Institute for Space Research in Leiden and colleagues first examined carbon monoxide data collected over southeastern Australia by the satellite-based instrument TROPOMI from November 2019 to January 2020, during the worst of the fires. Then, to get new estimates of the carbon dioxide emissions attributable to the fires, the team used previously determined ratios of carbon monoxide to carbon dioxide emitted by the region’s eucalyptus forests — the predominant type of forest that was scorched in the blazes — during earlier wildfires and prescribed burns.
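    In essence, the approach scales a gas that the satellite measures well (carbon monoxide) by a known emission ratio to estimate the gas of interest (carbon dioxide). A back-of-envelope sketch of that arithmetic, with hypothetical numbers rather than the study's values:

    ```python
    # Ratio method, toy version: multiply satellite-derived CO emissions by a
    # previously measured CO2-to-CO emission ratio for eucalyptus-forest fires.
    # Both numbers below are illustrative assumptions, not the study's data.
    co_emitted_tg = 40.0      # assumed total CO from the fires, in Tg (10^12 g)
    co2_to_co_ratio = 15.0    # assumed mass ratio for this fuel type
    co2_estimate_tg = co_emitted_tg * co2_to_co_ratio
    print(f"Estimated CO2 emissions: {co2_estimate_tg:.0f} Tg")  # 600 Tg
    ```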

    Van der Velde’s team estimates that the fires released from 517 trillion to 867 trillion grams of carbon dioxide to the atmosphere. “The sheer magnitude of CO2 that was emitted to the atmosphere … was much larger than what we initially thought it would be,” van der Velde says. The emissions “from this single event were significantly higher than what all Australians normally emit with the combustion of fossil fuels in an entire year.”

    Previous assessments of CO2 emissions from the fires, based on estimates of burned area and biomass consumed by the blazes, calculated an average of about 275 trillion grams. Using the satellite-derived carbon monoxide data, the researchers say, dramatically improves the ability to distinguish the fires' actual emissions from other background sources of the gases, giving a more accurate assessment.

    That finding has worrisome implications. The fires swiftly cut a swath through southeastern Australia’s eucalyptus forests, devastating the forests to a degree that made their rapid recovery more difficult — which in turn affects how much carbon the trees can sequester, van der Velde says (SN: 3/9/21). Fires in northern and central Australia’s dry, grassy savannas are seen as more climate neutral because the grasses can regrow more quickly, he says.

    And severe fire seasons are likely to become more common in southeastern Australia with ongoing climate change. Climate change has already increased the likelihood of severe fire events such as the 2019–2020 fire season by at least 30 percent (SN: 3/4/20).

    The smoke and ash from the fires also packed a powerful punch. Scientists watched in awe as the fires created a “super outbreak” of towering thunderclouds from December 29 to December 31 in 2019 (SN: 12/15/20). These clouds spewed tiny aerosol particles of ash and smoke high into the stratosphere.

    Aerosols from the fires also traveled eastward through the lower atmosphere, ultimately reaching the Southern Ocean where they triggered blooms of phytoplankton in its iron-starved waters. Geochemist Weiyi Tang, now at Princeton University, and colleagues analyzed aerosols from the fires and found the particles to be rich in iron, an important nutrient for the algae. By tracing the atmospheric paths of the cloud of ash and smoke across the ocean, the team was able to link the observed blooms — huge patches of chlorophyll detected by satellite — to the fires.

    A satellite image snapped on January 6, 2020, shows smoke from southeastern Australia’s wildfires wafting eastward over the Southern Ocean. Credit: Japan’s National Institute of Information and Communications Technology

    Researchers have long thought that fires can trigger ocean blooms, particularly in the Southern Ocean, under the right conditions, says marine biogeochemist Joan Llort, now at the Barcelona Supercomputing Center and a coauthor on the study. But this research marks the most direct observation ever made of such an event — in part because it was such a massive one, Llort says.

    Large ocean blooms are “yet another process which is potentially being modified by climate change,” says biogeochemist Nicolas Cassar of Duke University, also a coauthor on the study.

    One of the big questions to emerge from the study, Cassar adds, is just how much carbon these phytoplankton may have ultimately removed from the atmosphere as they bloomed. Some of the carbon that the algae draw out of the air through photosynthesis sinks with them to the seafloor as they die. But some of it is quickly respired back to the atmosphere, muting any mitigating effect that the blooms might have on the wildfire emissions. To really assess what role the algae play, he says, would require a rapid-response team aboard an ocean vessel that could measure these chemical processes as they are happening.

    The sheer size of this wildfire-triggered bloom — “larger than Australia itself” — shows that “wildfires have the potential to increase marine productivity by very large amounts,” says Douglas Hamilton, a climate scientist at Cornell University who was not connected with the study.

    “The impact of fires on society is not straightforward,” Hamilton adds. The same smoke that can cause severe health impacts when inhaled “is also supplying nutrients to ecosystems and helping support marine food webs.” What this study demonstrates, he adds, is that to understand how future increases in fire activity might help shape the future of marine productivity “it is crucial that we monitor the impacts closely now.” More

  •

    New DNA-based chip can be programmed to solve complex math problems

    The field of DNA computing has evolved by leaps and bounds since it was first proposed nearly 30 years ago. But most DNA computing processes are still performed manually, with reactants added by hand, step by step. Now, scientists at Incheon National University, Korea, have found a way to automate DNA calculations by developing a unique chip that can be controlled by a personal computer.
    The term ‘DNA’ immediately calls to mind the double-stranded helix that contains all our genetic information. But the individual units of its two strands are pairs of molecules that bond with each other in a selective, complementary fashion. As it turns out, one can take advantage of this pairing property to perform complex mathematical calculations, and this forms the basis of DNA computing.
    Since DNA has only two strands, performing even a simple calculation requires multiple chemical reactions using different sets of DNA. In most existing research, the DNA for each reaction is added manually, one by one, into a single reaction tube, which makes the process very cumbersome. Microfluidic chips, which consist of narrow channels etched onto a material like plastic, offer a way to automate the process. But despite their promise, the use of microfluidic chips for DNA computing remains underexplored.
    In a recent article — made available online in ACS Nano on 7 July 2021 and published in Volume 15 Issue 7 of the journal on 27 July 2021 — a team of scientists from Incheon National University (INU), Korea, present a programmable DNA-based microfluidic chip that can be controlled by a personal computer to perform DNA calculations. “Our hope is that DNA-based CPUs will replace electronic CPUs in the future because they consume less power, which will help with global warming. DNA-based CPUs also provide a platform for complex calculations like deep learning solutions and mathematical modelling,” says Dr. Youngjun Song from INU, who led the study.
    Dr. Song and team used 3D printing to fabricate their microfluidic chip, which can execute Boolean logic, one of the fundamental logic systems of computer programming. Boolean logic is a type of true-or-false logic that compares inputs and returns a value of ‘true’ or ‘false’ depending on the type of operation, or ‘logic gate,’ used. The logic gate in this experiment consisted of a single-stranded DNA template. Different single-stranded DNAs were then used as inputs. If part of an input DNA had a complementary Watson-Crick sequence to the template DNA, it paired to form double-stranded DNA. The output was considered true or false based on the size of the final DNA.
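    A simplified in-silico picture of that hybridization logic is sketched below; the sequences and the size threshold are invented for illustration and are not from the INU paper.

    ```python
    # Toy model of a DNA AND gate: inputs "bind" the template if their
    # reverse complement appears in it, and the gate reads true when the
    # bound product is long enough. All sequences here are hypothetical.
    COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

    def revcomp(seq):
        """Reverse complement: the sequence a strand pairs with."""
        return "".join(COMPLEMENT[b] for b in reversed(seq))

    def gate_output(template, inputs, size_threshold):
        # Sum the lengths of the inputs that find a complementary site.
        bound = sum(len(s) for s in inputs if revcomp(s) in template)
        return bound >= size_threshold   # "true" if the duplex is big enough

    template = "ATGCGTACGTTAGC"          # hypothetical single-stranded template
    in_a = revcomp("ATGCGTA")            # complementary to the first half
    in_b = revcomp("CGTTAGC")            # complementary to the second half
    print(gate_output(template, [in_a, in_b], 14))  # True  (both inputs bind)
    print(gate_output(template, [in_a], 14))        # False (only one binds)
    ```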
    What makes the designed chip extraordinary is a motor-operated valve system that can be operated using a PC or smartphone. The chip and software set-up together form a microfluidic processing unit (MPU). Thanks to the valve system, the MPU could perform a series of reactions to execute a combination of logic operations in a rapid and convenient manner.
    This unique valve system of the programmable DNA-based MPU paves the way for more complex cascades of reactions that can code for extended functions. “Future research will focus on a total DNA computing solution with DNA algorithms and DNA storage systems,” says Dr. Song.
    Story Source:
    Materials provided by Incheon National University. More

  •

    Finding a metal-oxide needle in a periodic table haystack

    I went to Caltech, and all I got was this T-shirt … and a new way to discover complex and interesting materials.
    Coupling computer automation with an ink-jet printer originally used to print T-shirt designs, researchers at Caltech and Google have developed a high-throughput method of identifying novel materials with interesting properties. In a trial run of the process, they screened hundreds of thousands of possible new materials and discovered one made from cobalt, tantalum, and tin that has tunable transparency and acts as a good catalyst for chemical reactions while remaining stable in strong acid electrolytes.
    The effort, described in a scientific article published in Proceedings of the National Academy of Sciences (PNAS), was led by John Gregoire and Joel Haber of Caltech, and Lusann Yang of Google. It builds on research conducted at the Joint Center for Artificial Photosynthesis (JCAP), a Department of Energy (DOE) Energy Innovation Hub at Caltech, and continues with JCAP’s successor, the Liquid Sunlight Alliance (LiSA), a DOE-funded effort that aims to streamline the complicated steps needed to convert sunlight into fuels, to make that process more efficient.
    Creating new materials is not as simple as dropping a few different elements into a test tube and shaking it up to see what happens. You need the elements that you combine to bond with each other at the atomic level to create something new and different rather than just a heterogeneous mixture of ingredients. With a nearly infinite number of possible combinations of the various squares on the periodic table, the challenge is knowing which combinations will yield such a material.
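    To get a feel for the scale: even restricted to three-element mixtures drawn from roughly 100 usable elements, with compositions varied in coarse 10 per cent steps, the number of candidates already runs into the millions. The counting below is purely illustrative and is not the study's actual search space.

    ```python
    # Rough count of candidate materials: choose 3 of ~100 elements, then
    # assign compositions in 10% steps that sum to 100% (each at least 10%).
    from math import comb

    triples = comb(100, 3)          # 161,700 ways to pick the three elements
    compositions = sum(1 for a in range(10, 100, 10)
                         for b in range(10, 100 - a, 10)
                         if 100 - a - b >= 10)
    print(triples * compositions)   # 161,700 * 36 = 5,821,200 candidates
    ```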
    “Materials discovery can be a bleak process. If you can’t predict where to find the desired properties, you could spend your entire career mixing random elements and never find anything interesting,” says Gregoire, research professor of applied physics and materials science, researcher at JCAP, and LiSA team lead.
    When combining a small number of individual elements, materials scientists can often make predictions about what properties a new material might have based on its constituent parts. However, that process quickly becomes untenable when more complicated mixtures are made. More

  •

    Potty-trained cattle could help reduce pollution

    You can lead a cow to a water closet, but can you make it pee there? It turns out that yes, you can.

    Researchers in Germany successfully trained cows to use a small, fenced-in area with artificial turf flooring as a bathroom stall. This could allow farms to easily capture and treat cow urine, which often pollutes air, soil and water, researchers report online September 13 in Current Biology. Components of that urine, such as nitrogen and phosphorus, could also be used to make fertilizer (SN: 4/6/21).

    The average cow can pee tens of liters per day, and there are some 1 billion cattle worldwide. In barns, cow pee typically mixes with poop on the floor to create a slurry that emits the air pollutant ammonia (SN: 1/4/19). Out in pastures, cow pee can leach into nearby waterways and release the potent greenhouse gas nitrous oxide (SN: 6/9/14).

    “I’m always of the mind, how can we get animals to help us in their management?” says Lindsay Matthews, a self-described cow psychologist who studies animal behavior at the University of Auckland in New Zealand. Matthews and colleagues set out to potty train 16 calves, which had the free time to learn a new skill. “They’re not so involved with milking and other systems,” he says. “They’re basically just hanging out, eating a bit of food, socializing and resting.”

    Matthews was optimistic about the cows’ potty-training prospects. “I was convinced that we could do it,” he says. Cows “are much, much smarter than people give them credit for.” Each calf got 45 minutes of what the team calls “MooLoo training” per day. At first, the researchers enclosed the calves inside the makeshift bathroom stall and fed the animals a treat every time they peed.

    Once the calves made the connection between using the bathroom stall and receiving a treat, the team positioned the calves in a hallway leading to the stall. Whenever animals visited the little cows’ room, they got a treat; whenever calves peed in the hallway, the team spritzed them with water. “We had 11 of the 16 calves [potty trained] within about 10 days,” Matthews says. The remaining cows “are probably trainable too,” he adds. “It’s just that we didn’t have enough time.”

    Video: Researchers successfully trained 11 calves, such as this one, to urinate in a bathroom stall. Once the cow relieved itself, a window in the stall opened, dispensing a molasses mixture as a treat. Toilet training cows on a large scale and collecting their urine to make fertilizer could cut down on agricultural pollution, the team says.

    Lindsay Whistance, a livestock researcher at the Organic Research Centre in Cirencester, England, is “not surprised by the results.” With proper training and motivation, “I fully expected cattle to be able to learn this task,” says Whistance, who was not involved in the study. The practicality of potty training cows on a large scale, she says, is another matter.

    For MooLoo training to become a widespread practice, “it has to be automated,” Matthews says. “We want to develop automated training systems, automated reward systems.” Those systems are still far from reality, but Matthews and colleagues hope they could have big impacts. If 80 percent of cow pee were collected in latrines, for instance, that could cut associated ammonia emissions in half, previous research suggests.

    “It’s those ammonia emissions that are key to the real environmental benefit, as well as potential for reducing water contamination,” says Jason Hill, a biosystems engineer at the University of Minnesota in St. Paul not involved in the work. “Ammonia from cattle is a major contributor to reduced human health,” he says (SN: 1/16/09). So potty training cattle could help create cleaner air — as well as a cleaner, more comfortable living space for cows themselves. More

  •

    Engineers create 3D-printed objects that sense how a user is interacting with them

    MIT researchers have developed a new method to 3D print mechanisms that detect how force is being applied to an object. The structures are made from a single piece of material, so they can be rapidly prototyped. A designer could use this method to 3D print “interactive input devices,” like a joystick, switch, or handheld controller, in one go.
    To accomplish this, the researchers integrated electrodes into structures made from metamaterials, which are materials divided into a grid of repeating cells. They also created editing software that helps users build these interactive devices.
    “Metamaterials can support different mechanical functionalities. But if we create a metamaterial door handle, can we also know that the door handle is being rotated, and if so, by how many degrees? If you have special sensing requirements, our work enables you to customize a mechanism to meet your needs,” says co-lead author Jun Gong, a former visiting PhD student at MIT who is now a research scientist at Apple.
    Gong wrote the paper alongside fellow lead authors Olivia Seow, a graduate student in the MIT Department of Electrical Engineering and Computer Science (EECS), and Cedric Honnet, a research assistant in the MIT Media Lab. Other co-authors are MIT graduate student Jack Forman and senior author Stefanie Mueller, who is an associate professor in EECS and a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL). The research will be presented at the Association for Computing Machinery Symposium on User Interface Software and Technology next month.
    “What I find most exciting about the project is the capability to integrate sensing directly into the material structure of objects. This will enable new intelligent environments in which our objects can sense each interaction with them,” Mueller says. “For instance, a chair or couch made from our smart material could detect the user’s body when the user sits on it and either use it to query particular functions (such as turning on the light or TV) or to collect data for later analysis (such as detecting and correcting body posture).”
    Embedded electrodes
    Because metamaterials are made from a grid of cells, when the user applies force to a metamaterial object, some of the flexible, interior cells stretch or compress. More
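    One hedged way to picture the sensing: two electrodes printed on opposite walls of a flexible cell form a capacitor, and squeezing the cell narrows the gap between them, raising the capacitance. The sketch below uses that textbook parallel-plate relationship with invented dimensions; it illustrates the principle, not the MIT implementation.

    ```python
    # Parallel-plate picture of one sensing cell: capacitance rises as the
    # gap shrinks under load. All dimensions are illustrative assumptions.
    EPS_0 = 8.854e-12                    # vacuum permittivity, F/m
    PLATE_AREA = 1e-4                    # assumed electrode area, m^2 (1 cm^2)
    REST_GAP = 2e-3                      # assumed unloaded plate gap, m (2 mm)

    def cell_capacitance(gap_m):
        return EPS_0 * PLATE_AREA / gap_m

    def estimate_compression(c_measured):
        """Invert a capacitance reading to recover how far the cell squeezed."""
        gap = EPS_0 * PLATE_AREA / c_measured
        return REST_GAP - gap

    c_pressed = cell_capacitance(1.5e-3)   # user presses: gap shrinks by 0.5 mm
    print(f"{estimate_compression(c_pressed) * 1e3:.2f} mm of compression")
    ```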

  •

    Scientists can now assemble entire genomes on their personal computers in minutes

    Scientists at the Massachusetts Institute of Technology (MIT) and the Institut Pasteur in France have developed a technique for reconstructing whole genomes, including the human genome, on a personal computer. This technique is about a hundred times faster than current state-of-the-art approaches and uses one-fifth the resources. The study, published September 14 in the journal Cell Systems, describes a more compact representation of genome data inspired by the way words, rather than letters, offer condensed building blocks for language models.
    “We can quickly assemble entire genomes and metagenomes, including microbial genomes, on a modest laptop computer,” says Bonnie Berger, the Simons Professor of Mathematics at the Computer Science and AI Lab at MIT and an author of the study. “This ability is essential in assessing changes in the gut microbiome linked to disease and bacterial infections, such as sepsis, so that we can more rapidly treat them and save lives.”
    Genome assembly projects have come a long way since the Human Genome Project, which finished assembling the first complete human genome in 2003 for the cost of about $2.7 billion and more than a decade of international collaboration. But while human genome assembly projects no longer take years, they still require several days and massive computer power. Third-generation sequencing technologies offer terabytes of high-quality genomic sequences with tens of thousands of base pairs, yet genome assembly using such an immense quantity of data has proved challenging.
    To approach genome assembly more efficiently than current techniques, which involve comparing all possible pairs of reads, Berger and colleagues turned to language models. Building from the concept of a de Bruijn graph, a simple, efficient data structure used for genome assembly, the researchers developed a minimizer-space de Bruijn graph (mdBG), which uses short sequences of nucleotides called minimizers instead of single nucleotides.
    “Our minimizer-space de Bruijn graphs store only a small fraction of the total nucleotides, while preserving the overall genome structure, enabling them to be orders of magnitude more efficient than classical de Bruijn graphs,” says Berger.
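    The core trick can be shown in a few lines: slide a window along a read, keep only the lexicographically smallest k-mer in each window (the minimizer), and build the graph over those minimizers instead of over every nucleotide. The toy below is a drastically simplified sketch with invented parameters, not the team's actual software.

    ```python
    # Toy minimizer-space sketch: represent a read by its window minimizers,
    # then link consecutive minimizer tuples as graph edges. Parameters
    # (k, w, tuple order) are invented for illustration.
    def minimizers(seq, k=3, w=4):
        """Smallest k-mer in each sliding window of w k-mers."""
        kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
        mins = []
        for i in range(len(kmers) - w + 1):
            m = min(kmers[i:i + w])
            if not mins or m != mins[-1]:
                mins.append(m)       # record each minimizer once per run
        return mins

    def mdbg_edges(seq, order=2):
        """Edges between consecutive `order`-tuples of minimizers, the
        minimizer-space analogue of de Bruijn graph edges."""
        mins = minimizers(seq)
        nodes = [tuple(mins[i:i + order]) for i in range(len(mins) - order + 1)]
        return list(zip(nodes, nodes[1:]))

    read = "ACGTTGCATGCGATTACGCGT"
    print(minimizers(read))   # the read reduced to a short list of minimizers
    print(mdbg_edges(read))
    ```

    Because only the minimizers are stored, the graph keeps a small fraction of the nucleotides while preserving the read's overall structure, which is where the memory savings come from.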
    The researchers applied their method to assemble real HiFi data (which has almost perfect single-molecule read accuracy) for Drosophila melanogaster fruit flies, as well as human genome data provided by Pacific Biosciences (PacBio). When they evaluated the resulting genomes, Berger and colleagues found that their mdBG-based software required about 33 times less time and 8 times less random-access memory (RAM) than other genome assemblers. Their software performed genome assembly for the HiFi human data 81 times faster with 18 times less memory usage than the Peregrine assembler, and 338 times faster with 19 times less memory usage than the hifiasm assembler. More

  •

    New ocean temperature data help scientists make their hot predictions

    We’ve heard that rising temperatures will lead to rising sea levels, but what many may not realise is that most of the increase in energy in the climate system is occurring in the ocean.
    Now a study from UNSW Sydney and CSIRO researchers has shown that a relatively new ocean temperature measuring program — the Argo system of profiling floats — can help tell us which climate modelling for the 21st century we should be paying attention to the most.
    Professor John Church from UNSW’s Climate Change Research Centre in the School of Biological, Earth and Environmental Sciences says the study, published today in Nature Climate Change, attempts to narrow the projected range of ocean temperature rise to the end of the 21st century by using the model simulations that are most consistent with Argo’s findings from 2005 to 2019.
    “The models that projected very high absorption of heat by the ocean by 2100 also have unrealistically high ocean absorption over the Argo period of measurement,” Prof. Church says.
    “Likewise, there are models with lower heat absorption in the future that also don’t correspond to the Argo data. So we have effectively used the Argo observations to say, ‘which of these models best agree with the observations and therefore constrain projections for the future?'”
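    The selection logic itself is simple. A hedged sketch with invented numbers (not the study's models or values): keep only the simulations whose ocean heat uptake over the Argo era falls within observational uncertainty, then read the narrowed projection range off the survivors.

    ```python
    # Observational constraint, toy version. Each model: (simulated ocean heat
    # uptake over 2005-2019, projected ocean warming by 2100), in arbitrary
    # units. All values are invented for illustration.
    models = {
        "model_A": (0.55, 2.9),
        "model_B": (0.72, 3.8),
        "model_C": (0.60, 3.1),
        "model_D": (0.95, 4.9),   # absorbs unrealistically much heat since 2005
        "model_E": (0.40, 2.2),   # absorbs unrealistically little
    }
    argo_obs, obs_err = 0.62, 0.10    # assumed observed uptake and uncertainty

    consistent = {name: proj for name, (uptake, proj) in models.items()
                  if abs(uptake - argo_obs) <= obs_err}
    print(sorted(consistent))                            # models kept: A, B, C
    lo, hi = min(consistent.values()), max(consistent.values())
    print(f"Constrained 2100 range: {lo:.1f}-{hi:.1f}")  # 2.9-3.8, not 2.2-4.9
    ```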
    Named after the ship on which the Greek mythological hero Jason sailed in search of the Golden Fleece, the Argo floats carry high-tech equipment that measures ocean temperatures at depths of up to 2000 metres. More