More stories

  • Ultrafast 'camera' captures hidden behavior of potential 'neuromorphic' material

    Imagine a computer that can think as fast as the human brain while using very little energy. That’s the goal of scientists seeking to discover or develop materials that can send and process signals as easily as the brain’s neurons and synapses. Identifying quantum materials with an intrinsic ability to switch between two (or more) distinct forms may hold the key to these futuristic-sounding “neuromorphic” computing technologies.
    In a paper just published in the journal Physical Review X, Yimei Zhu, a physicist at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory, and his collaborators describe surprising new details about vanadium dioxide, one of the most promising neuromorphic materials. Using data collected by a unique “stroboscopic camera,” the team captured the hidden trajectory of atomic motion as this material transitions from an insulator to a metal in response to a pulse of light. Their findings could help guide the rational design of high-speed and energy-efficient neuromorphic devices.
    “One way to reduce energy consumption in artificial neurons and synapses for brain-inspired computing is to exploit the pronounced non-linear properties of quantum materials,” said Zhu. “The principal idea behind this energy efficiency is that, in quantum materials, a small electrical stimulus may produce a large response that can be electrical, mechanical, optical, or magnetic through a change of material state.”
    “Vanadium dioxide is one of the rare, amazing materials that has emerged as a promising candidate for neuro-mimetic bio-inspired devices,” he said. It exhibits an insulator-metal transition near room temperature in which a small voltage or current can produce a large change in resistivity with switching that can mimic the behavior of both neurons (nerve cells) and synapses (the connections between them).
    “It goes from completely insulating, like rubber, to a very good metal conductor, with a resistivity change of 10,000 times or more,” Zhu said.
    Those two very different physical states, intrinsic to the same material, could be encoded for cognitive computing.
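    The switching behavior described above can be caricatured in a few lines of code. The sketch below is not the Brookhaven team's model; it is a minimal, hypothetical two-state resistor with hysteresis, using invented resistance values and voltage thresholds, meant only to illustrate how a roughly 10,000-fold resistance change between an insulating and a metallic state could respond to an applied voltage.
```python
# Cartoon of a volatile threshold switch with a ~10,000x resistance change.
# All resistances and voltage thresholds are invented for illustration.
R_INSULATOR = 1e6   # ohms, "rubber-like" insulating state
R_METAL = 1e2       # ohms, metallic state (a 10,000-fold drop)
V_SWITCH = 1.0      # applied volts that trigger the insulator-to-metal transition
V_HOLD = 0.2        # below this the device relaxes back to the insulator

def simulate(applied_volts):
    """Step through applied voltages, tracking the state with hysteresis."""
    r = R_INSULATOR
    for v in applied_volts:
        if r == R_INSULATOR and v >= V_SWITCH:
            r = R_METAL                 # abrupt insulator-to-metal switching
        elif r == R_METAL and v <= V_HOLD:
            r = R_INSULATOR             # volatile: relaxes once the drive is removed
        print(f"{v:4.1f} V -> R = {r:9.0f} ohm")

simulate([0.3, 0.8, 1.2, 0.6, 0.3, 0.1])
```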

  • 'Self-driving' microscopes discover shortcuts to new materials

    Researchers at the Department of Energy’s Oak Ridge National Laboratory are teaching microscopes to drive discoveries with an intuitive algorithm, developed at the lab’s Center for Nanophase Materials Sciences, that could guide breakthroughs in new materials for energy technologies, sensing and computing.
    “There are so many potential materials, some of which we cannot study at all with conventional tools, that need more efficient and systematic approaches to design and synthesize,” said Maxim Ziatdinov of ORNL’s Computational Sciences and Engineering Division and the CNMS. “We can use smart automation to access unexplored materials as well as create a shareable, reproducible path to discoveries that have not previously been possible.”
    The approach, published in Nature Machine Intelligence, combines physics and machine learning to automate microscopy experiments designed to study materials’ functional properties at the nanoscale.
    Functional materials are responsive to stimuli such as heat or electricity and are engineered to support both everyday and emerging technologies, ranging from computers and solar cells to artificial muscles and shape-memory materials. Their unique properties are tied to atomic structures and microstructures that can be observed with advanced microscopy. However, the challenge has been to develop efficient ways to locate regions of interest where these properties emerge and can be investigated.
    Scanning probe microscopy is an essential tool for exploring the structure-property relationships in functional materials. Instruments scan the surface of materials with an atomically sharp probe to map out the structure at the nanometer scale — the length of one billionth of a meter. They can also detect responses to a range of stimuli, providing insights into fundamental mechanisms of polarization switching, electrochemical reactivity, plastic deformation or quantum phenomena. Today’s microscopes can perform a point-by-point scan of a nanometer square grid, but the process can be painstakingly slow, with measurements collected over days for a single material.
    “The interesting physical phenomena are often only manifested in a small number of spatial locations and tied to specific but unknown structural elements. While we typically have an idea of what will be the characteristic features of physical phenomena we aim to discover, pinpointing these regions of interest efficiently is a major bottleneck,” said former ORNL CNMS scientist and lead author Sergei Kalinin, now at the University of Tennessee, Knoxville. “Our goal is to teach microscopes to seek regions with interesting physics actively and in a manner much more efficient than performing a grid search.”
    Scientists have turned to machine learning and artificial intelligence to overcome this challenge, but conventional algorithms require large, human-coded datasets and may not save time in the end.
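    As a rough illustration of the general idea of replacing a dense grid scan with model-guided sampling, the sketch below drives a hypothetical "measurement" function with a generic Gaussian-process surrogate and a simple acquisition rule. It is not the CNMS algorithm; the grid, the hidden property map and all parameters are invented.
```python
# Sketch: adaptive sampling of a microscope scan grid with a Gaussian-process
# surrogate, versus measuring every pixel. Not the CNMS algorithm; the grid,
# the hidden property map and the "measure" function are all hypothetical.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

# Hypothetical 32 x 32 grid; the "interesting physics" lives in one small spot.
xs, ys = np.meshgrid(np.linspace(0, 1, 32), np.linspace(0, 1, 32))
grid = np.column_stack([xs.ravel(), ys.ravel()])
true_map = np.exp(-((grid[:, 0] - 0.7) ** 2 + (grid[:, 1] - 0.3) ** 2) / 0.01)

def measure(idx):
    """Stand-in for one slow probe measurement at a single grid point."""
    return true_map[idx] + 0.01 * rng.standard_normal()

# Seed with a few random points, then let the surrogate's uncertainty and
# predicted response choose where to measure next.
sampled = list(rng.choice(len(grid), size=10, replace=False))
values = [measure(i) for i in sampled]

gp = GaussianProcessRegressor()
for _ in range(40):                      # 50 measurements total vs 1024 for a full grid
    gp.fit(grid[sampled], values)
    mean, std = gp.predict(grid, return_std=True)
    acquisition = mean + 2.0 * std       # explore where uncertain or promising
    acquisition[sampled] = -np.inf       # never re-measure the same point
    nxt = int(np.argmax(acquisition))
    sampled.append(nxt)
    values.append(measure(nxt))

print(f"Measured {len(sampled)} of {len(grid)} points; best response: {max(values):.2f}")
```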

  • Development of an ensemble model to anticipate short-term COVID-19 hospital demand

    For the past two years, the COVID-19 pandemic has exerted pressure on the hospital system, with consequences for patients’ care pathways. To support hospital planning strategies, it is important to anticipate COVID-19 health care demand and to continue to improve predictive models.
    In this study published in the Proceedings of the National Academy of Sciences, scientists from the Mathematical Modeling of Infectious Diseases Unit at the Institut Pasteur identified the most relevant predictive variables for anticipating hospital demand and proposed using an ensemble model based on the average of the predictions of several individual models.
    The scientists began by evaluating the performance of 12 individual models and 19 predictive variables, or “predictors,” such as epidemiological data (for example the number of cases) and meteorological or mobility data (for example the use of public transport). The scientists showed that the models incorporating these early predictive variables performed better. The average prediction error was halved for 14-day-ahead predictions. “These early variables detect changes in epidemic dynamics more quickly,” explains Simon Cauchemez, Head of the Mathematical Modeling of Infectious Diseases Unit at the Institut Pasteur and last author of the study. “The models that performed best used at least one epidemiological predictor and one mobility predictor,” he continues. The addition of a meteorological variable also improved forecasts but with a more limited impact.
    The scientists then built an ensemble model, taking the average of several individual models, and tested the model retrospectively using epidemiological data from March to July 2021. This approach is already used in climate forecasting. “Our study shows that it is preferable to develop an ensemble model, as this reduces the risk of the predicted trajectory being overly influenced by the assumptions of a specific model,” explains Juliette Paireau, a research engineer in the Mathematical Modeling of Infectious Diseases Unit at the Institut Pasteur and joint first author of the study.
    This ensemble model has been used to monitor the epidemic in France since January 15, 2021.
    The study demonstrates an approach that can be used to better anticipate hospital demand for COVID-19 patients by combining different prediction models based on early predictors.
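    A bare-bones sketch of the ensemble idea follows: three hypothetical individual models produce 14-day-ahead forecasts of daily hospital admissions, the ensemble is their unweighted average, and each is scored by mean absolute error. The numbers are made up and this is not the Institut Pasteur code; because the sketch models err in different directions, averaging cancels much of the error, echoing the point that an ensemble is less exposed to any one model's assumptions.
```python
# Sketch of an ensemble forecast as the average of individual models.
# The 14-day-ahead admission forecasts below are invented numbers; this is
# not the Institut Pasteur code.
import numpy as np

forecasts = {
    "epi_only": np.array(
        [308, 318, 329, 342, 356, 368, 383, 397, 410, 424, 438, 451, 466, 480]),
    "epi_plus_mobility": np.array(
        [294, 303, 312, 324, 336, 348, 361, 374, 386, 399, 411, 424, 437, 450]),
    "epi_mobility_weather": np.array(
        [307, 303, 326, 325, 351, 350, 377, 377, 403, 403, 429, 429, 456, 456]),
}
observed = np.array(
    [300, 309, 319, 331, 344, 356, 370, 383, 396, 409, 422, 435, 449, 462])

def mean_absolute_error(pred, obs):
    return float(np.mean(np.abs(pred - obs)))

# Ensemble = unweighted average of the individual model trajectories.
ensemble = np.mean(list(forecasts.values()), axis=0)

for name, pred in forecasts.items():
    print(f"{name:22s} MAE = {mean_absolute_error(pred, observed):5.1f}")
print(f"{'ensemble (average)':22s} MAE = {mean_absolute_error(ensemble, observed):5.1f}")
```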
    The full results of the study can be found on the Modeling page: https://modelisation-covid19.pasteur.fr/realtime-analysis/hospital/
    Story Source:
    Materials provided by Institut Pasteur.

  • These six foods may become more popular as the planet warms

    No matter how you slice it, climate change will alter what we eat in the future. Today, just 13 crops provide 80 percent of people’s energy intake worldwide, and about half of our calories come from wheat, maize and rice. Yet some of these crops may not grow well in the higher temperatures, unpredictable rainfall and extreme weather events caused by climate change. Already, drought, heat waves and flash floods are damaging crops around the world.

    “We must diversify our food basket,” says Festo Massawe. He’s executive director of Future Food Beacon Malaysia, a group at the University of Nottingham Malaysia campus in Semenyih that studies the impact of climate change on food security.

    That goes beyond what we eat to how we grow it. The trick will be investing in every possible solution: breeding crops so they’re more climate resilient, genetically engineering foods in the lab and studying crops that we just don’t know enough about, says ecologist Samuel Pironon of the Royal Botanic Gardens, Kew in London. To feed a growing population in a rapidly changing world, food scientists are exploring many possible avenues, while thinking about how to be environmentally friendly.

    Consumer preferences are part of the equation as well. “It does have to be that right combination of: It looks good, it tastes good and it’s the right price point,” says Halley Froehlich, an aquaculture and fisheries scientist at the University of California, Santa Barbara.

    Here are six foods that could check all those boxes and feature more prominently on menus and grocery shelves in the future.

    1. Millet

    Source of: Carbohydrates, protein, minerals (potassium, phosphorus and magnesium)
    Uses: Whole grain; gluten-free flour, pasta, chips, beer

    The United Nations has declared 2023 the International Year of Millets (a handful of varieties exist). Quinoa earned the same honor in 2013, and its sales skyrocketed. First cultivated in Asia some 10,000 years ago, millet is a staple grain in parts of Asia and Africa. Compared with wheat, maize and rice, millet is much more climate resilient; the crop needs little water and thrives in warmer, drier environments. Some more good news: Millet is one of many ancient grains — including teff, amaranth and sorghum — that are similarly sustainable and resilient (not to mention capable of being turned into beer).

    2. Bambara groundnut

    Source of: Protein, fiber, minerals (potassium, magnesium and iron)
    Uses: Roasted or boiled; gluten-free flour; dairy-free milk

    You’ve heard of almond milk and soy milk. The next alternative at your coffee shop could be made from Bambara groundnuts, a drought-tolerant legume native to sub-Saharan Africa. Like other legumes, the Bambara groundnut is packed with protein. And bacteria on the plant convert atmospheric nitrogen into ammonia so the groundnut grows well in nutrient-poor soil without chemical fertilizers. A better understanding of the plant, says Festo Massawe of Future Food Beacon Malaysia, could pave the way for breeding programs to help the Bambara groundnut become as popular as the soybean, a legume that produces high yields but is less drought tolerant.

    3. Mussels

    Source of: Protein, omega-3, vitamin B12, minerals (iron, manganese and zinc)
    Uses: Steamed; added to pasta dishes, stews, soups

    A delicious mussel linguine might someday become a weeknight regular on the family menu. Mussels and other bivalves, including oysters, clams and scallops, could make up about 40 percent of seafood by 2050, according to a 2020 report in Nature. With no need to be watered or fertilized, bivalve farms are prime for scaling up, which would lower prices for consumers. All bivalves have merit, but Halley Froehlich of UC Santa Barbara singles out mussels as “super hardy,” “super nutritious” and underhyped. One downside: Shell-forming creatures are threatened as rising carbon levels boost ocean acidification. Kelp might be able to help.

    4. Kelp

    Source of: Vitamins, minerals (iodine, calcium and iron), antioxidants
    Uses: Salads, smoothies, salsa, pickles, noodles and chips; also found in toothpaste, shampoo and biofuels

    Kelp has a few cool climate-friendly tricks. For one, by taking in carbon dioxide during photosynthesis, it can lower the acidity of its watery surroundings. Farmers in Maine and Alaska grow kelp and bivalves together so that the shelled critters can benefit from the less acidic water. Kelp also sequesters carbon, like underwater trees. That means growing and eating more kelp could be good for the environment. While kelp and other seaweeds have been widely consumed in Asia for thousands of years, they’re still an acquired taste in many Western countries.

    5. Enset

    Source of: Carbohydrates, calcium, potassium and zinc
    Uses: Porridge or bread; also used to make rope, plates and building materials

    The drought-tolerant enset, cultivated in Ethiopia, is nicknamed the “false banana” because the plant resembles a banana tree, though its fruit is inedible. It’s also called “the tree against hunger” because its starchy stems can be harvested at any time of year, making it a reliable buffer food crop during dry periods. A 2021 report in Environmental Research Letters suggests that the enset’s range could be expanded to other parts of Africa, and possibly beyond. The processing required to make enset edible is complex, says study author James Borrell of the Royal Botanic Gardens, Kew. So any expansion would have to be led by the communities who hold that Indigenous knowledge.

    6. Cassava

    Source of: Carbohydrates, potassium, vitamin C
    Uses: Whole cooked root; gluten-free flour; tapioca pearls in bubble tea

    Cassava, a starchy root vegetable from South America, checks the boxes for climate resilience, sustainability and nutrition. Now grown in over 100 countries, cassava can withstand temperatures of up to 40° Celsius and is salt and drought tolerant. An added plus: Higher atmospheric CO2 levels enhance the plant’s tolerance to stress and can lead to higher yields. Raw cassava can contain toxic levels of cyanide, but the chemical can be removed by peeling, soaking and cooking the root.

  • New computational tool to interpret clinical significance of cancer mutations

    Researchers at Children’s Hospital of Philadelphia (CHOP) have developed a new tool to help researchers interpret the clinical significance of somatic mutations in cancer. The tool, known as CancerVar, incorporates machine learning frameworks to go beyond merely identifying somatic cancer mutations and interpret the potential significance of those mutations in terms of cancer diagnosis, prognosis, and targetability. A paper describing CancerVar was published today in Science Advances.
    “CancerVar will not replace human interpretation in a clinical setting, but it will significantly reduce the manual work of human reviewers in classifying variants identified through sequencing and drafting clinical reports in the practice of precision oncology,” said Kai Wang, PhD, Professor of Pathology and Laboratory Medicine at CHOP and senior author of the paper. “CancerVar documents and harmonizes various types of clinical evidence including drug information, publications, and pathways for somatic mutations in detail. By providing standardized, reproducible, and precise output for interpreting somatic variants, CancerVar can help researchers and clinicians prioritize mutations of concern.”
    “Somatic variant classification and interpretation are the most time-consuming steps of tumor genomic profiling,” said Marilyn M. Li, MD, Professor of Pathology and Laboratory Medicine, Director of Cancer Genomic Diagnostics and co-author of the paper. “CancerVar provides a powerful tool that automates these two critical steps. Clinical implementation of this tool will significantly improve test turnaround time and performance consistency, making the tests more impactful and affordable to all pediatric cancer patients.”
    The growth of next-generation sequencing (NGS) and precision medicine has led to the identification of millions of somatic cancer variants. To better understand whether those mutations are related to or impact the clinical course of disease, researchers have established several databases that catalogue these variants. However, those databases did not provide standardized interpretations of somatic variants, so in 2017, the Association for Molecular Pathology (AMP), American Society of Clinical Oncology (ASCO), and College of American Pathologists (CAP) jointly proposed standards and guidelines for interpreting, reporting, and scoring somatic variants.
    Yet even with these guidelines, the AMP/ASCO/CAP classification scheme did not specify how to implement these standards, so different knowledge bases were providing different results. To solve this problem, the CHOP researchers, including CHOP data scientist and co-senior author of the paper Yunyun Zhou, PhD, developed CancerVar, an improved somatic variant interpretation tool implemented as command-line software written in Python, together with a user-friendly web server. The web server includes clinical evidence for 13 million somatic cancer variants from 1,911 cancer census genes, mined from existing studies and databases.
    In addition to including millions of somatic mutations, whether of known significance or not, the tool uses deep learning to improve clinical interpretation of those mutations. Users can query clinical interpretations for variants using information such as the chromosome position or protein change and interactively fine-tune how specific scoring features are weighted, based on prior knowledge or additional user-specified criteria. The CancerVar web server generates automated descriptive interpretations, such as whether the mutation is relevant for diagnosis or prognosis or to an ongoing clinical trial.
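    To make the idea of user-adjustable evidence weighting concrete, here is a schematic sketch of a weighted evidence score mapped onto AMP/ASCO/CAP-style tiers. The feature names, weights and thresholds are illustrative placeholders, not CancerVar's actual scoring scheme.
```python
# Schematic sketch, not CancerVar's actual scoring code: a weighted sum of
# evidence features mapped onto AMP/ASCO/CAP-style tiers. Feature names,
# weights, and thresholds below are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class Evidence:
    therapeutic: int           # e.g. 2 = approved drug for this variant/tumor type
    clinical_trials: int       # number of matching ongoing trials
    database_hits: int         # hits in somatic variant knowledge bases
    predicted_damaging: float  # 0..1 score from a computational predictor

DEFAULT_WEIGHTS = {
    "therapeutic": 3.0,
    "clinical_trials": 1.0,
    "database_hits": 0.5,
    "predicted_damaging": 2.0,
}

def score_variant(ev: Evidence, weights=DEFAULT_WEIGHTS) -> str:
    """Combine weighted evidence into a tier; users could adjust the weights."""
    total = (weights["therapeutic"] * ev.therapeutic
             + weights["clinical_trials"] * ev.clinical_trials
             + weights["database_hits"] * ev.database_hits
             + weights["predicted_damaging"] * ev.predicted_damaging)
    if total >= 8:
        return "Tier I: strong clinical significance"
    if total >= 4:
        return "Tier II: potential clinical significance"
    if total >= 1:
        return "Tier III: unknown clinical significance"
    return "Tier IV: benign or likely benign"

# Example query for a hypothetical variant.
print(score_variant(Evidence(therapeutic=2, clinical_trials=3,
                             database_hits=4, predicted_damaging=0.9)))
```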
    “This tool shows how we can use computational tools to automate human-generated guidelines, and also how machine learning can guide decision making,” Wang said. “Future research should explore applying this framework to other areas of pathology as well.”
    The research was supported by the National Institutes of Health (NIH)/National Library of Medicine (NLM)/ National Human Genome Research Institute (NHGRI) (grant number LM012895), NIH/National Institute of General Medical Sciences (NIGMS) (grant number GM120609 and GM132713), CHOP Pathology diagnostic innovation fund, and the CHOP Research Institute.
    Story Source:
    Materials provided by Children’s Hospital of Philadelphia.

  • Powerful family of two-dimensional materials discovered

    A team from the Tulane University School of Science and Engineering has developed a new family of two-dimensional materials that researchers say has promising applications, including in advanced electronics and high-capacity batteries.
    Led by Michael Naguib, an assistant professor in the Department of Physics and Engineering Physics, the study has been published in the journal Advanced Materials.
    “Two-dimensional materials are nanomaterials with thickness in the nanometer range (a nanometer is one millionth of a millimeter) and lateral dimensions thousands of times the thickness,” Naguib said. “Their flatness offers a unique set of properties compared to bulk materials.”
    The name of the new family of 2D materials is transition metal carbo-chalcogenides, or TMCC. It combines the characteristics of two families of 2D materials — transition metal carbides and transition metal dichalcogenides.
    Naguib, the Ken & Ruth Arnold Early Career Professor in Science and Engineering, said the latter is a large family of materials that has been explored extensively and found to be very promising, especially for electrochemical energy storage and conversion. But he said one of the challenges in utilizing them is their low electrical conductivity and stability.
    On the other hand, he said, transition metal carbides are excellent electrical conductors with much higher conductivity. Merging the two families into one is anticipated to have great potential for many applications, such as batteries and supercapacitors, catalysis, sensors and electronics.
    “Instead of stacking the two different materials like Lego building blocks with many problematic interfaces, here we develop a new 2D material that has the combination of both compositions without any interface,” he said.
    “We used an electrochemical-assisted exfoliation process by inserting lithium ions in between the layers of bulk transition metal carbo-chalcogenides, followed by agitation in water,” said Ahmad Majed, the first author of the article and a doctoral candidate in Materials Physics and Engineering at Tulane working in Naguib’s group.
    Unlike other exotic nanomaterials, Majed said, the process of making these 2D TMCC nanomaterials is simple and scalable.
    In addition to Naguib and Majed, the team includes Jiang Wei, an associate professor in physics and engineering physics; Jianwei Sun, an assistant professor in physics and engineering physics; PhD candidates Kaitlyn Prenger, Manish Kothakonda and Fei Wang at Tulane; and Dr. Eric N. Tseng and Professor Per O.A. Persson of Linköping University in Sweden.
    This study was supported by Naguib’s National Science Foundation CAREER Award, which he received less than a year ago.
    Story Source:
    Materials provided by Tulane University.

  • It takes three to tangle: Long-range quantum entanglement needs three-way interaction

    A theoretical study shows that long-range entanglement can indeed survive at temperatures above absolute zero, if the correct conditions are met.
    Quantum computing has been earmarked as the next revolutionary step in computing. However, current systems are only practically stable at temperatures close to absolute zero. A new theorem from a Japanese research collaboration provides an understanding of what types of long-range quantum entanglement survive at non-zero temperatures, revealing a fundamental aspect of macroscopic quantum phenomena and guiding the way towards further understanding of quantum systems and the design of new room-temperature-stable quantum devices.
    When things get small, right down to the scale of one-thousandth the width of a human hair, the laws of classical physics get replaced by those of quantum physics. The quantum world is weird and wonderful, and there is much about it that scientists are yet to understand. Large-scale or “macroscopic” quantum effects play a key role in extraordinary phenomena such as superconductivity, which is a potential game-changer in future energy transport, as well as for the continued development of quantum computers.
    It is possible to observe and measure “quantumness” at this scale in particular systems with the help of long-range quantum entanglement. Quantum entanglement, which Albert Einstein once famously described as “spooky action at a distance,” occurs when a group of particles cannot be described independently from each other. This means that their properties are linked: if you can fully describe one particle, you will also know everything about the particles it is entangled with.
    Long-range entanglement is central to quantum information theory, and its further understanding could lead to a breakthrough in quantum computing technologies. However, long-range quantum entanglement is stable only under specific conditions, such as between three or more parties and at temperatures close to absolute zero (-273°C). What happens to two-party entangled systems at non-zero temperatures? To answer this question, researchers from the RIKEN Center for Advanced Intelligence Project, Tokyo, and Keio University, Yokohama, recently presented a theoretical study in Physical Review X describing long-range entanglement at temperatures above absolute zero in bipartite systems.
    “The purpose of our study was to identify a limitation on the structure of long-range entanglement at arbitrary non-zero temperatures,” explains RIKEN Hakubi Team Leader Tomotaka Kuwahara, one of the authors of the study, who performed the research while at the RIKEN Center for Advanced Intelligence Project. “We provide simple no-go theorems that show what kinds of long-range entanglement can survive at non-zero temperatures. At temperatures above absolute zero, particles in a material vibrate and move around due to thermal energy, which acts against quantum entanglement. At arbitrary non-zero temperatures, no long-range entanglement can persist between only two subsystems.”
    The researchers’ findings are consistent with previous observations that long-range entanglement survives at a non-zero temperature only when more than three subsystems are involved. The results suggest this is a fundamental aspect of macroscopic quantum phenomena at room temperatures, and that quantum devices need to be engineered to have multipartite entangled states.
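    As a toy, textbook-style illustration of thermal energy acting against two-party entanglement (this is not the paper's no-go theorem), the sketch below builds the thermal state of two exchange-coupled qubits and tracks its negativity, a standard entanglement measure, as the temperature rises; the coupling strength and temperature grid are arbitrary.
```python
# Toy example: negativity of the two-qubit thermal (Gibbs) state of a
# Heisenberg exchange Hamiltonian, evaluated at several temperatures.
# The coupling J and the temperature values are arbitrary illustrations.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

J = 1.0
H = J * (np.kron(sx, sx) + np.kron(sy, sy) + np.kron(sz, sz))

def gibbs_state(H, T):
    """rho = exp(-H/T) / Z via the eigendecomposition of the Hermitian H."""
    vals, vecs = np.linalg.eigh(H)
    weights = np.exp(-vals / T)
    rho = (vecs * weights) @ vecs.conj().T
    return rho / np.trace(rho).real

def negativity(rho):
    """Sum of |negative eigenvalues| of the partial transpose over qubit B."""
    pt = rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)
    eigs = np.linalg.eigvalsh(pt)
    return float(np.abs(eigs[eigs < 0]).sum())

for T in [0.2, 0.5, 1.0, 2.0, 4.0]:
    print(f"T = {T:3.1f}  negativity = {negativity(gibbs_state(H, T)):.3f}")
```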
    “This result has opened the door to a deeper understanding of quantum entanglement over large distances, so this is just the beginning,” states Keio University’s Professor Keijo Saito, the co-author of the study. “We aim to deepen our understanding of the relationship between quantum entanglement and temperature in the future. This knowledge will spark and drive the development of future quantum devices that work at room temperatures, making them practical.”
    Story Source:
    Materials provided by RIKEN.

  • In balance: Quantum computing needs the right combination of order and disorder

    Research conducted within the Cluster of Excellence ‘Matter and Light for Quantum Computing’ (ML4Q) has analysed cutting-edge device structures of quantum computers to demonstrate that some of them are indeed operating dangerously close to a threshold of chaotic meltdown. The challenge is to walk a thin line between too high, but also too low disorder to safeguard device operation. The study ‘Transmon platform for quantum computing challenged by chaotic fluctuations’ has been published today in Nature Communications.
    In the race for what may become a key future technology, tech giants like IBM and Google are investing enormous resources into the development of quantum computing hardware. However, current platforms are not yet ready for practical applications. There remain multiple challenges, among them the control of device imperfections (‘disorder’).
    It’s an old stability precaution: When large groups of people cross bridges, they need to avoid marching in step to prevent the formation of resonances destabilizing the construction. Perhaps counterintuitively, the superconducting transmon qubit processor — a technologically advanced platform for quantum computing favoured by IBM, Google, and other consortia — relies on the same principle: intentionally introduced disorder blocks the formation of resonant chaotic fluctuations, thus becoming an essential part of the production of multi-qubit processors.
    To understand this seemingly paradoxical point, one should think of a transmon qubit as a kind of pendulum. Qubits interlinked to form a computing structure define a system of coupled pendulums — a system that, like classical pendulums, can easily be excited to uncontrollably large oscillations with disastrous consequences. In the quantum world, such uncontrollable oscillations lead to the destruction of quantum information; the computer becomes unusable. Intentionally introduced local ‘detunings’ of single pendulums keep such phenomena at bay.
    ‘The transmon chip not only tolerates but actually requires effectively random qubit-to-qubit device imperfections,’ explained Christoph Berke, final-year doctoral student in the group of Simon Trebst at the University of Cologne and first author of the paper. ‘In our study, we ask just how reliable the “stability by randomness” principle is in practice. By applying state-of-the-art diagnostics of the theory of disordered systems, we were able to find that at least some of the industrially pursued system architectures are dangerously close to instability.’
    From the point of view of fundamental quantum physics, a transmon processor is a many-body quantum system with quantized energy levels. State-of-the-art numerical tools allow one to compute these discrete levels as a function of relevant system parameters, to obtain patterns superficially resembling a tangle of cooked spaghetti. A careful analysis of such structures for realistically modelled Google and IBM chips was one out of several diagnostic tools applied in the paper to map out a stability diagram for transmon quantum computing.
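    A toy version of the 'stability by randomness' idea (not the paper's full many-body analysis): a single excitation hopping along a chain of coupled qubits spreads resonantly over the whole chain when all qubit frequencies are identical, but random detunings pin its eigenstates to a few sites. The chain length, coupling and detuning scale below are arbitrary.
```python
# Toy sketch: a single excitation hopping along a chain of coupled qubits.
# Identical qubit frequencies let eigenstates spread resonantly over the whole
# chain; random detunings (disorder) localize them. All scales are made up.
import numpy as np

rng = np.random.default_rng(1)
N, J = 20, 1.0  # number of qubits, nearest-neighbour coupling

def chain_hamiltonian(detunings):
    """Single-excitation Hamiltonian: qubit frequencies on the diagonal,
    nearest-neighbour couplings J on the off-diagonals."""
    H = np.diag(detunings).astype(float)
    for i in range(N - 1):
        H[i, i + 1] = H[i + 1, i] = J
    return H

def mean_participation(H):
    """Average number of qubits an eigenstate spreads over (inverse IPR)."""
    _, vecs = np.linalg.eigh(H)
    ipr = np.sum(np.abs(vecs) ** 4, axis=0)
    return float(np.mean(1.0 / ipr))

uniform = np.zeros(N)                       # all qubits exactly on resonance
disordered = rng.uniform(-5 * J, 5 * J, N)  # intentionally detuned qubits

print(f"on resonance:  eigenstates spread over ~{mean_participation(chain_hamiltonian(uniform)):.1f} qubits")
print(f"with disorder: eigenstates spread over ~{mean_participation(chain_hamiltonian(disordered)):.1f} qubits")
```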
    ‘When we compared the Google to the IBM chips, we found that in the latter case qubit states may be coupled to a degree that controlled gate operations may be compromised,’ said Simon Trebst, head of the Computational Condensed Matter Physics group at the University of Cologne. In order to secure controlled gate operations, one thus needs to strike a subtle balance between stabilizing qubit integrity and enabling inter-qubit coupling. In the parlance of pasta preparation, one needs to prepare the quantum processor to perfection, keeping the energy states ‘al dente’ and avoiding tangling them by overcooking.
    The study of disorder in transmon hardware was performed as part of the Cluster of Excellence ML4Q in a collaborative work among the research groups of Simon Trebst and Alexander Altland at the University of Cologne and the group of David DiVincenzo at RWTH Aachen University and Forschungszentrum Jülich. “This collaborative project is quite unique,” says Alexander Altland from the Institute for Theoretical Physics in Cologne. “Our complementary knowledge of transmon hardware, numerical simulation of complex many-body systems, and quantum chaos was the perfect prerequisite to understand how quantum information with disorder can be protected. It also indicates how insights obtained for small reference systems can be transferred to application-relevant design scales.”
    David DiVincenzo, founding director of the JARA-Institute for Quantum Information at RWTH Aachen University, draws the following conclusion: ‘Our study demonstrates how important it is for hardware developers to combine device modelling with state-of-the-art quantum randomness methodology and to integrate “chaos diagnostics” as a routine part of qubit processor design in the superconducting platform.’
    Story Source:
    Materials provided by University of Cologne.