More stories

  • Rational neural network advances machine-human discovery

    Math is the language of the physical world, and Alex Townsend sees mathematical patterns everywhere: in weather, in the way sound waves move, and even in the spots and stripes that zebrafish develop as embryos.
    “Since Newton wrote down calculus, we have been deriving calculus equations called differential equations to model physical phenomena,” said Townsend, associate professor of mathematics in the College of Arts and Sciences.
    This way of deriving laws of calculus works, Townsend said, if you already know the physics of the system. But what about learning physical systems for which the physics remains unknown?
    In the new and growing field of partial differential equation (PDE) learning, mathematicians collect data from natural systems and then train computer neural networks to derive the underlying mathematical equations. In a new paper, Townsend, together with co-authors Nicolas Boullé of the University of Oxford and Christopher Earls, professor of civil and environmental engineering in the College of Engineering, advances PDE learning with a novel “rational” neural network, which reveals its findings in a manner that mathematicians can understand: through Green’s functions, which act as a right inverse of a differential operator.
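    In calculus terms, for a linear differential operator L, the Green’s function G turns the equation L u = f into the explicit solution u(x) = ∫ G(x, y) f(y) dy, so recovering G from data amounts to recovering the underlying equation. (This is the standard textbook relation, stated here for orientation rather than drawn from the paper.)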
    This machine-human partnership is a step toward the day when deep learning will enhance scientific exploration of natural phenomena such as weather systems, climate change, fluid dynamics, genetics and more. “Data-Driven Discovery of Green’s Functions With Human-Understandable Deep Learning” was published March 22 in Scientific Reports, a Nature Portfolio journal.
    A subset of machine learning, neural networks are inspired by the animal brain’s simple mechanism of neurons and synapses — inputs and outputs, Townsend said. Neurons — called “activation functions” in the context of computerized neural networks — collect inputs from other neurons. Between the neurons are synapses, called weights, that send signals to the next neuron.
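    To make these ingredients concrete, here is a minimal Python sketch of a network whose activation is a trainable rational function P(x)/Q(x) rather than a fixed function like ReLU. It assumes only NumPy; the architecture and coefficients are illustrative guesses, not the authors’ implementation.

        # Minimal sketch of a "rational" neural network layer (illustrative only).
        import numpy as np

        def rational_activation(x, p_coeffs, q_coeffs):
            # Evaluate r(x) = P(x) / Q(x) elementwise; coefficients run from the
            # highest-degree term down to the constant (np.polyval's convention).
            return np.polyval(p_coeffs, x) / np.polyval(q_coeffs, x)

        rng = np.random.default_rng(0)

        # The weights play the role of synapses; the rational function, the neuron.
        W1, b1 = rng.normal(size=(16, 1)), np.zeros(16)
        W2, b2 = rng.normal(size=(1, 16)), np.zeros(1)

        # A degree-(3, 2) rational function; in a rational network these
        # coefficients are trained along with the weights. Q(x) = x^2 + 1
        # stays safely away from zero.
        p = np.array([0.5, 1.0, 1.0, 0.0])
        q = np.array([1.0, 0.0, 1.0])

        def forward(x):
            h = rational_activation(W1 @ x + b1, p, q)
            return W2 @ h + b2

        print(forward(np.array([0.3])))  # one forward pass on a scalar input

    Training would adjust W1, W2, p and q together by gradient descent.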

  • Novel framework for classifying chaos and thermalization

    One popular example of chaotic behavior is the butterfly effect — a butterfly may flap its wings somewhere over the Atlantic Ocean and cause a tornado in Colorado. This fable illustrates how the extreme sensitivity of chaotic systems can yield dramatically different results from only slight differences in initial conditions. The fundamental laws of nature governing the dynamics of physical systems are inherently nonlinear, often leading to chaos and subsequent thermalization.
    However, one may ask why there is no rampant increase in tornadoes in Colorado caused by the presumably massive disappointment of butterflies with global affairs such as, say, global warming. The answer is that physical dynamics, although chaotic, are capable of demonstrating remarkably stable states. One example is the stability of our solar system: it obeys nonlinear laws of physics, which could seemingly induce chaos in the system.
    The reason for this stability lies in the fact that weakly chaotic systems may display very ordered, periodic dynamics that can last for millions of years. This discovery was made in the 1950s by the great mathematicians Kolmogorov, Arnold and Moser. Their result, however, holds only for systems with a small number of interacting elements. If the system includes many constituent parts, its fate is not as well understood.
    Researchers from the Center for Theoretical Physics of Complex Systems (PCS) within the Institute for Basic Science (IBS), South Korea, recently introduced a novel framework for characterizing weakly chaotic dynamics in complex systems containing a large number of constituent particles. To achieve this, they used a quantum-computing-based model, the Unitary Circuit Map, to simulate chaos.
    Investigating time scales of chaoticity is a challenging task, requiring efficient computational methods. The Unitary Circuit Map model implemented in this study addresses this requirement. “The model allows for efficient and error-free propagation of states in time,” Merab Malishava explains, “which is essential for modeling extremely weak chaoticity in large systems. Such models were used to achieve record-breaking nonlinear evolution times before, which was also done in our group.”
    As a result, they were able to classify the dynamics within the system by identifying the time and length scales that emerge as thermalization dramatically slows down. The researchers found that if the constituent parts are connected in a long-range network (LRN) manner (for example, all-to-all), then the thermalization dynamics are characterized by one unique time scale, called the Lyapunov time. However, if the coupling is of a short-range network (SRN) nature (for example, nearest-neighbor), an additional length scale emerges, related to the freezing of larger parts of the system over long times with rare chaotic splashes.
    Typically, studies of such sensitive dynamics are done by analyzing the behavior of observables. These techniques date back to the 1950s, when the first experiments on chaoticity and thermalization were performed. The authors instead identified a novel method of analysis: investigating the scaling of the Lyapunov spectrum.
    Merab Malishava says: “Previous methods might result in ambiguous outcomes. You choose an observable and seemingly notice thermalization and think that the dynamics are chaotic. However, if another observable is studied, from another perspective, then you conclude that the system is frozen and nothing changes, meaning no thermalization. This is the ambiguity, which we overcame. The Lyapunov spectrum is a set of timescales characterizing the dynamics fully and completely. And what’s more, it’s the same from every point of view! Unique, and unambiguous.”
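    For readers unfamiliar with these quantities, the sketch below shows how a largest Lyapunov exponent, whose inverse is the Lyapunov time mentioned above, is estimated by averaging the local stretching rate along a trajectory. It uses the textbook logistic map in plain Python, not the authors’ Unitary Circuit Map model.

        # Estimate the largest Lyapunov exponent of the logistic map x -> r*x*(1-x)
        # by averaging log|f'(x)| along a trajectory, with f'(x) = r*(1 - 2x).
        import math

        def largest_lyapunov(r=4.0, x0=0.2, n_steps=100_000, n_transient=1_000):
            x = x0
            for _ in range(n_transient):      # discard the transient
                x = r * x * (1.0 - x)
            acc = 0.0
            for _ in range(n_steps):
                acc += math.log(abs(r * (1.0 - 2.0 * x)))
                x = r * x * (1.0 - x)
            return acc / n_steps

        lam = largest_lyapunov()
        print(f"Lyapunov exponent ~ {lam:.3f}; Lyapunov time ~ {1/lam:.2f} steps")
        # For r = 4 the exact value is ln 2 ~ 0.693, so nearby trajectories
        # double their separation roughly every 1.44 iterations.

    A full Lyapunov spectrum generalizes this single number to one exponent per degree of freedom, which is the object whose scaling the authors analyzed.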
    The results are not only interesting from a fundamental standpoint. They also have the potential to shed light on the realization of quantum computers. Quantum computation requires coherent dynamics, which means no thermalization. In the current work, a dramatic slowdown of thermal dynamics was studied, with emerging quasi-conserved quantities. Quantizing this case could possibly explain such phenomena as many-body localization, one of the basic ideas for avoiding thermalization in quantum computers.
    Another accomplishment of the study is the applicability of the results to a vast range of physical models, from simple oscillator networks to complex spin-network dynamics. Dr. Sergej Flach, the leader of the research group and the director of PCS, explains: “We have been working for five years on developing a framework to classify weakly chaotic dynamics in macroscopic systems, which resulted in a series of works significantly advancing the area. We put aside narrowly focused case-by-case studies in favor of fostering a conceptual approach that is reliable and relatable in a great number of physical realizations. This specific work is a highly important building block in the aforementioned framework. We found that a traditional way of looking at things is sometimes not the most informative and offered a novel alternative approach. Our work by no means stops here, as we look forward to advancing science with more breakthrough ideas.”
    This research was recently published in Physical Review Letters.
    Story Source: Materials provided by the Institute for Basic Science.

  • New technique offers faster security for non-volatile memory tech

    Researchers have developed a technique that leverages hardware and software to improve file system security for next-generation memory technologies called non-volatile memories (NVMs). The new encryption technique also permits faster performance than existing software security technologies.
    “NVMs are an emerging technology that allows rapid access to the data, and retains data even when a system crashes or loses power,” says Amro Awad, senior author of a paper on the work and an assistant professor of electrical and computer engineering at North Carolina State University. “However, the features that give NVMs these attractive characteristics also make it difficult to encrypt files on NVM devices — which raises security concerns. We’ve developed a way to secure files on NVM devices without sacrificing the speed that makes NVMs attractive.”
    “Our technique allows for file-level encryption in fast NVM memories, while cutting the related execution time significantly,” says Kazi Abu Zubair, first author of the paper and a Ph.D. student at NC State.
    Traditionally, computers use two types of data storage. Dynamic random access memory (DRAM) allows quick access to stored data, but will lose that data if the system crashes. Long-term storage technologies, such as hard drives, are good at retaining data even if a system loses power — but store the data in a way that makes it slower to access.
    NVMs combine the best features of both technologies. However, securing files on NVM devices can be challenging.
    Existing methods for file system encryption use software, which is not particularly fast. Historically, this wasn’t a problem because the technologies for accessing file data from long-term storage devices weren’t particularly fast either.
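    As a rough illustration of how encryption can be arranged so that it does not sit on the critical path, here is a Python sketch of generic counter-mode block encryption of the sort widely used in secure-memory research. It assumes the third-party “cryptography” package and is not the NC State team’s design: the point is that the keystream for a block depends only on its address and a write counter, so it can be computed before the data itself arrives.

        # Generic counter-mode (AES-CTR) encryption keyed by block address and a
        # per-block write counter -- an illustrative sketch, not the paper's scheme.
        import os
        from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

        KEY = os.urandom(16)  # per-device secret key (illustrative)

        def _iv(block_addr: int, write_counter: int) -> bytes:
            # A unique 16-byte IV per (address, version), so the same plaintext
            # never reuses a keystream.
            return block_addr.to_bytes(8, "little") + write_counter.to_bytes(8, "little")

        def encrypt_block(data: bytes, block_addr: int, write_counter: int) -> bytes:
            enc = Cipher(algorithms.AES(KEY), modes.CTR(_iv(block_addr, write_counter))).encryptor()
            return enc.update(data) + enc.finalize()

        def decrypt_block(data: bytes, block_addr: int, write_counter: int) -> bytes:
            # CTR mode is symmetric: decrypting regenerates the same keystream.
            return encrypt_block(data, block_addr, write_counter)

        ct = encrypt_block(b"A" * 64, block_addr=0x1000, write_counter=7)
        assert decrypt_block(ct, 0x1000, 7) == b"A" * 64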

  • Artificial intelligence may improve diabetes diagnosis, study shows

    Using a fully automated artificial intelligence (AI) deep learning model, researchers were able to identify early signs of type 2 diabetes on abdominal CT scans, according to a new study published in the journal Radiology.
    Type 2 diabetes affects approximately 13% of all U.S. adults, and an additional 34.5% of adults meet the criteria for prediabetes. Because symptoms develop slowly, it is important to diagnose the disease in its early stages. Prediabetes can last for up to eight years, and an earlier diagnosis allows patients to make lifestyle changes to alter the progression of the disease.
    Abdominal CT imaging can be a promising tool for diagnosing type 2 diabetes. CT imaging is already widely used in clinical practice, and it can provide a significant amount of information about the pancreas. Previous studies have shown that patients with diabetes tend to accumulate more visceral fat and fat within the pancreas than non-diabetic patients. However, not much work has been done to study the liver, muscles and blood vessels around the pancreas, said study co-senior author Ronald M. Summers, M.D., Ph.D., senior investigator and staff radiologist at the National Institutes of Health Clinical Center in Bethesda, Maryland.
    “The analysis of both pancreatic and extra-pancreatic features is a novel approach and has not been shown in previous work to our knowledge,” said first author Hima Tallam, B.S.E., an M.D./Ph.D. student.
    The manual analysis of low-dose non-contrast pancreatic CT images by a radiologist or trained specialist is a time-intensive and difficult process. To address this clinical challenge, better automated image analysis of the pancreas is needed, the authors said.
    For this retrospective study, Dr. Summers and colleagues, in close collaboration with co-senior author Perry J. Pickhardt, M.D., professor of radiology at the University of Wisconsin School of Medicine & Public Health, used a dataset of patients who had undergone routine colorectal cancer screening with CT at the University of Wisconsin Hospital and Clinics. Of the 8,992 patients screened between 2004 and 2016, 572 had been diagnosed with type 2 diabetes and 1,880 with dysglycemia, a term that refers to blood sugar levels that go too low or too high. There was no overlap between the diabetes and dysglycemia diagnoses.
    To build the deep learning model, the researchers used a total of 471 images drawn from a variety of datasets, including the Medical Segmentation Decathlon, The Cancer Imaging Archive and the Beyond the Cranial Vault challenge. The 471 images were divided into three subsets: 424 for training, 8 for validation and 39 for testing. The researchers also included data from four rounds of active learning.
    The deep learning model displayed excellent results, demonstrating virtually no difference compared to manual analysis. In addition to the various pancreatic features, the model also analyzed the visceral fat, density and volumes of the surrounding abdominal muscles and organs.
    The results showed that patients with diabetes had lower pancreas density and higher visceral fat amounts than patients without diabetes.
    “We found that diabetes was associated with the amount of fat within the pancreas and inside the patients’ abdomens,” Dr. Summers said. “The more fat in those two locations, the more likely the patients were to have diabetes for a longer period of time.”
    The best predictors of type 2 diabetes in the final model included intrapancreatic fat percentage, pancreas fractal dimension, plaque severity between the L1 and L4 vertebral levels, average liver CT attenuation, and BMI. The deep learning model used these predictors to accurately discern patients with and without diabetes.
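    To make the step from image-derived features to a diagnosis concrete, here is a small Python sketch using scikit-learn on entirely synthetic data. The study’s actual pipeline was a deep learning model; the feature names follow the predictors listed above, but every number below is invented for illustration.

        # Toy classifier over the reported predictor types (synthetic data only).
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(42)
        n = 1_000

        # Columns: intrapancreatic fat %, pancreas fractal dimension,
        # plaque severity score, mean liver CT attenuation (HU), BMI.
        X = np.column_stack([
            rng.normal(15, 5, n),
            rng.normal(1.3, 0.1, n),
            rng.normal(2.0, 1.0, n),
            rng.normal(55, 10, n),
            rng.normal(28, 5, n),
        ])

        # Synthetic labels loosely echoing the reported trends: more pancreatic
        # fat and lower liver attenuation push toward a diabetes diagnosis.
        logit = 0.15 * (X[:, 0] - 15) - 0.08 * (X[:, 3] - 55) + 0.05 * (X[:, 4] - 28)
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")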
    “This study is a step towards the wider use of automated methods to address clinical challenges,” the authors said. “It may also inform future work investigating the reason for pancreatic changes that occur in patients with diabetes.”

  • Adding AI to museum exhibits increases learning, keeps kids engaged longer

    Hands-on exhibits are staples of science and children’s museums around the world, and kids love them. The exhibits invite children to explore scientific concepts in fun and playful ways.
    But do kids actually learn from them? Ideally, museum staff, parents or caregivers are on hand to help guide the children through the exhibits and facilitate learning, but that is not always possible.
    Researchers from Carnegie Mellon University’s Human-Computer Interaction Institute (HCII) have demonstrated a more effective way to support learning and increase engagement. They used artificial intelligence to create a new genre of interactive, hands-on exhibits that includes an intelligent, virtual assistant to interact with visitors.
    When the researchers compared their intelligent exhibit to a traditional one, they found that the intelligent exhibit increased learning and the time spent at the exhibit.
    “Having artificial intelligence and computer vision turned the play into learning,” said Nesra Yannier, HCII faculty member and head of the project, who called the results “purposeful play.”
    Earthquake tables are popular exhibits. In a typical example, kids build towers and then watch them tumble on a shaking table. Signs around the exhibit try to engage kids in thinking about science as they play, but it is not clear how well these work or how often they are even read.

  • Scientists develop a recyclable pollen-based paper for repeated printing and ‘unprinting’

    Scientists at Nanyang Technological University, Singapore (NTU Singapore) have developed a pollen-based ‘paper’ that, after being printed on, can be ‘erased’ and reused multiple times without any damage to the paper.
    In a research paper published online in Advanced Materials on 5 April, the NTU Singapore scientists demonstrated how high-resolution colour images could be printed on the non-allergenic pollen paper with a laser printer, and then ‘unprinted’ — by completely removing the toner without damaging the paper — with an alkaline solution. They demonstrated that this process could be repeated at least eight times.
    This innovative, printer-ready pollen paper could become an eco-friendly alternative to conventional paper, which is made via a multi-step process with a significant negative environmental impact, said the NTU team led by Professors Subra Suresh and Cho Nam-Joon.
    It could also help to reduce the carbon emissions and energy usage associated with conventional paper recycling, which involves repulping, de-toning (removal of printer toner) and reconstruction.
    The other members of this all-NTU research team are research fellow Dr Ze Zhao, graduate students Jingyu Deng and Hyunhyuk Tae, and former graduate student Mohammed Shahrudin Ibrahim.
    Prof Subra Suresh, NTU President and senior author of the paper, said: “Through this study, we showed that we could print high-resolution colour images on paper produced from a natural, plant-based material that was rendered non-allergenic through a process we recently developed. We further demonstrated the feasibility of doing so repeatedly without destroying the paper, making this material a viable eco-friendly alternative to conventional wood-based paper. This is a new approach to paper recycling — not just by making paper in a more sustainable way, but also by extending the lifespan of the paper so that we get the maximum value out of each piece of paper we produce. The concepts established here, with further developments in scalable manufacturing, could be adapted and extended to produce other ‘directly printable’ paper-based products such as storage and shipping cartons and containers.”
    Prof Cho Nam-Joon, senior author of the paper, said: “Aside from being easily recyclable, our pollen-based paper is also highly versatile. Unlike wood-based conventional paper, pollen is generated in large amounts and is naturally renewable, making it potentially an attractive raw material in terms of scalability, economics, and environmental sustainability. In addition, by integrating conductive materials with the pollen paper, we could potentially use the material in soft electronics, green sensors, and generators to achieve advanced functions and properties.”

  • Honey holds potential for making brain-like computer chips

    VANCOUVER, Wash. — Honey might be a sweet solution for developing environmentally friendly components for neuromorphic computers, systems designed to mimic the neurons and synapses found in the human brain.
    Hailed by some as the future of computing, neuromorphic systems are much faster and use much less power than traditional computers. Washington State University engineers have demonstrated one way to make them more organic too. In a study published in Journal of Physics D, the researchers show that honey can be used to make a memristor, a component similar to a transistor that can not only process but also store data in memory.
    “This is a very small device with a simple structure, but it has very similar functionalities to a human neuron,” said Feng Zhao, associate professor in WSU’s School of Engineering and Computer Science and corresponding author on the study. “This means if we can integrate millions or billions of these honey memristors together, then they can be made into a neuromorphic system that functions much like a human brain.”
    For the study, Zhao and first author Brandon Sueoka, a WSU graduate student in Zhao’s lab, created memristors by processing honey into a solid form and sandwiching it between two metal electrodes, making a structure similar to a human synapse. They then tested the honey memristors’ ability to mimic the work of synapses, with switching-on and switching-off speeds of 100 and 500 nanoseconds, respectively. The memristors also emulated the synapse functions known as spike-timing dependent plasticity and spike-rate dependent plasticity, which are responsible for learning processes in human brains and for retaining new information in neurons.
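    For readers curious what that learning rule looks like, here is a brief Python sketch of textbook spike-timing-dependent plasticity. The time constants and amplitudes are illustrative conventions from the computational-neuroscience literature, not measurements from the WSU devices.

        # Textbook STDP rule: the change in synaptic weight depends on the time
        # difference dt = t_post - t_pre between post- and presynaptic spikes.
        import math

        def stdp_dw(dt_ms, a_plus=0.10, a_minus=0.12, tau_plus=20.0, tau_minus=20.0):
            if dt_ms >= 0:
                # Presynaptic spike arrived first: strengthen (potentiation).
                return a_plus * math.exp(-dt_ms / tau_plus)
            # Postsynaptic spike came first: weaken (depression).
            return -a_minus * math.exp(dt_ms / tau_minus)

        for dt in (-40, -10, 10, 40):
            print(f"dt = {dt:+} ms -> weight change = {stdp_dw(dt):+.4f}")

    In a memristive synapse, spike pairs shift the device’s conductance up or down, and that stored conductance serves as the weight.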
    The WSU engineers created the honey memristors on a micro-scale, so they are about the width of a human hair. The research team led by Zhao plans to develop them on a nanoscale, about 1/1000 the width of a human hair, and bundle many millions or even billions together to make a full neuromorphic computing system.
    Currently, conventional computer systems are based on what’s called the von Neumann architecture. Named after its creator, this architecture involves an input, usually from a keyboard and mouse, and an output, such as the monitor. It also has a CPU, or central processing unit, and RAM, or memory storage. Transferring data through all these mechanisms from input to processing to memory to output takes a lot of power, at least compared to the human brain, Zhao said. For instance, the Fugaku supercomputer uses upwards of 28 megawatts (28 million watts) to run, while the brain uses only around 10 to 20 watts.
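    Put another way, working from the article’s own figures, 28,000,000 watts divided by roughly 20 watts means the supercomputer draws on the order of a million times more power than the brain.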

  • A UN report says stopping climate change is possible but action is needed now

    It doesn’t have to be this way. 

    The world already has the know-how and tools to dramatically reduce emissions from fossil fuels — but we need to use those tools immediately if we hope to forestall the worst impacts of climate change. That’s the message of the third and final installment of the massive sixth assessment of climate science by the United Nations’ Intergovernmental Panel on Climate Change, which was released April 4.

    “We know what to do, we know how to do it, and now it’s up to us to take action,” said sustainable energy researcher Jim Skea of Imperial College London, who cochaired the report, at a news event announcing its release. 


    Earth is on track to warm by an average of about 3.2 degrees Celsius above preindustrial levels by the end of the century (SN: 11/26/19). Altering that course and limiting warming to 1.5 degrees or even 2 degrees means that global fossil fuel emissions will need to peak no later than the year 2025, the new report states. 

    Right now, meeting that goal looks extremely unlikely. National pledges to reduce fossil fuel emissions to date amount to “a litany of broken climate promises,” said United Nations Secretary-General António Guterres at the event. 

    The previous two installments of the IPCC’s sixth assessment described how climate change is already fueling extreme weather events around the globe — and noted that adaptation alone will not be enough to shield people from those hazards (SN: 8/9/21; SN: 2/28/22).

    The looming climate crisis “is horrifying, and I don’t want to sugarcoat that,” says Bronson Griscom, a forest ecologist and the director of Natural Climate Solutions at the environmental organization Conservation International, based in Arlington, Va. 

    But Griscom, who was not an author on the new IPCC report, says its findings also give him hope. It’s “what I would call a double-or-nothing bet that we’re confronted with right now,” he says. “There [are] multiple ways that this report is basically saying, ‘Look, if we don’t do anything, it’s increasingly grim.’ But the reasons to do something are incredibly powerful and the tools in the toolbox are very powerful.”

    Tools in the toolbox

    Those tools are strategies that governments, industries and individuals can use to cut emissions immediately in multiple sectors of the global economy, including transportation, energy, building, agriculture and forestry, and urban development. Taking immediate advantage of opportunities to reduce emissions in each of those sectors would halve global emissions by 2030, the report states. 

    Consider the transportation sector, which contributed 15 percent of human-related greenhouse gas emissions in 2019. Globally, electric vehicle sales have surged in the last few years, driven largely by government policies and tougher emissions laws for the auto industry (SN: 12/22/21). 

    If that surge continues, “electric vehicles offer us the greatest potential [to reduce transportation emissions on land], as long as they’re combined with low or zero carbon electricity sources,” Inger Andersen, the executive director of the United Nations Environment Programme, said at the news event. But for aviation and long-haul shipping, which are more difficult to electrify, reduced carbon emissions could be achieved with low-carbon hydrogen fuels or biofuels, though these alternatives require further research and development.

    Then there are urban areas, which are contributing a growing proportion of global greenhouse gas emissions, from 62 percent in 2015 to between 67 and 72 percent in 2020, the report notes. In established cities, buildings can be retrofitted, renovated or repurposed, city layouts made more walkable, and public transportation options made more accessible.

    And growing cities can incorporate energy-efficient infrastructure and construct buildings using zero-emissions materials. Additionally, urban planners can take advantage of green roofs, urban forests, rivers and lakes to help capture and store carbon, as well as provide other climate benefits such as cleaner air and local cooling to counter urban heat waves (SN: 4/3/18). 

    Meanwhile, “reducing emissions in industry will involve using materials and energy more efficiently, reusing and recycling products and minimizing waste,” Diana Ürge-Vorsatz, the vice chair of the IPCC’s Working Group III, said at the news event. 

    As for agriculture and forestry, these and other land-use industries contribute about 22 percent of the world’s greenhouse gas emissions, with half of those emissions coming from deforestation (SN: 7/13/21). So reforestation and reduced deforestation are key to flipping the balance between CO₂ emissions and removal from the atmosphere (SN: 7/9/21; SN: 1/3/22). But there are a lot of other strategies that the world can employ at the same time, the report emphasizes. Better management of forests, coastal wetlands, grasslands and other ecosystems, more sustainable crop and livestock management, soil carbon management in agriculture and agroforestry can all bring down emissions (SN: 7/14/21). 

    The report also includes, for the first time in the IPCC’s reports, a chapter on the “untapped potential” of lifestyle changes to reduce emissions. Such changes include opting for walking or cycling or using public transportation rather than driving, shifting toward plant-based diets and reducing air travel (SN: 5/14/20). 

    Those lifestyle changes could reduce emissions by 40 to 70 percent by 2050, the report suggests. To enable those changes, however, government policies, infrastructure and technology would need to be in place. 

    Government policies are also key to financing these transformational changes. Globally, investment in climate-related technologies needs to ramp up, and quickly, to limit warming to below 2 degrees C, the report states. Right now, investments are just a third to a sixth of what they need to be by 2030. And a combination of public and private investments will be essential to aiding the transition away from fossil fuels and toward renewable energy in developing nations (SN: 1/25/21). 

    Future strategies

    Still, reducing emissions alone won’t be enough: We will need to actively remove carbon from the atmosphere to achieve net zero emissions and keep the planet well below 2 degrees C of warming, the report notes. “One thing that’s clear in this report, as opposed to previous reports, is that carbon removal is going to be necessary in the near term,” says Simon Nicholson, director of the Institute for Carbon Removal Law and Policy at American University in Washington, D.C., who was not involved in the report. 

    Such strategies include existing approaches such as protecting or restoring carbon dioxide–absorbing forests, but also technologies that are not yet widely available commercially, such as directly capturing carbon dioxide from the air, or converting the gas to a mineral form and storing it underground (SN: 12/17/18). 

    These options are still in their infancy, and we don’t know how much of an impact they’ll have yet, Nicholson says. “We need massive investment now in research.”

    An emphasis on acting now, on eliminating further delay and on the urgency of the moment has been a recurring theme through all three installments of the IPCC’s sixth assessment, released over the last year. What impact these scientists’ stark statements will have is unclear.

    But “the jury has reached a verdict, and it is damning,” U.N. Secretary-General Guterres said. “If you care about justice and our children’s future, I am appealing directly to you.”