More stories

  • Energy-efficient AI hardware technology via a brain-inspired stashing system?

    Researchers have proposed a novel system inspired by the brain’s neuromodulation, referred to as a ‘stashing system,’ that consumes less energy. The research group led by Professor Kyung Min Kim from the Department of Materials Science and Engineering has developed a technology that can efficiently handle mathematical operations for artificial intelligence by imitating the way the human brain continuously changes the topology of its neural network to suit the situation. The brain changes its neural topology in real time, learning to store or recall memories as needed. The research group presented a new artificial intelligence learning method that directly implements these neural coordination circuit configurations.
    Research on artificial intelligence has become very active, and the development and release of AI-based electronic devices are accelerating, especially in the age of the Fourth Industrial Revolution. Implementing artificial intelligence in electronic devices also requires the development of customized hardware. However, most electronic devices for artificial intelligence demand high power consumption and highly integrated memory arrays to handle large-scale tasks. Solving these power-consumption and integration limitations has been challenging, and efforts have been made to understand how the human brain solves such problems.
    To prove the efficiency of the developed technology, the research group created artificial neural network hardware equipped with a self-rectifying synaptic array and an algorithm called a ‘stashing system,’ developed to conduct artificial intelligence learning. As a result, the stashing system reduced energy consumption by 37% without any degradation in accuracy. This result demonstrates that emulating the brain’s neuromodulation is possible.
    Professor Kim said, “In this study, we implemented the learning method of the human brain with only a simple circuit composition and through this we were able to reduce the energy needed by nearly 40 percent.”
    This neuromodulation-inspired stashing system that mimics the brain’s neural activity is compatible with existing electronic devices and commercialized semiconductor hardware. It is expected to be used in the design of next-generation semiconductor chips for artificial intelligence.
    This study was published in Advanced Functional Materials in March 2022 and supported by KAIST, the National Research Foundation of Korea, the National NanoFab Center, and SK Hynix.
    Story Source:
    Materials provided by The Korea Advanced Institute of Science and Technology (KAIST). Note: Content may be edited for style and length.

  • Automated platform for plasmid production

    Plasmids have extensive use in basic and applied biology. These small, circular DNA molecules are used by scientists to introduce new genes into a target organism. Well known for their applications in the production of therapeutic proteins like insulin, plasmids are broadly used in the large-scale production of many bioproducts.
    However, designing and constructing plasmids remains one of the most time-consuming and labor-intensive steps in biology research.
    To address this, Behnam Enghiad, Pu Xue, and other University of Illinois Urbana-Champaign researchers at the Center for Advanced Bioenergy and Bioproducts Innovation (CABBI) have developed a versatile and automated platform for plasmid design and construction called PlasmidMaker. Their work was recently published in Nature Communications.
    Creating a plasmid starts with design. To aid in this design process, PlasmidMaker has a user-friendly web interface with which researchers can intuitively visualize and assemble the perfect plasmid for their needs.
    Once the plasmid has been designed, it is submitted to the PlasmidMaker team, and an order for the plasmid is placed at the Illinois Biological Foundry for Advanced Biomanufacturing (iBioFAB), where the plasmid will be built. iBioFAB, located at the Carl R. Woese Institute for Genomic Biology (IGB) on the U of I campus, is a fully integrated computational and physical infrastructure that supports rapid fabrication, quality control, and analysis of genetic constructs. It features a central robotic arm that transfers labware between instruments that perform distinct operations like pipetting, incubation, or thermocycling.
    The plasmid build process is automated: samples are prepared through polymerase chain reaction (PCR) and purification, the DNA sequence is assembled and transformed, and the plasmids are confirmed and frozen, all with little human involvement.
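    To picture how such a build queue is organized, the purely illustrative sketch below chains the steps named above into a simple pipeline; the classes, functions and step labels are hypothetical and are not code from PlasmidMaker or iBioFAB.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an automated build queue; the step names mirror the
# article, but none of these classes or functions come from PlasmidMaker or iBioFAB.

@dataclass
class PlasmidOrder:
    name: str
    fragments: list[str]                          # DNA fragment identifiers to assemble
    log: list[str] = field(default_factory=list)  # progress record for quality control

BUILD_STEPS = [
    "PCR amplification",
    "purification",
    "assembly",
    "transformation",
    "confirmation",
    "freeze and store",
]

def run_build(order: PlasmidOrder) -> PlasmidOrder:
    """Run each build step in sequence, recording progress as it goes."""
    for step in BUILD_STEPS:
        # On a real workcell, a scheduler would dispatch each step to an
        # instrument (liquid handler, thermocycler, incubator) via the robotic arm.
        order.log.append(f"{order.name}: {step} done")
    return order

print(run_build(PlasmidOrder("demo_plasmid", ["fragA", "fragB", "fragC"])).log)
```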
    In addition to the automation and precision afforded by iBioFAB, the PlasmidMaker platform also pioneers a new highly flexible method for assembling multiple DNA fragments into a plasmid using Pyrococcus furiosus Argonaute (PfAgo)-based artificial restriction enzymes (AREs).
    Restriction enzymes have long been used in plasmid construction, as they can cleave DNA molecules at specific sequences of bases, called recognition sequences. However, these recognition sequences are usually short, making them hard to work with. A short sequence is likely to occur multiple times in a DNA molecule, in which case the restriction enzyme would make too many cuts.
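    A rough calculation makes the point: in random DNA, a 6-base site is expected about once every 4^6 ≈ 4,096 bases, whereas a 16-base site is expected only about once every 4 billion bases. The short sketch below uses made-up sequences rather than anything from the study and simply illustrates that difference.

```python
import random

# Illustrative sketch only (not from the study): estimate how often a
# recognition site of a given length appears in a random DNA molecule.
random.seed(0)
dna = "".join(random.choice("ACGT") for _ in range(100_000))  # hypothetical 100 kb sequence

def count_sites(sequence: str, site: str) -> int:
    """Count (possibly overlapping) occurrences of a recognition site."""
    count, start = 0, 0
    while True:
        idx = sequence.find(site, start)
        if idx == -1:
            return count
        count += 1
        start = idx + 1

short_site = "GAATTC"             # 6 bp, typical of a conventional restriction enzyme
long_site = "GAATTCATCGGTAAGC"    # 16 bp, the kind of longer target a programmable nuclease can be given

print(count_sites(dna, short_site))  # expect roughly 100,000 / 4**6, i.e. about 24 hits
print(count_sites(dna, long_site))   # expect roughly 100,000 / 4**16, i.e. essentially 0 hits
```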
    “In previous DNA assembly methods, it would often be hard to find the right restriction enzymes that can cut the plasmid and replace the DNA fragments,” said Huimin Zhao, co-author and the Steven L. Miller Chair of Chemical and Biomolecular Engineering (ChBE) at Illinois. “The PfAgo-based AREs offer greater flexibility and precision, as they can be programmed to seek out longer recognition sequences at virtually any site.”
    With all the improvements it brings to the table, the team members at CABBI, one of four U.S. Department of Energy-funded Bioenergy Research Centers across the United States, hope that PlasmidMaker will accelerate the development of synthetic biology for biotechnological applications.
    “This tool will be available to CABBI researchers, and we want to eventually make it available to all researchers at the other three Bioenergy Research Centers,” Zhao said. “If things go well, we hope to make it available to all researchers everywhere.”
    The manuscript’s other co-authors are Nilmani Singh, CABBI Automation Engineer; Aashutosh Girish Boob and Chengyou Shi, CABBI graduate students in ChBE; Vassily Andrew Petrov, CABBI Software Engineer; Roy Liu, CABBI undergraduate student in Computer Engineering; Siddhartha Suryanarayana Peri, CABBI undergraduate student in ChBE; Stephan Thomas Lane, CABBI iBioFAB Manager; and Emily Danielle Gaither, former CABBI iBioFAB Technician.

  • Algorithms empower metalens design

    Compact and lightweight metasurfaces — which use specifically designed and patterned nanostructures on a flat surface to focus, shape and control light — are a promising technology for wearable applications, especially virtual and augmented reality systems. Today, research teams painstakingly design the specific pattern of nanostructures on the surface to achieve the desired function of the lens, whether that be resolving nanoscale features, simultaneously producing several depth-perceiving images or focusing light regardless of polarization.
    If the metalens is going to be used commercially in AR and VR systems, it’s going to need to be scaled up significantly, which means the number of nanopillars will be in the billions. How can researchers design something that complex? That’s where artificial intelligence comes in.
    In a recent paper, published in Nature Communications, a team of researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and the Massachusetts Institute of Technology (MIT) described a new method for designing large-scale metasurfaces that uses techniques of machine intelligence to generate designs automatically.
    “This article lays the groundwork and design approach which may influence many real-world devices,” said Federico Capasso, the Robert L. Wallace Professor of Applied Physics and Vinton Hayes Senior Research Fellow in Electrical Engineering at SEAS and senior author of the paper. “Our methods will enable new metasurface designs that can make an impact on virtual or augmented reality, self-driving cars, and machine vision for embarked systems and satellites.”
    Until now, researchers needed years of knowledge and experience in the field to design a metasurface.
    “We’ve been guided by intuition-based design, relying heavily on one’s training in physics, which has been limited in the number of parameters that can be considered simultaneously, bounded as we are by human working memory capacity,” said Zhaoyi Li, a research associate at SEAS and co-lead author of the paper.

  • Early warning system forecasts who needs critical care for COVID-19

    Scientists have developed and validated an algorithm that can help healthcare professionals identify who is most at risk of dying from COVID-19 when admitted to a hospital, reports a study published today in eLife.
    The tool, which uses artificial intelligence (AI), could help doctors direct critical care resources to those who need them most, and will be especially valuable to resource-limited countries.
    “The appearance of new SARS-CoV-2 variants, waning immune protection and relaxation of mitigation measures means we are likely to continue seeing surges of infections and hospitalisations,” explains the leader of this international project and senior author David Gómez-Varela, former Max Planck Group Leader and current Senior Scientist at the Division of Pharmacology and Toxicology, University of Vienna, Austria. “There is a need for clinically valuable and generalisable triage tools to assist the allocation of hospital resources for COVID-19, particularly in places where resources are scarce. But these tools need to be able to cope with the ever-changing scenario of a global pandemic and must be easy to implement.”
    To develop such a tool, the team used biochemical data from routine blood draws performed on nearly 30,000 patients hospitalised in over 150 hospitals in Spain, the US, Honduras, Bolivia and Argentina between March 2020 and February 2022. This means they were able to capture data from people with different immune statuses — vaccinated, unvaccinated and those with natural immunity — and from people infected with every SARS-CoV-2 variant, from the virus that emerged in Wuhan, China, to the latest Omicron variant. “The intrinsic variability in such a diverse dataset is a great challenge for AI-based prediction models,” says lead author Riku Klén, Associate Professor at the University of Turku, Finland.
    The resulting algorithm — called COVID-19 Disease Outcome Predictor (CODOP) — uses measurements of 12 blood molecules that are normally collected during admission. This means the predictive tool can be easily integrated into the clinical care of any hospital.
    CODOP was developed in a multistep process, initially using data from patients hospitalised in more than 120 hospitals in Spain, to ‘train’ the AI system to predict hallmarks of a poor prognosis.
    The next step was to ensure the tool worked regardless of patients’ immune status or COVID-19 variant, so they tested the algorithm in several subgroups of geographically dispersed patients. The tool still performed well at predicting the risk of in-hospital death during this fluctuating scenario of the pandemic, suggesting the measurements CODOP is based on are truly meaningful biomarkers of whether a patient with COVID-19 is likely to deteriorate.
    To test whether the timing of blood tests affects the tool’s performance, the team compared data from blood drawn at different time points before patients either recovered or died. They found that the algorithm can predict the survival or death of hospitalised patients with high accuracy up to nine days before either outcome occurs.
    Finally, they created two different versions of the tool for use in scenarios where healthcare resources are either operating normally or are under severe pressure. Under normal operational burden, doctors may opt to use an ‘overtriage’ version, which is highly sensitive at picking up people at increased risk of death, at the expense of detecting some people who did not require critical care. The alternative ‘undertriage’ model minimises the possibility of wrongly selecting people at lower risk of dying, providing doctors with greater certainty that they are directing care to those at the highest risk when resources are severely limited.
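    The idea behind the two versions can be illustrated with a generic classifier: the same risk score is cut at two different thresholds, trading sensitivity against precision. The sketch below uses synthetic data and a scikit-learn logistic regression; it is not CODOP, and its 12 features merely stand in for the blood measurements described above.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for 12 admission blood measurements; this is not CODOP.
X, y = make_classification(n_samples=5000, n_features=12, n_informative=6,
                           weights=[0.85, 0.15], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# One risk score, two operating points.
risk = LogisticRegression(max_iter=1000).fit(X_tr, y_tr).predict_proba(X_te)[:, 1]

for label, threshold in [("overtriage (sensitive)", 0.10), ("undertriage (specific)", 0.60)]:
    flagged = risk >= threshold  # patients flagged as high risk at this threshold
    print(f"{label}: sensitivity={recall_score(y_te, flagged):.2f}, "
          f"precision={precision_score(y_te, flagged):.2f}")
```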
    “The performance of CODOP in diverse and geographically dispersed patient groups and the ease of use suggest it could be a valuable tool in the clinic, especially in resource-limited countries,” remarks Gómez-Varela. “We are now working on a follow-up dual model tailored to the current pandemic scenario of increasing infections and cumulative immune protection, which will predict the need for hospitalisation within 24 hours for patients within primary care, and intensive care admission within 48 hours for those already hospitalised. We hope to help healthcare systems restore previous standards of routine care before the pandemic took hold.”
    The CODOP predictor is freely accessible at: https://gomezvarelalab.em.mpg.de/codop/
    Story Source:
    Materials provided by eLife. Note: Content may be edited for style and length.

  • New silicon nanowires can really take the heat

    Scientists have demonstrated a new material that conducts heat 150% more efficiently than conventional materials used in advanced chip technologies.
    The device — an ultrathin silicon nanowire — could enable smaller, faster microelectronics with a heat-transfer efficiency that surpasses current technologies. Electronic devices powered by microchips that efficiently dissipate heat would in turn consume less energy — an improvement that could help curb the consumption of energy produced by burning carbon-rich fossil fuels, which has contributed to global warming.
    “By overcoming silicon’s natural limitations in its capacity to conduct heat, our discovery tackles a hurdle in microchip engineering,” said Junqiao Wu, the scientist who led the Physical Review Letters study reporting the new device. Wu is a faculty scientist in the Materials Sciences Division and professor of materials science and engineering at UC Berkeley.
    Heat’s slow flow through silicon
    Our electronics are relatively affordable because silicon — the material of choice for computer chips — is cheap and abundant. But although silicon is a good conductor of electricity, it is not a good conductor of heat when it is reduced to very small sizes — and when it comes to fast computing, that presents a big problem for tiny microchips.
    Within each microchip reside tens of billions of silicon transistors that direct the flow of electrons in and out of memory cells, encoding bits of data as ones and zeroes, the binary language of computers. Electrical currents run between these hard-working transistors, and these currents inevitably generate heat.

  • The numbers don't lie: Australia is failing at maths and we need to find a new formula to arrest the decline

    Divide, subtract, add, multiply: whatever way you cut it, Australia is heading in one direction when it comes to global maths rankings — downwards.
    From an OECD mathematics ranking of 11th in the world 20 years ago, Australian secondary students are now languishing in 29th place out of 38 countries, according to the most recent statistics.
    The sliding maths rankings have created widespread debate over whether curriculum changes are needed in our schools, but a new international paper co-authored by University of South Australia cognitive psychologist Dr Fernando Marmolejo-Ramos could provide part of the solution.
    In the latest edition of Integrative Psychology and Behavioural Science, Dr Marmolejo-Ramos and researchers from China and Iran explain why simple gestures such as hand motions are important in helping students understand mathematical concepts.
    “Many people struggle with mathematics and there is a lot of anxiety around it because it is an abstract topic,” Dr Marmolejo-Ramos says. “You see the numbers, equations and graphs, but unless you engage human motor and sensory skills, they can be very difficult to grasp.”
    To get maths concepts across, it is important to bring together language, speech intonation, facial expressions and hand gestures, particularly the latter, the researchers say.

  • New method melds data to make a 3-D map of cells' activities

    Just as it’s hard to understand a conversation without knowing its context, it can be difficult for biologists to grasp the significance of gene expression without knowing a cell’s environment. To solve that problem, researchers at Princeton Engineering have developed a method to elucidate a cell’s surroundings so that biologists can make more meaning of gene expression information.
    The researchers, led by Professor of Computer Science Ben Raphael, hope the new system will open the door to identifying rare cell types and choosing cancer treatment options with new precision. Raphael is the senior author of a paper describing the method published May 16 in Nature Methods.
    The basic technique of linking gene expression with a cell’s environment, called spatial transcriptomics (ST), has been around for several years. Scientists break down tissue samples onto a microscale grid and link each spot on the grid with information about gene expression. The problem is that current computational tools can only analyze spatial patterns of gene expression in two dimensions. Experiments that use multiple slices from a single tissue sample — such as a region of a brain, heart or tumor — are difficult to synthesize into a complete picture of the cell types in the tissue.
    The Princeton researchers’ method, called PASTE (for Probabilistic Alignment of ST Experiments), integrates information from multiple slices taken from the same tissue sample, providing a three-dimensional view of gene expression within a tumor or a developing organ. When sequence coverage in an experiment is limited due to technical or cost issues, PASTE can also merge information from multiple tissue slices into a single two-dimensional consensus slice with richer gene expression information.
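    As a rough illustration of what aligning slices involves, the toy sketch below matches spots between two simulated slices using a cost that combines expression dissimilarity with a penalty on physical distance, solved here with a simple one-to-one assignment (the Hungarian algorithm); it does not reproduce PASTE’s actual probabilistic alignment, and all of the data are invented.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

# Toy illustration only; PASTE's probabilistic formulation is not reproduced here.
rng = np.random.default_rng(0)
n_spots, n_genes = 50, 200

# Two hypothetical adjacent slices: spot coordinates plus expression profiles.
coords_a = rng.uniform(0, 10, size=(n_spots, 2))
coords_b = coords_a + rng.normal(scale=0.3, size=(n_spots, 2))    # slight shift between slices
expr_a = rng.poisson(2.0, size=(n_spots, n_genes)).astype(float)
expr_b = expr_a + rng.normal(scale=0.5, size=(n_spots, n_genes))  # similar expression per spot

# Cost of matching spot i in slice A to spot j in slice B: expression
# dissimilarity plus a penalty for moving far in physical space.
cost = cdist(expr_a, expr_b) + 5.0 * cdist(coords_a, coords_b)

row, col = linear_sum_assignment(cost)  # one-to-one matching that minimizes total cost
print(f"correctly matched spots: {np.mean(col == np.arange(n_spots)):.0%}")
```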
    “Our method was motivated by the observation that oftentimes biologists will perform multiple experiments from the same tissue,” said Raphael. “Now, these replicate experiments are not exactly the same cells, but they’re from the same tissue and therefore should be highly similar.”
    The team’s technique can align multiple slices from a single tissue sample, categorizing cells based on their gene expression profiles while preserving the physical location of the cells within the tissue.

  • Scientists identify characteristics to better define long COVID

    A research team supported by the National Institutes of Health has identified characteristics of people with long COVID and those likely to have it. Scientists, using machine learning techniques, analyzed an unprecedented collection of electronic health records (EHRs) available for COVID-19 research to better identify who has long COVID. Exploring de-identified EHR data in the National COVID Cohort Collaborative (N3C), a national, centralized public database led by NIH’s National Center for Advancing Translational Sciences (NCATS), the team used the data to find more than 100,000 likely long COVID cases as of October 2021 (as of May 2022, the count is more than 200,000). The findings appeared May 16 in The Lancet Digital Health.
    Long COVID is marked by wide-ranging symptoms, including shortness of breath, fatigue, fever, headaches, “brain fog” and other neurological problems. Such symptoms can last for many months or longer after an initial COVID-19 diagnosis. One reason long COVID is difficult to identify is that many of its symptoms are similar to those of other diseases and conditions. A better characterization of long COVID could lead to improved diagnoses and new therapeutic approaches.
    “It made sense to take advantage of modern data analysis tools and a unique big data resource like N3C, where many features of long COVID can be represented,” said co-author Emily Pfaff, Ph.D., a clinical informaticist at the University of North Carolina at Chapel Hill.
    The N3C data enclave currently includes information representing more than 13 million people nationwide, including nearly 5 million COVID-19-positive cases. The resource enables rapid research on emerging questions about COVID-19 vaccines, therapies, risk factors and health outcomes.
    The new research is part of a related, larger trans-NIH initiative, Researching COVID to Enhance Recovery (RECOVER), which aims to improve the understanding of the long-term effects of COVID-19, called post-acute sequelae of SARS-CoV-2 infection (PASC). RECOVER will accurately identify people with PASC and develop approaches for its prevention and treatment. The program also will answer critical research questions about the long-term effects of COVID through clinical trials, longitudinal observational studies, and more.
    In the Lancet study, Pfaff, Melissa Haendel, Ph.D., at the University of Colorado Anschutz Medical Campus, and their colleagues examined patient demographics, health care use, diagnoses and medications in the health records of 97,995 adult COVID-19 patients in the N3C. They used this information, along with data on nearly 600 long COVID patients from three long COVID clinics, to create three machine learning models to identify long COVID patients.
    In machine learning, scientists “train” computational methods to rapidly sift through large amounts of data to reveal new insights — in this case, about long COVID. The models looked for patterns in the data that could help researchers both understand patient characteristics and better identify individuals with the condition.
    The models focused on identifying potential long COVID patients among three groups in the N3C database: all COVID-19 patients, patients hospitalized with COVID-19, and patients who had COVID-19 but were not hospitalized. The models proved to be accurate, as people identified as at risk for long COVID were similar to patients seen at long COVID clinics. The machine learning systems classified approximately 100,000 patients in the N3C database whose profiles were close matches to those with long COVID.
    “Once you’re able to determine who has long COVID in a large database of people, you can begin to ask questions about those people,” said Josh Fessel, M.D., Ph.D., senior clinical advisor at NCATS and a scientific program lead in RECOVER. “Was there something different about those people before they developed long COVID? Did they have certain risk factors? Was there something about how they were treated during acute COVID that might have increased or decreased their risk for long COVID?”
    The models searched for common features, including new medications, doctor visits and new symptoms, in patients with a positive COVID-19 diagnosis who were at least 90 days out from their acute infection. The models identified patients as having long COVID if they had gone to a long COVID clinic, or if they showed long COVID symptoms and likely had the condition but had not been diagnosed.
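    As a rough sketch of that kind of setup, the example below trains a generic gradient-boosted classifier on invented, EHR-style tabular features like those listed above; it is not one of the N3C models, and every feature name and value is synthetic.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 4000

# Invented EHR-style features of the kind described above; not N3C data.
features = pd.DataFrame({
    "outpatient_visits_90d": rng.poisson(3, n),   # visits at least 90 days after infection
    "new_medications_90d": rng.poisson(1, n),
    "fatigue_flag": rng.integers(0, 2, n),
    "dyspnea_flag": rng.integers(0, 2, n),
    "age": rng.integers(18, 90, n),
})
# Synthetic label: higher visit and symptom burden raises the chance of a long COVID label.
logit = 0.4 * features["outpatient_visits_90d"] + features["fatigue_flag"] + features["dyspnea_flag"] - 3.0
label = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(features, label, stratify=label, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
print("held-out AUC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))
```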
    “We want to incorporate the new patterns we’re seeing with the diagnosis code for COVID and include it in our models to try to improve their performance,” said the University of Colorado’s Haendel. “The models can learn from a greater variety of patients and become more accurate. We hope we can use our long COVID patient classifier for clinical trial recruitment.”
    This study was funded by NCATS, which contributed to the design, maintenance and security of the N3C Enclave, and the NIH RECOVER Initiative, supported by NIH OT2HL161847. RECOVER is coordinating, among others, the participant recruitment protocol to which this work contributes. The analyses were conducted with data and tools accessed through the NCATS N3C Data Enclave and supported by NCATS U24TR002306.