More stories

    Teaching physics to AI makes the student a master

    Researchers at Duke University have demonstrated that incorporating known physics into machine learning algorithms can help the inscrutable black boxes attain new levels of transparency and insight into material properties.
    In one of the first projects of its kind, researchers constructed a modern machine learning algorithm to determine the properties of a class of engineered materials known as metamaterials and to predict how they interact with electromagnetic fields.
    Because it first had to consider the metamaterial’s known physical constraints, the program was essentially forced to show its work. Not only did the approach allow the algorithm to accurately predict the metamaterial’s properties, it did so more efficiently than previous methods while providing new insights.
    The results appear online the week of May 9 in the journal Advanced Optical Materials.
    “By incorporating known physics directly into the machine learning, the algorithm can find solutions with less training data and in less time,” said Willie Padilla, professor of electrical and computer engineering at Duke. “While this study was mainly a demonstration showing that the approach could recreate known solutions, it also revealed some insights into the inner workings of non-metallic metamaterials that nobody knew before.”
    Metamaterials are synthetic materials composed of many individual engineered features, which together produce, through their structure rather than their chemistry, properties not found in nature. In this case, the metamaterial consists of a large grid of silicon cylinders that resemble a Lego baseplate.
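    The network architecture isn't spelled out in this excerpt, so the following is only a minimal sketch of the general physics-informed idea: rather than mapping geometry directly to a spectrum, a small network predicts the parameters of a known physical model (here a single Lorentz oscillator, a standard description of metamaterial resonances), and the analytic physics produces the output. All dimensions, weights, and the single-oscillator choice are hypothetical.

        import numpy as np

        # Sketch of physics-informed prediction: the network outputs parameters
        # of a Lorentz oscillator rather than a raw spectrum, so its predictions
        # are constrained by, and interpretable through, known physics.
        def lorentz_permittivity(freq, eps_inf, amp, f0, gamma):
            # Classic Lorentz oscillator response (the built-in physics).
            return eps_inf + amp * f0**2 / (f0**2 - freq**2 - 1j * gamma * freq)

        def tiny_net(geometry, W1, b1, W2, b2):
            # Two-layer network mapping geometry to physical parameters.
            h = np.tanh(geometry @ W1 + b1)
            eps_inf, amp, f0, gamma = np.exp(h @ W2 + b2)  # exp keeps parameters positive
            return eps_inf, amp, f0, gamma

        rng = np.random.default_rng(0)
        geometry = rng.normal(size=4)      # e.g. cylinder radius, height, spacing, period
        W1, b1 = 0.1 * rng.normal(size=(4, 8)), np.zeros(8)
        W2, b2 = 0.1 * rng.normal(size=(8, 4)), np.zeros(4)

        freq = np.linspace(0.5, 2.0, 200)  # normalized frequency grid
        params = tiny_net(geometry, W1, b1, W2, b2)
        spectrum = lorentz_permittivity(freq, *params)
        print("fitted Lorentz parameters:", np.round(params, 3))

    Because every prediction must pass through the oscillator model, the learned parameters (resonance frequency, damping) stay physically meaningful, which is the kind of transparency the researchers describe.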

    Researchers create photonic materials for powerful, efficient light-based computing

    University of Central Florida researchers are developing new photonic materials that could one day enable low-power, ultrafast, light-based computing.
    The unique materials, known as topological insulators, are like wires that have been turned inside out, where the current runs along the outside and the interior is insulated.
    Topological insulators are important because they could be used in circuit designs that allow for more processing power to be crammed into a single space without generating heat, thus avoiding the overheating problem today’s smaller and smaller circuits face.
    In their latest work, published in the journal Nature Materials, the researchers demonstrated a new approach to create the materials that uses a novel, chained, honeycomb lattice design.
    The researchers laser-etched the chained honeycomb design onto a sample of silica, the material commonly used to make photonic circuits.
    Nodes in the design allow the researchers to modulate the current without bending or stretching the photonic wires, an essential feature for controlling the flow of light, and thus information, in a circuit.
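    The chained honeycomb lattice itself isn't reproduced here, but the textbook Su-Schrieffer-Heeger (SSH) chain, the simplest topological lattice, shows in a few lines how the pattern of couplings alone decides whether protected edge modes appear; the hopping values below are hypothetical.

        import numpy as np

        # Su-Schrieffer-Heeger (SSH) chain: a 1D toy model of a topological
        # insulator. Alternating coupling strengths set the lattice topology.
        def ssh_hamiltonian(n_cells, t_intra, t_inter):
            n = 2 * n_cells
            H = np.zeros((n, n))
            for i in range(n - 1):
                # Even bonds sit inside a unit cell, odd bonds connect cells.
                H[i, i + 1] = H[i + 1, i] = t_intra if i % 2 == 0 else t_inter
            return H

        # Topological phase (inter-cell coupling dominates): two modes pinned
        # near zero energy, localized at the chain's edges.
        e_topo = np.linalg.eigvalsh(ssh_hamiltonian(20, t_intra=0.5, t_inter=1.0))
        print("topological:", np.round(np.sort(np.abs(e_topo))[:2], 4))

        # Trivial phase (intra-cell coupling dominates): gapped, no edge modes.
        e_triv = np.linalg.eigvalsh(ssh_hamiltonian(20, t_intra=1.0, t_inter=0.5))
        print("trivial:    ", np.round(np.sort(np.abs(e_triv))[:2], 4))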

    New model could improve matches between students and schools

    For the majority of students in the U.S., residential addresses determine which public elementary, middle, or high school they attend. But with an influx of charter schools and state-funded voucher programs for private schools, as well as a growing number of cities that let students apply to public schools across the district (regardless of zip code), the admissions process can turn into a messy game of matchmaking.
    Simultaneous applications for competitive spots and a lack of coordination among school authorities often result in some students being matched with multiple schools while others are unassigned. It can lead to unfilled seats at the start of the semester and extra stress for students and parents, as well as teachers and administrators.
    Bertan Turhan, assistant professor of economics at Iowa State University, and his co-authors outline a way to make better, more efficient matches between students and schools in their new study published in Games and Economic Behavior. Turhan says their goal was to create a fairer process that works within realistic parameters.
    “There are a lot of success stories in major U.S. cities where economists and policymakers worked together to improve school choice,” said Turhan. “The algorithm we introduced builds on that and could give school groups some degree of coordination and significantly increase overall student welfare in situations where there’s a lot of competition to get into certain schools.”
    A new matchmaking model
    Using the researchers’ model, each student or family submits one rank-ordered list of public schools to the public school district and another rank-ordered list of private schools to the voucher program. Each school also submits a ranking of students to either the public school district or voucher program.
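    The paper's coordinated mechanism is more involved than this excerpt describes, but the building block behind most school-choice algorithms is student-proposing deferred acceptance. A minimal sketch for a single match follows; all names, preferences, and capacities are hypothetical.

        # Minimal student-proposing deferred acceptance (Gale-Shapley) sketch,
        # the classic building block that school-choice mechanisms extend.
        def deferred_acceptance(student_prefs, school_prefs, capacity):
            # student_prefs: student -> schools, most preferred first
            # school_prefs: school -> students, most preferred first
            # capacity: school -> number of seats
            rank = {s: {st: i for i, st in enumerate(pref)}
                    for s, pref in school_prefs.items()}
            next_choice = {st: 0 for st in student_prefs}  # next school to try
            held = {s: [] for s in school_prefs}           # tentative assignments
            free = list(student_prefs)
            while free:
                st = free.pop()
                if next_choice[st] >= len(student_prefs[st]):
                    continue                               # list exhausted, unmatched
                school = student_prefs[st][next_choice[st]]
                next_choice[st] += 1
                held[school].append(st)
                # The school keeps its most-preferred students up to capacity;
                # anyone rejected proposes to their next choice.
                held[school].sort(key=lambda x: rank[school][x])
                while len(held[school]) > capacity[school]:
                    free.append(held[school].pop())
            return held

        matches = deferred_acceptance(
            student_prefs={"Ana": ["North", "South"], "Ben": ["North", "South"],
                           "Cal": ["South", "North"]},
            school_prefs={"North": ["Ben", "Ana", "Cal"], "South": ["Ana", "Cal", "Ben"]},
            capacity={"North": 1, "South": 2},
        )
        print(matches)  # {'North': ['Ben'], 'South': ['Cal', 'Ana']}

    Deferred acceptance guarantees stability: no student and school would both prefer each other over their assigned match, which is the property school-choice mechanisms build on.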

    Energy-efficient AI hardware technology via a brain-inspired stashing system?

    Researchers have proposed a novel system inspired by the neuromodulation of the brain, referred to as a ‘stashing system,’ that consumes less energy. The research group led by Professor Kyung Min Kim from the Department of Materials Science and Engineering at KAIST has developed a technology that can efficiently handle mathematical operations for artificial intelligence by imitating how the topology of a neural network continuously changes according to the situation. The human brain changes its neural topology in real time, learning to store or recall memories as needed. The research group presented a new artificial intelligence learning method that directly implements these neural coordination circuit configurations.
    Research on artificial intelligence is highly active, and the development and release of AI-based electronic devices are accelerating, especially in the age of the Fourth Industrial Revolution. Implementing artificial intelligence in electronic devices also requires customized hardware. However, most electronic devices for artificial intelligence demand high power consumption and highly integrated memory arrays for large-scale tasks. Overcoming these power and integration limits has proven challenging, prompting efforts to understand how the human brain solves such problems.
    To prove the efficiency of the developed technology, the research group created artificial neural network hardware equipped with a self-rectifying synaptic array and an algorithm called a ‘stashing system’ that was developed to conduct artificial intelligence learning. As a result, the stashing system reduced energy consumption by 37% without any degradation in accuracy. This result demonstrates that emulating the brain’s neuromodulation is possible in hardware.
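    The circuit itself isn't detailed in this excerpt, but the 'stashing' idea, temporarily deactivating parts of the network and restoring them when the situation changes, can be sketched in software as a mask over a synaptic weight array. The sizes and the magnitude-based masking rule below are hypothetical analogies, not the study's hardware.

        import numpy as np

        # Software analogy of stashing: deactivate (stash) weak synapses so only
        # a task-relevant subnetwork computes, then restore the stashed weights
        # when the topology needs to change. All values here are hypothetical.
        rng = np.random.default_rng(0)
        W = rng.normal(size=(64, 32))   # synaptic weight matrix
        x = rng.normal(size=32)         # input activity

        mask = np.abs(W) > 0.5          # keep only the stronger synapses
        y_stashed = (W * mask) @ x      # stashed synapses contribute nothing
        y_restored = W @ x              # full network; weights were never lost

        print(f"synapses stashed: {1 - mask.mean():.0%}")  # operations a sparse chip could skip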
    Professor Kim said, “In this study, we implemented the learning method of the human brain with only a simple circuit composition and through this we were able to reduce the energy needed by nearly 40 percent.”
    This neuromodulation-inspired stashing system that mimics the brain’s neural activity is compatible with existing electronic devices and commercialized semiconductor hardware. It is expected to be used in the design of next-generation semiconductor chips for artificial intelligence.
    This study was published in Advanced Functional Materials in March 2022 and supported by KAIST, the National Research Foundation of Korea, the National NanoFab Center, and SK Hynix.
    Story Source:
    Materials provided by The Korea Advanced Institute of Science and Technology (KAIST).

    Automated platform for plasmid production

    Plasmids have extensive use in basic and applied biology. These small, circular DNA molecules are used by scientists to introduce new genes into a target organism. Well known for their applications in the production of therapeutic proteins like insulin, plasmids are broadly used in the large-scale production of many bioproducts.
    However, designing and constructing plasmids remains one of the most time-consuming and labor-intensive steps in biology research.
    To address this, Behnam Enghiad, Pu Xue, and other University of Illinois Urbana-Champaign researchers at the Center for Advanced Bioenergy and Bioproducts Innovation (CABBI) have developed a versatile and automated platform for plasmid design and construction called PlasmidMaker. Their work was recently published in Nature Communications.
    Creating a plasmid starts with design. To aid in this design process, PlasmidMaker has a user-friendly web interface with which researchers can intuitively visualize and assemble the perfect plasmid for their needs.
    Once the plasmid has been designed, it is submitted to the PlasmidMaker team, and an order for the plasmid is placed at the Illinois Biological Foundry for Advanced Biomanufacturing (iBioFAB), where the plasmid will be built. iBioFAB, located at the Carl R. Woese Institute for Genomic Biology (IGB) on the U of I campus, is a fully integrated computational and physical infrastructure that supports rapid fabrication, quality control, and analysis of genetic constructs. It features a central robotic arm that transfers labware between instruments that perform distinct operations like pipetting, incubation, or thermocycling.
    The plasmid build process is automated: samples are prepared through polymerase chain reaction (PCR) and purification, the DNA sequence is assembled and transformed, and the plasmids are confirmed and frozen, all with little human involvement.
    In addition to the automation and precision afforded by iBioFAB, the PlasmidMaker platform also pioneers a new highly flexible method for assembling multiple DNA fragments into a plasmid using Pyrococcus furiosus Argonaute (PfAgo)-based artificial restriction enzymes (AREs).
    Restriction enzymes have long been used in plasmid construction, as they can cleave DNA molecules at specific sequences of bases, called recognition sequences. However, these recognition sequences are usually short, making them hard to work with. A short sequence is likely to occur multiple times in a DNA molecule, in which case the restriction enzyme would make too many cuts.
    “In previous DNA assembly methods, it would often be hard to find the right restriction enzymes that can cut the plasmid and replace the DNA fragments,” said Huimin Zhao, co-author and the Steven L. Miller Chair of Chemical and Biomolecular Engineering (ChBE) at Illinois. “The PfAgo-based AREs offer greater flexibility and precision, as they can be programmed to seek out longer recognition sequences at virtually any site.”
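    The advantage of longer, programmable recognition sequences is easy to see by counting candidate cut sites. A toy illustration follows; the sequences are random stand-ins, not a real plasmid.

        import random

        # Short recognition sites recur by chance in a long sequence, so a
        # restriction enzyme would cut too often; a longer programmed target,
        # as with PfAgo-based AREs, is effectively unique. Toy data only.
        random.seed(1)
        plasmid = "".join(random.choice("ACGT") for _ in range(10_000))

        def count_sites(seq, site):
            # Count (possibly overlapping) occurrences of a recognition site.
            return sum(seq.startswith(site, i) for i in range(len(seq)))

        short_site = "GAATTC"            # 6 bp, EcoRI-like: ~10000 / 4**6 ≈ 2.4 expected hits
        long_site = plasmid[5000:5016]   # a 16 bp programmed target
        print("6 bp site hits: ", count_sites(plasmid, short_site))
        print("16 bp site hits:", count_sites(plasmid, long_site))  # almost surely exactly 1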
    With these improvements, the team at CABBI, one of four U.S. Department of Energy-funded Bioenergy Research Centers, hopes that PlasmidMaker will accelerate the development of synthetic biology for biotechnological applications.
    “This tool will be available to CABBI researchers, and we want to eventually make it available to all researchers at the other three Bioenergy Research Centers,” Zhao said. “If things go well, we hope to make it available to all researchers everywhere.”
    The manuscript’s other co-authors are Nilmani Singh, CABBI Automation Engineer; Aashutosh Girish Boob and Chengyou Shi, CABBI graduate students in ChBE; Vassily Andrew Petrov, CABBI Software Engineer; Roy Liu, CABBI undergraduate student in Computer Engineering; Siddhartha Suryanarayana Peri, CABBI undergraduate student in ChBE; Stephan Thomas Lane, CABBI iBioFAB Manager; and Emily Danielle Gaither, former CABBI iBioFAB Technician.

    Algorithms empower metalens design

    Compact and lightweight metasurfaces — which use specifically designed and patterned nanostructures on a flat surface to focus, shape and control light — are a promising technology for wearable applications, especially virtual and augmented reality systems. Today, research teams painstakingly design the specific pattern of nanostructures on the surface to achieve the desired function of the lens, whether that be resolving nanoscale features, simultaneously producing several depth-perceiving images or focusing light regardless of polarization.
    If the metalens is going to be used commercially in AR and VR systems, it’s going to need to be scaled up significantly, which means the number of nanopillars will be in the billions. How can researchers design something that complex? That’s where artificial intelligence comes in.
    In a recent paper, published in Nature Communications, a team of researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and the Massachusetts Institute of Technology (MIT) described a new method for designing large-scale metasurfaces that uses techniques of machine intelligence to generate designs automatically.
    “This article lays the groundwork and design approach which may influence many real-world devices,” said Federico Capasso, the Robert L. Wallace Professor of Applied Physics and Vinton Hayes Senior Research Fellow in Electrical Engineering at SEAS and senior author of the paper. “Our methods will enable new metasurface designs that can make an impact on virtual or augmented reality, self-driving cars, and machine vision for embarked systems and satellites.”
    Until now, researchers needed years of knowledge and experience in the field to design a metasurface.
    “We’ve been guided by intuition-based design, relying heavily on one’s training in physics, which has been limited in the number of parameters that can be considered simultaneously, bounded as we are by human working memory capacity,” said Zhaoyi Li, a research associate at SEAS and co-lead author of the paper.
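    The team's machine-intelligence method itself isn't described in this excerpt, so the following is only a minimal sketch of what automating metalens design means: each nanopillar's width is adjusted by gradient descent until the phase it imparts matches the standard hyperbolic lens profile. The linear width-to-phase map and every dimension below are hypothetical stand-ins for full electromagnetic simulations.

        import numpy as np

        # Sketch of automated metalens design: optimize pillar widths so each
        # radial position imparts the ideal focusing phase.
        wavelength = 0.532               # micrometres (green light)
        f = 100.0                        # target focal length, micrometres
        r = np.linspace(0.0, 50.0, 1001) # radial positions of the nanopillars

        # Ideal hyperbolic lens phase, wrapped to one 2*pi cycle:
        target = np.mod(-(2 * np.pi / wavelength) * (np.sqrt(r**2 + f**2) - f), 2 * np.pi)

        def width_to_phase(w, w_min=80.0, w_max=280.0):
            # Hypothetical monotonic map from pillar width (nm) to phase (rad).
            return 2 * np.pi * (w - w_min) / (w_max - w_min)

        w = np.full_like(r, 180.0)       # initial guess for every pillar width
        for _ in range(100):
            err = width_to_phase(w) - target
            w -= 500.0 * err * (2 * np.pi / 200.0)  # gradient step on 0.5 * err**2
        print(f"max residual phase error: {np.abs(width_to_phase(w) - target).max():.1e} rad")

    At commercial scale a lens involves billions of such pillars, which is why handing this search to an automated design loop matters.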

    Early warning system forecasts who needs critical care for COVID-19

    Scientists have developed and validated an algorithm that can help healthcare professionals identify who is most at risk of dying from COVID-19 when admitted to a hospital, reports a study published today in eLife.
    The tool, which uses artificial intelligence (AI), could help doctors direct critical care resources to those who need them most, and will be especially valuable to resource-limited countries.
    “The appearance of new SARS-CoV-2 variants, waning immune protection and relaxation of mitigation measures means we are likely to continue seeing surges of infections and hospitalisations,” explains the leader of this international project and senior author David Gómez-Varela, former Max Planck Group Leader and current Senior Scientist at the Division of Pharmacology and Toxicology, University of Vienna, Austria. “There is a need for clinically valuable and generalisable triage tools to assist the allocation of hospital resources for COVID-19, particularly in places where resources are scarce. But these tools need to be able to cope with the ever-changing scenario of a global pandemic and must be easy to implement.”
    To develop such a tool, the team used biochemical data from routine blood draws performed on nearly 30,000 patients hospitalised in over 150 hospitals in Spain, the US, Honduras, Bolivia and Argentina between March 2020 and February 2022. This means they were able to capture data from people with different immune statuses — vaccinated, unvaccinated and those with natural immunity — and from people infected with every SARS-CoV-2 variant, from the virus that emerged in Wuhan, China, to the latest Omicron variant. “The intrinsic variability in such a diverse dataset is a great challenge for AI-based prediction models,” says lead author Riku Klén, Associate Professor at the University of Turku, Finland.
    The resulting algorithm — called COVID-19 Disease Outcome Predictor (CODOP) — uses measurements of 12 blood molecules that are normally collected during admission. This means the predictive tool can be easily integrated into the clinical care of any hospital.
    CODOP was developed in a multistep process, initially using data from patients hospitalised in more than 120 hospitals in Spain, to ‘train’ the AI system to predict hallmarks of a poor prognosis.
    The next step was to ensure the tool worked regardless of patients’ immune status or COVID-19 variant, so they tested the algorithm in several subgroups of geographically dispersed patients. The tool still performed well at predicting the risk of in-hospital death during this fluctuating scenario of the pandemic, suggesting the measurements CODOP is based on are truly meaningful biomarkers of whether a patient with COVID-19 is likely to deteriorate.
    To test whether the timing of blood tests affects the tool’s performance, the team compared data from blood drawn at different time points before patients either recovered or died. They found that the algorithm can predict the survival or death of hospitalised patients with high accuracy up to nine days before either outcome occurs.
    Finally, they created two different versions of the tool for use in scenarios where healthcare resources are either operating normally or are under severe pressure. Under normal operational burden, doctors may opt to use an ‘overtriage’ version, which is highly sensitive at picking up people at increased risk of death, at the expense of detecting some people who did not require critical care. The alternative ‘undertriage’ model minimises the possibility of wrongly selecting people at lower risk of dying, providing doctors with greater certainty that they are directing care to those at the highest risk when resources are severely limited.
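    CODOP's trained model isn't reproduced in this excerpt, but the overtriage/undertriage pairing is the standard trade-off of moving a risk classifier's decision threshold. A toy sketch with a hypothetical logistic score over 12 standardized blood values:

        import numpy as np

        # Toy sketch of the overtriage/undertriage trade-off, not the CODOP
        # model itself: a logistic risk score over routine blood values, read
        # against two decision thresholds. All numbers are hypothetical.
        rng = np.random.default_rng(42)
        n_markers = 12                           # CODOP uses 12 routinely measured molecules
        blood = rng.normal(size=(5, n_markers))  # standardized values for 5 admitted patients
        weights = rng.normal(size=n_markers)     # stand-in for trained coefficients

        risk = 1 / (1 + np.exp(-(blood @ weights)))  # predicted risk of in-hospital death

        OVERTRIAGE_CUT = 0.2   # low bar: high sensitivity, more false alarms
        UNDERTRIAGE_CUT = 0.8  # high bar: flags only the most certain cases
        print("risk scores:      ", np.round(risk, 2))
        print("overtriage flags: ", risk > OVERTRIAGE_CUT)
        print("undertriage flags:", risk > UNDERTRIAGE_CUT)

    Lowering the threshold catches more of the truly high-risk patients at the cost of flagging some who would have recovered; raising it does the reverse, which is exactly the choice the two CODOP versions offer.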
    “The performance of CODOP in diverse and geographically dispersed patient groups and the ease of use suggest it could be a valuable tool in the clinic, especially in resource-limited countries,” remarks Gómez-Varela. “We are now working on a follow-up dual model tailored to the current pandemic scenario of increasing infections and cumulative immune protection, which will predict the need for hospitalisation within 24 hours for patients within primary care, and intensive care admission within 48 hours for those already hospitalised. We hope to help healthcare systems restore previous standards of routine care before the pandemic took hold.”
    The CODOP predictor is freely accessible at: https://gomezvarelalab.em.mpg.de/codop/
    Story Source:
    Materials provided by eLife.

    New silicon nanowires can really take the heat

    Scientists have demonstrated a new material that conducts heat 150% more efficiently than conventional materials used in advanced chip technologies.
    The device — an ultrathin silicon nanowire — could enable smaller, faster microelectronics with a heat-transfer efficiency that surpasses current technologies. Electronic devices powered by microchips that efficiently dissipate heat would in turn consume less energy — an improvement that could help reduce reliance on energy produced by burning carbon-rich fossil fuels, a driver of global warming.
    “By overcoming silicon’s natural limitations in its capacity to conduct heat, our discovery tackles a hurdle in microchip engineering,” said Junqiao Wu, the scientist who led the Physical Review Letters study reporting the new device. Wu is a faculty scientist in Berkeley Lab’s Materials Sciences Division and a professor of materials science and engineering at UC Berkeley.
    Heat’s slow flow through silicon
    Our electronics are relatively affordable because silicon — the material of choice for computer chips — is cheap and abundant. But although silicon is a good conductor of electricity, it is not a good conductor of heat when it is reduced to very small sizes — and when it comes to fast computing, that presents a big problem for tiny microchips.
    Within each microchip reside tens of billions of silicon transistors that direct the flow of electrons in and out of memory cells, encoding bits of data as ones and zeroes, the binary language of computers. Electrical currents run between these hard-working transistors, and these currents inevitably generate heat.